Sep 4 15:47:22.704132 kernel: Linux version 6.12.44-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Thu Sep 4 13:44:59 -00 2025
Sep 4 15:47:22.704152 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=36c924095f449c8931a6685ec70d72df97f8ad57d1c78208ae0ead8cae8f5127
Sep 4 15:47:22.704162 kernel: Disabled fast string operations
Sep 4 15:47:22.704168 kernel: BIOS-provided physical RAM map:
Sep 4 15:47:22.704172 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable
Sep 4 15:47:22.704176 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved
Sep 4 15:47:22.704183 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved
Sep 4 15:47:22.704187 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable
Sep 4 15:47:22.704192 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data
Sep 4 15:47:22.704196 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS
Sep 4 15:47:22.704200 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable
Sep 4 15:47:22.704204 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved
Sep 4 15:47:22.704208 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved
Sep 4 15:47:22.704212 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Sep 4 15:47:22.704219 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved
Sep 4 15:47:22.704224 kernel: NX (Execute Disable) protection: active
Sep 4 15:47:22.704228 kernel: APIC: Static calls initialized
Sep 4 15:47:22.704233 kernel: SMBIOS 2.7 present.
Sep 4 15:47:22.704238 kernel: DMI: VMware, Inc. VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020
Sep 4 15:47:22.704243 kernel: DMI: Memory slots populated: 1/128
Sep 4 15:47:22.704248 kernel: vmware: hypercall mode: 0x00
Sep 4 15:47:22.704253 kernel: Hypervisor detected: VMware
Sep 4 15:47:22.704258 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz
Sep 4 15:47:22.704262 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz
Sep 4 15:47:22.704267 kernel: vmware: using clock offset of 3394333693 ns
Sep 4 15:47:22.704272 kernel: tsc: Detected 3408.000 MHz processor
Sep 4 15:47:22.704277 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Sep 4 15:47:22.704282 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Sep 4 15:47:22.704287 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000
Sep 4 15:47:22.704292 kernel: total RAM covered: 3072M
Sep 4 15:47:22.704297 kernel: Found optimal setting for mtrr clean up
Sep 4 15:47:22.704303 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G
Sep 4 15:47:22.704308 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs
Sep 4 15:47:22.704313 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Sep 4 15:47:22.704317 kernel: Using GB pages for direct mapping
Sep 4 15:47:22.704322 kernel: ACPI: Early table checksum verification disabled
Sep 4 15:47:22.704327 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD )
Sep 4 15:47:22.704332 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272)
Sep 4 15:47:22.704337 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240)
Sep 4 15:47:22.704343 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001)
Sep 4 15:47:22.704349 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Sep 4 15:47:22.704354 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Sep 4 15:47:22.704359 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001)
Sep 4 15:47:22.704364 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? APIC 06040000 LTP 00000000)
Sep 4 15:47:22.704369 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001)
Sep 4 15:47:22.704375 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001)
Sep 4 15:47:22.704380 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001)
Sep 4 15:47:22.704385 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001)
Sep 4 15:47:22.704390 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66]
Sep 4 15:47:22.704395 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72]
Sep 4 15:47:22.704400 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Sep 4 15:47:22.704405 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Sep 4 15:47:22.704410 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54]
Sep 4 15:47:22.704415 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c]
Sep 4 15:47:22.704421 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea]
Sep 4 15:47:22.704426 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe]
Sep 4 15:47:22.704431 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756]
Sep 4 15:47:22.704436 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e]
Sep 4 15:47:22.704441 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Sep 4 15:47:22.704446 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Sep 4 15:47:22.704451 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug
Sep 4 15:47:22.704456 kernel: NUMA: Node 0 [mem 0x00001000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00001000-0x7fffffff]
Sep 4 15:47:22.704461 kernel: NODE_DATA(0) allocated [mem 0x7fff8dc0-0x7fffffff]
Sep 4 15:47:22.704467 kernel: Zone ranges:
Sep 4 15:47:22.704472 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Sep 4 15:47:22.704478 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff]
Sep 4 15:47:22.704483 kernel: Normal empty
Sep 4 15:47:22.704487 kernel: Device empty
Sep 4 15:47:22.704492 kernel: Movable zone start for each node
Sep 4 15:47:22.704660 kernel: Early memory node ranges
Sep 4 15:47:22.704666 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff]
Sep 4 15:47:22.704671 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff]
Sep 4 15:47:22.704676 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff]
Sep 4 15:47:22.704684 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff]
Sep 4 15:47:22.704689 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 4 15:47:22.704694 kernel: On node 0, zone DMA: 98 pages in unavailable ranges
Sep 4 15:47:22.704699 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges
Sep 4 15:47:22.704704 kernel: ACPI: PM-Timer IO Port: 0x1008
Sep 4 15:47:22.704709 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1])
Sep 4 15:47:22.704714 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1])
Sep 4 15:47:22.704719 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1])
Sep 4 15:47:22.704724 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1])
Sep 4 15:47:22.705830 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1])
Sep 4 15:47:22.705837 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1])
Sep 4 15:47:22.705842 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1])
Sep 4 15:47:22.705847 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1])
Sep 4 15:47:22.705852 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1])
Sep 4 15:47:22.705857 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1])
Sep 4 15:47:22.705862 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1])
Sep 4 15:47:22.705867 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1])
Sep 4 15:47:22.705872 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1])
Sep 4 15:47:22.705877 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1])
Sep 4 15:47:22.705883 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1])
Sep 4 15:47:22.705888 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1])
Sep 4 15:47:22.705893 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1])
Sep 4 15:47:22.705898 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1])
Sep 4 15:47:22.705903 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1])
Sep 4 15:47:22.705908 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1])
Sep 4 15:47:22.705913 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1])
Sep 4 15:47:22.705918 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1])
Sep 4 15:47:22.705923 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1])
Sep 4 15:47:22.705929 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1])
Sep 4 15:47:22.705934 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1])
Sep 4 15:47:22.705939 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1])
Sep 4 15:47:22.705944 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1])
Sep 4 15:47:22.705950 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1])
Sep 4 15:47:22.705958 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1])
Sep 4 15:47:22.705964 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1])
Sep 4 15:47:22.705971 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1])
Sep 4 15:47:22.705979 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1])
Sep 4 15:47:22.705991 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1])
Sep 4 15:47:22.706000 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1])
Sep 4 15:47:22.706010 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1])
Sep 4 15:47:22.706015 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1])
Sep 4 15:47:22.706020 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1])
Sep 4 15:47:22.706025 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1])
Sep 4 15:47:22.706030 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1])
Sep 4 15:47:22.706035 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1])
Sep 4 15:47:22.706044 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1])
Sep 4 15:47:22.706051 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1])
Sep 4 15:47:22.706056 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1])
Sep 4 15:47:22.706063 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1])
Sep 4 15:47:22.706068 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1])
Sep 4 15:47:22.706073 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1])
Sep 4 15:47:22.706078 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1])
Sep 4 15:47:22.706084 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1])
Sep 4 15:47:22.706089 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1])
Sep 4 15:47:22.706094 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1])
Sep 4 15:47:22.706099 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1])
Sep 4 15:47:22.706105 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1])
Sep 4 15:47:22.706111 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1])
Sep 4 15:47:22.706116 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1])
Sep 4 15:47:22.706122 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1])
Sep 4 15:47:22.706127 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1])
Sep 4 15:47:22.706132 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1])
Sep 4 15:47:22.706138 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1])
Sep 4 15:47:22.706143 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1])
Sep 4 15:47:22.706148 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1])
Sep 4 15:47:22.706154 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1])
Sep 4 15:47:22.706160 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1])
Sep 4 15:47:22.706166 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1])
Sep 4 15:47:22.706171 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1])
Sep 4 15:47:22.706176 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1])
Sep 4 15:47:22.706181 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1])
Sep 4 15:47:22.706187 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1])
Sep 4 15:47:22.706192 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1])
Sep 4 15:47:22.706198 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1])
Sep 4 15:47:22.706203 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1])
Sep 4 15:47:22.706208 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1])
Sep 4 15:47:22.706214 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1])
Sep 4 15:47:22.706220 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1])
Sep 4 15:47:22.706225 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1])
Sep 4 15:47:22.706230 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1])
Sep 4 15:47:22.706236 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1])
Sep 4 15:47:22.706241 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1])
Sep 4 15:47:22.706246 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1])
Sep 4 15:47:22.706252 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1])
Sep 4 15:47:22.706257 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1])
Sep 4 15:47:22.706263 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1])
Sep 4 15:47:22.706269 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1])
Sep 4 15:47:22.706274 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1])
Sep 4 15:47:22.706279 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1])
Sep 4 15:47:22.706285 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1])
Sep 4 15:47:22.706290 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1])
Sep 4 15:47:22.706295 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1])
Sep 4 15:47:22.706300 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1])
Sep 4 15:47:22.706305 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1])
Sep 4 15:47:22.706311 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1])
Sep 4 15:47:22.706316 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1])
Sep 4 15:47:22.706323 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1])
Sep 4 15:47:22.706328 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1])
Sep 4 15:47:22.706333 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1])
Sep 4 15:47:22.706339 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1])
Sep 4 15:47:22.706344 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1])
Sep 4 15:47:22.706349 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1])
Sep 4 15:47:22.706354 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1])
Sep 4 15:47:22.706360 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1])
Sep 4 15:47:22.706365 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1])
Sep 4 15:47:22.706370 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1])
Sep 4 15:47:22.706376 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1])
Sep 4 15:47:22.706382 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1])
Sep 4 15:47:22.706387 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1])
Sep 4 15:47:22.706392 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1])
Sep 4 15:47:22.706398 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1])
Sep 4 15:47:22.706403 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1])
Sep 4 15:47:22.706408 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1])
Sep 4 15:47:22.706414 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1])
Sep 4 15:47:22.706419 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1])
Sep 4 15:47:22.706425 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1])
Sep 4 15:47:22.706430 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1])
Sep 4 15:47:22.706436 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1])
Sep 4 15:47:22.706441 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1])
Sep 4 15:47:22.706446 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1])
Sep 4 15:47:22.706451 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1])
Sep 4 15:47:22.706457 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1])
Sep 4 15:47:22.706462 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1])
Sep 4 15:47:22.706467 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1])
Sep 4 15:47:22.706473 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1])
Sep 4 15:47:22.706479 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1])
Sep 4 15:47:22.706484 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1])
Sep 4 15:47:22.706490 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1])
Sep 4 15:47:22.706506 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1])
Sep 4 15:47:22.706513 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1])
Sep 4 15:47:22.706518 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1])
Sep 4 15:47:22.706524 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1])
Sep 4 15:47:22.706529 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1])
Sep 4 15:47:22.706534 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23
Sep 4 15:47:22.706540 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge)
Sep 4 15:47:22.706548 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Sep 4 15:47:22.706553 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000
Sep 4 15:47:22.706559 kernel: TSC deadline timer available
Sep 4 15:47:22.706564 kernel: CPU topo: Max. logical packages: 128
Sep 4 15:47:22.706570 kernel: CPU topo: Max. logical dies: 128
Sep 4 15:47:22.706575 kernel: CPU topo: Max. dies per package: 1
Sep 4 15:47:22.706580 kernel: CPU topo: Max. threads per core: 1
Sep 4 15:47:22.706586 kernel: CPU topo: Num. cores per package: 1
Sep 4 15:47:22.706591 kernel: CPU topo: Num. threads per package: 1
Sep 4 15:47:22.706598 kernel: CPU topo: Allowing 2 present CPUs plus 126 hotplug CPUs
Sep 4 15:47:22.706603 kernel: [mem 0x80000000-0xefffffff] available for PCI devices
Sep 4 15:47:22.706609 kernel: Booting paravirtualized kernel on VMware hypervisor
Sep 4 15:47:22.706614 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Sep 4 15:47:22.706620 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1
Sep 4 15:47:22.706625 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144
Sep 4 15:47:22.706631 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152
Sep 4 15:47:22.706636 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007
Sep 4 15:47:22.706642 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015
Sep 4 15:47:22.706647 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023
Sep 4 15:47:22.706654 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031
Sep 4 15:47:22.706659 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039
Sep 4 15:47:22.706664 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047
Sep 4 15:47:22.706670 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055
Sep 4 15:47:22.706675 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063
Sep 4 15:47:22.706681 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071
Sep 4 15:47:22.706686 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079
Sep 4 15:47:22.706691 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087
Sep 4 15:47:22.706698 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095
Sep 4 15:47:22.706703 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103
Sep 4 15:47:22.706709 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111
Sep 4 15:47:22.706714 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119
Sep 4 15:47:22.706719 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127
Sep 4 15:47:22.706725 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=36c924095f449c8931a6685ec70d72df97f8ad57d1c78208ae0ead8cae8f5127
Sep 4 15:47:22.706731 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 4 15:47:22.706737 kernel: random: crng init done
Sep 4 15:47:22.706747 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes
Sep 4 15:47:22.706753 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes
Sep 4 15:47:22.706758 kernel: printk: log_buf_len min size: 262144 bytes
Sep 4 15:47:22.706764 kernel: printk: log_buf_len: 1048576 bytes
Sep 4 15:47:22.706772 kernel: printk: early log buf free: 245592(93%)
Sep 4 15:47:22.706778 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 4 15:47:22.706784 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Sep 4 15:47:22.706789 kernel: Fallback order for Node 0: 0
Sep 4 15:47:22.706795 kernel: Built 1 zonelists, mobility grouping on. Total pages: 524157
Sep 4 15:47:22.706804 kernel: Policy zone: DMA32
Sep 4 15:47:22.706810 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 4 15:47:22.706816 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1
Sep 4 15:47:22.706821 kernel: ftrace: allocating 40102 entries in 157 pages
Sep 4 15:47:22.706826 kernel: ftrace: allocated 157 pages with 5 groups
Sep 4 15:47:22.706835 kernel: Dynamic Preempt: voluntary
Sep 4 15:47:22.706841 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 4 15:47:22.706847 kernel: rcu: RCU event tracing is enabled.
Sep 4 15:47:22.706852 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128.
Sep 4 15:47:22.706859 kernel: Trampoline variant of Tasks RCU enabled.
Sep 4 15:47:22.706868 kernel: Rude variant of Tasks RCU enabled.
Sep 4 15:47:22.706873 kernel: Tracing variant of Tasks RCU enabled.
Sep 4 15:47:22.706878 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 4 15:47:22.706884 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128
Sep 4 15:47:22.706890 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Sep 4 15:47:22.706895 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Sep 4 15:47:22.706901 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Sep 4 15:47:22.706906 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16
Sep 4 15:47:22.706912 kernel: rcu: srcu_init: Setting srcu_struct sizes to big.
Sep 4 15:47:22.706918 kernel: Console: colour VGA+ 80x25
Sep 4 15:47:22.706924 kernel: printk: legacy console [tty0] enabled
Sep 4 15:47:22.706929 kernel: printk: legacy console [ttyS0] enabled
Sep 4 15:47:22.706938 kernel: ACPI: Core revision 20240827
Sep 4 15:47:22.706944 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns
Sep 4 15:47:22.706949 kernel: APIC: Switch to symmetric I/O mode setup
Sep 4 15:47:22.706955 kernel: x2apic enabled
Sep 4 15:47:22.706960 kernel: APIC: Switched APIC routing to: physical x2apic
Sep 4 15:47:22.706965 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Sep 4 15:47:22.706972 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns
Sep 4 15:47:22.706978 kernel: Calibrating delay loop (skipped) preset value.. 6816.00 BogoMIPS (lpj=3408000)
Sep 4 15:47:22.706983 kernel: Disabled fast string operations
Sep 4 15:47:22.706989 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Sep 4 15:47:22.706994 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4
Sep 4 15:47:22.706999 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Sep 4 15:47:22.707005 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit
Sep 4 15:47:22.707011 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Sep 4 15:47:22.707016 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Sep 4 15:47:22.707023 kernel: RETBleed: Mitigation: Enhanced IBRS
Sep 4 15:47:22.707029 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Sep 4 15:47:22.707034 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Sep 4 15:47:22.707039 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Sep 4 15:47:22.707053 kernel: SRBDS: Unknown: Dependent on hypervisor status
Sep 4 15:47:22.707064 kernel: GDS: Unknown: Dependent on hypervisor status
Sep 4 15:47:22.707078 kernel: active return thunk: its_return_thunk
Sep 4 15:47:22.707086 kernel: ITS: Mitigation: Aligned branch/return thunks
Sep 4 15:47:22.707091 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Sep 4 15:47:22.707099 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Sep 4 15:47:22.707105 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Sep 4 15:47:22.707110 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Sep 4 15:47:22.707116 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Sep 4 15:47:22.707121 kernel: Freeing SMP alternatives memory: 32K
Sep 4 15:47:22.707127 kernel: pid_max: default: 131072 minimum: 1024
Sep 4 15:47:22.707136 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 4 15:47:22.707142 kernel: landlock: Up and running.
Sep 4 15:47:22.707148 kernel: SELinux: Initializing.
Sep 4 15:47:22.707155 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Sep 4 15:47:22.707161 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Sep 4 15:47:22.707166 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd)
Sep 4 15:47:22.707172 kernel: Performance Events: Skylake events, core PMU driver.
Sep 4 15:47:22.707177 kernel: core: CPUID marked event: 'cpu cycles' unavailable
Sep 4 15:47:22.707183 kernel: core: CPUID marked event: 'instructions' unavailable
Sep 4 15:47:22.707188 kernel: core: CPUID marked event: 'bus cycles' unavailable
Sep 4 15:47:22.707194 kernel: core: CPUID marked event: 'cache references' unavailable
Sep 4 15:47:22.707200 kernel: core: CPUID marked event: 'cache misses' unavailable
Sep 4 15:47:22.707206 kernel: core: CPUID marked event: 'branch instructions' unavailable
Sep 4 15:47:22.707211 kernel: core: CPUID marked event: 'branch misses' unavailable
Sep 4 15:47:22.707216 kernel: ... version: 1
Sep 4 15:47:22.707222 kernel: ... bit width: 48
Sep 4 15:47:22.707227 kernel: ... generic registers: 4
Sep 4 15:47:22.707233 kernel: ... value mask: 0000ffffffffffff
Sep 4 15:47:22.707238 kernel: ... max period: 000000007fffffff
Sep 4 15:47:22.707243 kernel: ... fixed-purpose events: 0
Sep 4 15:47:22.707250 kernel: ... event mask: 000000000000000f
Sep 4 15:47:22.707255 kernel: signal: max sigframe size: 1776
Sep 4 15:47:22.707261 kernel: rcu: Hierarchical SRCU implementation.
Sep 4 15:47:22.707266 kernel: rcu: Max phase no-delay instances is 400.
Sep 4 15:47:22.707272 kernel: Timer migration: 3 hierarchy levels; 8 children per group; 3 crossnode level
Sep 4 15:47:22.707277 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Sep 4 15:47:22.707283 kernel: smp: Bringing up secondary CPUs ...
Sep 4 15:47:22.707288 kernel: smpboot: x86: Booting SMP configuration:
Sep 4 15:47:22.707294 kernel: .... node #0, CPUs: #1
Sep 4 15:47:22.707300 kernel: Disabled fast string operations
Sep 4 15:47:22.707305 kernel: smp: Brought up 1 node, 2 CPUs
Sep 4 15:47:22.707311 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS)
Sep 4 15:47:22.707316 kernel: Memory: 1924244K/2096628K available (14336K kernel code, 2428K rwdata, 9988K rodata, 54076K init, 2892K bss, 161008K reserved, 0K cma-reserved)
Sep 4 15:47:22.707322 kernel: devtmpfs: initialized
Sep 4 15:47:22.707327 kernel: x86/mm: Memory block size: 128MB
Sep 4 15:47:22.707333 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes)
Sep 4 15:47:22.707338 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 4 15:47:22.707344 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear)
Sep 4 15:47:22.707350 kernel: pinctrl core: initialized pinctrl subsystem
Sep 4 15:47:22.707356 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 4 15:47:22.707361 kernel: audit: initializing netlink subsys (disabled)
Sep 4 15:47:22.707367 kernel: audit: type=2000 audit(1757000839.273:1): state=initialized audit_enabled=0 res=1
Sep 4 15:47:22.707372 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 4 15:47:22.707378 kernel: thermal_sys: Registered thermal governor 'user_space'
Sep 4 15:47:22.707383 kernel: cpuidle: using governor menu
Sep 4 15:47:22.707389 kernel: Simple Boot Flag at 0x36 set to 0x80
Sep 4 15:47:22.707394 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 4 15:47:22.707400 kernel: dca service started, version 1.12.1
Sep 4 15:47:22.707413 kernel: PCI: ECAM [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) for domain 0000 [bus 00-7f]
Sep 4 15:47:22.707419 kernel: PCI: Using configuration type 1 for base access
Sep 4 15:47:22.707425 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Sep 4 15:47:22.707431 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 4 15:47:22.707437 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Sep 4 15:47:22.707442 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 4 15:47:22.707448 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Sep 4 15:47:22.707454 kernel: ACPI: Added _OSI(Module Device)
Sep 4 15:47:22.707461 kernel: ACPI: Added _OSI(Processor Device)
Sep 4 15:47:22.707466 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 4 15:47:22.707472 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 4 15:47:22.707477 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored
Sep 4 15:47:22.707483 kernel: ACPI: Interpreter enabled
Sep 4 15:47:22.707489 kernel: ACPI: PM: (supports S0 S1 S5)
Sep 4 15:47:22.707700 kernel: ACPI: Using IOAPIC for interrupt routing
Sep 4 15:47:22.707709 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Sep 4 15:47:22.707715 kernel: PCI: Using E820 reservations for host bridge windows
Sep 4 15:47:22.707724 kernel: ACPI: Enabled 4 GPEs in block 00 to 0F
Sep 4 15:47:22.707734 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f])
Sep 4 15:47:22.707856 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 4 15:47:22.707911 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR]
Sep 4 15:47:22.707961 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability]
Sep 4 15:47:22.707970 kernel: PCI host bridge to bus 0000:00
Sep 4 15:47:22.708021 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Sep 4 15:47:22.708070 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window]
Sep 4 15:47:22.708114 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Sep 4 15:47:22.708157 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Sep 4 15:47:22.708200 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window]
Sep 4 15:47:22.708243 kernel: pci_bus 0000:00: root bus resource [bus 00-7f]
Sep 4 15:47:22.708306 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 conventional PCI endpoint
Sep 4 15:47:22.708368 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 conventional PCI bridge
Sep 4 15:47:22.708420 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Sep 4 15:47:22.708490 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 conventional PCI endpoint
Sep 4 15:47:22.708588 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a conventional PCI endpoint
Sep 4 15:47:22.708646 kernel: pci 0000:00:07.1: BAR 4 [io 0x1060-0x106f]
Sep 4 15:47:22.708699 kernel: pci 0000:00:07.1: BAR 0 [io 0x01f0-0x01f7]: legacy IDE quirk
Sep 4 15:47:22.708759 kernel: pci 0000:00:07.1: BAR 1 [io 0x03f6]: legacy IDE quirk
Sep 4 15:47:22.708810 kernel: pci 0000:00:07.1: BAR 2 [io 0x0170-0x0177]: legacy IDE quirk
Sep 4 15:47:22.708860 kernel: pci 0000:00:07.1: BAR 3 [io 0x0376]: legacy IDE quirk
Sep 4 15:47:22.708914 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Sep 4 15:47:22.708965 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed by PIIX4 ACPI
Sep 4 15:47:22.709019 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB
Sep 4 15:47:22.709074 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000 conventional PCI endpoint
Sep 4 15:47:22.709124 kernel: pci 0000:00:07.7: BAR 0 [io 0x1080-0x10bf]
Sep 4 15:47:22.709174 kernel: pci 0000:00:07.7: BAR 1 [mem 0xfebfe000-0xfebfffff 64bit]
Sep 4 15:47:22.709229 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 conventional PCI endpoint
Sep 4 15:47:22.709279 kernel: pci 0000:00:0f.0: BAR 0 [io 0x1070-0x107f]
Sep 4 15:47:22.709331 kernel: pci 0000:00:0f.0: BAR 1 [mem 0xe8000000-0xefffffff pref]
Sep 4 15:47:22.709381 kernel: pci 0000:00:0f.0: BAR 2 [mem 0xfe000000-0xfe7fffff]
Sep 4 15:47:22.709430 kernel: pci 0000:00:0f.0: ROM [mem 0x00000000-0x00007fff pref]
Sep 4 15:47:22.709479 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Sep 4 15:47:22.710301 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 conventional PCI bridge
Sep 4 15:47:22.710356 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode)
Sep 4 15:47:22.710408 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff]
Sep 4 15:47:22.710476 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff]
Sep 4 15:47:22.710536 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref]
Sep 4 15:47:22.710592 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Sep 4 15:47:22.710643 kernel: pci 0000:00:15.0: PCI bridge to [bus 03]
Sep 4 15:47:22.710694 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff]
Sep 4 15:47:22.710757 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff]
Sep 4 15:47:22.710813 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold
Sep 4 15:47:22.710872 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Sep 4 15:47:22.710923 kernel: pci 0000:00:15.1: PCI bridge to [bus 04]
Sep 4 15:47:22.710974 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff]
Sep 4 15:47:22.711025 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff]
Sep 4 15:47:22.711076 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref]
Sep 4 15:47:22.711126 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold
Sep 4 15:47:22.711184 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Sep 4 15:47:22.711238 kernel: pci 0000:00:15.2: PCI bridge to [bus 05]
Sep 4 15:47:22.711289 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff]
Sep 4 15:47:22.711339 kernel: pci 0000:00:15.2: bridge window [mem 
0xfcd00000-0xfcdfffff] Sep 4 15:47:22.711390 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Sep 4 15:47:22.711440 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold Sep 4 15:47:22.711501 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:47:22.711558 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Sep 4 15:47:22.711610 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Sep 4 15:47:22.711660 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Sep 4 15:47:22.711711 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold Sep 4 15:47:22.711765 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:47:22.711815 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Sep 4 15:47:22.711866 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Sep 4 15:47:22.711918 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Sep 4 15:47:22.711969 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold Sep 4 15:47:22.712023 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:47:22.712073 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Sep 4 15:47:22.712123 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Sep 4 15:47:22.712188 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Sep 4 15:47:22.712242 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot D3cold Sep 4 15:47:22.712300 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:47:22.712354 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Sep 4 15:47:22.712404 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Sep 4 15:47:22.712455 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Sep 4 15:47:22.714542 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold Sep 4 
15:47:22.714622 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:47:22.714679 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Sep 4 15:47:22.714736 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Sep 4 15:47:22.714788 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Sep 4 15:47:22.714838 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold Sep 4 15:47:22.714894 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:47:22.714946 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Sep 4 15:47:22.714996 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Sep 4 15:47:22.715047 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Sep 4 15:47:22.715097 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold Sep 4 15:47:22.715155 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:47:22.715206 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Sep 4 15:47:22.715256 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Sep 4 15:47:22.715305 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Sep 4 15:47:22.715355 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Sep 4 15:47:22.715405 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold Sep 4 15:47:22.715460 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:47:22.716614 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Sep 4 15:47:22.716676 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Sep 4 15:47:22.716729 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Sep 4 15:47:22.716789 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Sep 4 15:47:22.716839 kernel: pci 0000:00:16.2: PME# supported from D0 D3hot D3cold Sep 4 15:47:22.716895 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 PCIe 
Root Port Sep 4 15:47:22.716947 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Sep 4 15:47:22.717000 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Sep 4 15:47:22.717049 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Sep 4 15:47:22.717101 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold Sep 4 15:47:22.717156 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:47:22.717212 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Sep 4 15:47:22.717274 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Sep 4 15:47:22.717325 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Sep 4 15:47:22.717388 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold Sep 4 15:47:22.717449 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:47:22.718553 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Sep 4 15:47:22.718629 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Sep 4 15:47:22.718690 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Sep 4 15:47:22.718742 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold Sep 4 15:47:22.718798 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:47:22.718854 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Sep 4 15:47:22.718903 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Sep 4 15:47:22.718954 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Sep 4 15:47:22.719003 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold Sep 4 15:47:22.719058 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:47:22.719109 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Sep 4 15:47:22.719160 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Sep 4 15:47:22.719213 kernel: pci 0000:00:16.7: 
bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Sep 4 15:47:22.719263 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold Sep 4 15:47:22.719316 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:47:22.719367 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Sep 4 15:47:22.719417 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Sep 4 15:47:22.719466 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Sep 4 15:47:22.720198 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Sep 4 15:47:22.720256 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold Sep 4 15:47:22.720319 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:47:22.720373 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Sep 4 15:47:22.720424 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Sep 4 15:47:22.720477 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Sep 4 15:47:22.720541 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Sep 4 15:47:22.720593 kernel: pci 0000:00:17.1: PME# supported from D0 D3hot D3cold Sep 4 15:47:22.720648 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:47:22.720699 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Sep 4 15:47:22.720762 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Sep 4 15:47:22.720814 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Sep 4 15:47:22.720868 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Sep 4 15:47:22.720917 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold Sep 4 15:47:22.720972 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:47:22.721022 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Sep 4 15:47:22.721072 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Sep 4 15:47:22.721121 
kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Sep 4 15:47:22.721170 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold Sep 4 15:47:22.721227 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:47:22.721278 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Sep 4 15:47:22.721346 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Sep 4 15:47:22.721397 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Sep 4 15:47:22.721448 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold Sep 4 15:47:22.721593 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:47:22.721649 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Sep 4 15:47:22.721702 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Sep 4 15:47:22.721751 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Sep 4 15:47:22.721801 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold Sep 4 15:47:22.721869 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:47:22.721921 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Sep 4 15:47:22.721970 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Sep 4 15:47:22.722020 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Sep 4 15:47:22.722069 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold Sep 4 15:47:22.722127 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:47:22.722177 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Sep 4 15:47:22.722227 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Sep 4 15:47:22.722276 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Sep 4 15:47:22.722325 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold Sep 4 15:47:22.722389 kernel: pci 0000:00:18.0: [15ad:07a0] 
type 01 class 0x060400 PCIe Root Port Sep 4 15:47:22.722448 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Sep 4 15:47:22.722514 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Sep 4 15:47:22.722571 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Sep 4 15:47:22.722620 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Sep 4 15:47:22.722670 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold Sep 4 15:47:22.722726 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:47:22.722777 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Sep 4 15:47:22.722827 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Sep 4 15:47:22.722880 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Sep 4 15:47:22.722930 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Sep 4 15:47:22.722979 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold Sep 4 15:47:22.723036 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:47:22.723087 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Sep 4 15:47:22.723138 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Sep 4 15:47:22.723187 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Sep 4 15:47:22.723239 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold Sep 4 15:47:22.723294 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:47:22.723345 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Sep 4 15:47:22.723395 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Sep 4 15:47:22.723445 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Sep 4 15:47:22.723513 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold Sep 4 15:47:22.723572 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:47:22.723626 
kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Sep 4 15:47:22.723676 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Sep 4 15:47:22.723726 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Sep 4 15:47:22.723781 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold Sep 4 15:47:22.723834 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:47:22.723885 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Sep 4 15:47:22.723934 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Sep 4 15:47:22.723987 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Sep 4 15:47:22.724038 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold Sep 4 15:47:22.724094 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:47:22.724145 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Sep 4 15:47:22.724195 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Sep 4 15:47:22.724244 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Sep 4 15:47:22.724293 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold Sep 4 15:47:22.724347 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:47:22.724400 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Sep 4 15:47:22.724459 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Sep 4 15:47:22.724555 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Sep 4 15:47:22.724621 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold Sep 4 15:47:22.724695 kernel: pci_bus 0000:01: extended config space not accessible Sep 4 15:47:22.724764 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Sep 4 15:47:22.724828 kernel: pci_bus 0000:02: extended config space not accessible Sep 4 15:47:22.724841 kernel: acpiphp: Slot [32] registered Sep 4 15:47:22.724850 kernel: acpiphp: Slot 
[33] registered Sep 4 15:47:22.724856 kernel: acpiphp: Slot [34] registered Sep 4 15:47:22.724862 kernel: acpiphp: Slot [35] registered Sep 4 15:47:22.724868 kernel: acpiphp: Slot [36] registered Sep 4 15:47:22.724879 kernel: acpiphp: Slot [37] registered Sep 4 15:47:22.724886 kernel: acpiphp: Slot [38] registered Sep 4 15:47:22.724891 kernel: acpiphp: Slot [39] registered Sep 4 15:47:22.724897 kernel: acpiphp: Slot [40] registered Sep 4 15:47:22.724910 kernel: acpiphp: Slot [41] registered Sep 4 15:47:22.724920 kernel: acpiphp: Slot [42] registered Sep 4 15:47:22.724930 kernel: acpiphp: Slot [43] registered Sep 4 15:47:22.724944 kernel: acpiphp: Slot [44] registered Sep 4 15:47:22.724951 kernel: acpiphp: Slot [45] registered Sep 4 15:47:22.724956 kernel: acpiphp: Slot [46] registered Sep 4 15:47:22.724962 kernel: acpiphp: Slot [47] registered Sep 4 15:47:22.724968 kernel: acpiphp: Slot [48] registered Sep 4 15:47:22.724977 kernel: acpiphp: Slot [49] registered Sep 4 15:47:22.724985 kernel: acpiphp: Slot [50] registered Sep 4 15:47:22.724991 kernel: acpiphp: Slot [51] registered Sep 4 15:47:22.724997 kernel: acpiphp: Slot [52] registered Sep 4 15:47:22.725002 kernel: acpiphp: Slot [53] registered Sep 4 15:47:22.725008 kernel: acpiphp: Slot [54] registered Sep 4 15:47:22.725014 kernel: acpiphp: Slot [55] registered Sep 4 15:47:22.725020 kernel: acpiphp: Slot [56] registered Sep 4 15:47:22.725026 kernel: acpiphp: Slot [57] registered Sep 4 15:47:22.725031 kernel: acpiphp: Slot [58] registered Sep 4 15:47:22.725037 kernel: acpiphp: Slot [59] registered Sep 4 15:47:22.725044 kernel: acpiphp: Slot [60] registered Sep 4 15:47:22.725050 kernel: acpiphp: Slot [61] registered Sep 4 15:47:22.725056 kernel: acpiphp: Slot [62] registered Sep 4 15:47:22.725062 kernel: acpiphp: Slot [63] registered Sep 4 15:47:22.725118 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Sep 4 15:47:22.725169 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff 
window] (subtractive decode) Sep 4 15:47:22.725218 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode) Sep 4 15:47:22.725268 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode) Sep 4 15:47:22.725320 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode) Sep 4 15:47:22.725369 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode) Sep 4 15:47:22.725428 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 PCIe Endpoint Sep 4 15:47:22.725481 kernel: pci 0000:03:00.0: BAR 0 [io 0x4000-0x4007] Sep 4 15:47:22.726240 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfd5f8000-0xfd5fffff 64bit] Sep 4 15:47:22.726301 kernel: pci 0000:03:00.0: ROM [mem 0x00000000-0x0000ffff pref] Sep 4 15:47:22.726356 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Sep 4 15:47:22.726413 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force' Sep 4 15:47:22.726467 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Sep 4 15:47:22.726800 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Sep 4 15:47:22.726871 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Sep 4 15:47:22.726926 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Sep 4 15:47:22.726980 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Sep 4 15:47:22.727032 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Sep 4 15:47:22.727086 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Sep 4 15:47:22.727142 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Sep 4 15:47:22.727200 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 PCIe Endpoint Sep 4 15:47:22.727252 kernel: pci 0000:0b:00.0: BAR 0 [mem 0xfd4fc000-0xfd4fcfff] Sep 4 15:47:22.727303 kernel: pci 0000:0b:00.0: BAR 1 [mem 0xfd4fd000-0xfd4fdfff] Sep 4 15:47:22.727354 kernel: pci 0000:0b:00.0: BAR 2 [mem 0xfd4fe000-0xfd4fffff] Sep 4 15:47:22.727405 kernel: pci 0000:0b:00.0: 
BAR 3 [io 0x5000-0x500f] Sep 4 15:47:22.727469 kernel: pci 0000:0b:00.0: ROM [mem 0x00000000-0x0000ffff pref] Sep 4 15:47:22.727535 kernel: pci 0000:0b:00.0: supports D1 D2 Sep 4 15:47:22.727587 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold Sep 4 15:47:22.727639 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force' Sep 4 15:47:22.727691 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Sep 4 15:47:22.727744 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Sep 4 15:47:22.727797 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Sep 4 15:47:22.727849 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Sep 4 15:47:22.727903 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Sep 4 15:47:22.727956 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Sep 4 15:47:22.728007 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Sep 4 15:47:22.728060 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Sep 4 15:47:22.728112 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Sep 4 15:47:22.728165 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Sep 4 15:47:22.728216 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Sep 4 15:47:22.728268 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Sep 4 15:47:22.728323 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Sep 4 15:47:22.728375 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Sep 4 15:47:22.728428 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Sep 4 15:47:22.728479 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Sep 4 15:47:22.728553 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Sep 4 15:47:22.728607 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Sep 4 15:47:22.728659 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Sep 4 15:47:22.728713 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Sep 4 15:47:22.728765 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Sep 4 15:47:22.728815 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Sep 4 15:47:22.728867 kernel: pci 
0000:00:18.6: PCI bridge to [bus 21] Sep 4 15:47:22.728917 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Sep 4 15:47:22.728926 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9 Sep 4 15:47:22.728933 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0 Sep 4 15:47:22.728941 kernel: ACPI: PCI: Interrupt link LNKB disabled Sep 4 15:47:22.728947 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Sep 4 15:47:22.728952 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10 Sep 4 15:47:22.728958 kernel: iommu: Default domain type: Translated Sep 4 15:47:22.728964 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Sep 4 15:47:22.728970 kernel: PCI: Using ACPI for IRQ routing Sep 4 15:47:22.728976 kernel: PCI: pci_cache_line_size set to 64 bytes Sep 4 15:47:22.728982 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff] Sep 4 15:47:22.728988 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff] Sep 4 15:47:22.729039 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device Sep 4 15:47:22.729093 kernel: pci 0000:00:0f.0: vgaarb: bridge control possible Sep 4 15:47:22.729144 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Sep 4 15:47:22.729153 kernel: vgaarb: loaded Sep 4 15:47:22.729159 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 Sep 4 15:47:22.729165 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter Sep 4 15:47:22.729171 kernel: clocksource: Switched to clocksource tsc-early Sep 4 15:47:22.729177 kernel: VFS: Disk quotas dquot_6.6.0 Sep 4 15:47:22.729183 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Sep 4 15:47:22.729190 kernel: pnp: PnP ACPI init Sep 4 15:47:22.729245 kernel: system 00:00: [io 0x1000-0x103f] has been reserved Sep 4 15:47:22.729293 kernel: system 00:00: [io 0x1040-0x104f] has been reserved Sep 4 15:47:22.729339 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved Sep 4 
15:47:22.729390 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved Sep 4 15:47:22.729440 kernel: pnp 00:06: [dma 2] Sep 4 15:47:22.729489 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved Sep 4 15:47:22.729562 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved Sep 4 15:47:22.729608 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved Sep 4 15:47:22.729617 kernel: pnp: PnP ACPI: found 8 devices Sep 4 15:47:22.729624 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Sep 4 15:47:22.729630 kernel: NET: Registered PF_INET protocol family Sep 4 15:47:22.729636 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Sep 4 15:47:22.729642 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Sep 4 15:47:22.729648 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Sep 4 15:47:22.729656 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Sep 4 15:47:22.729662 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Sep 4 15:47:22.729668 kernel: TCP: Hash tables configured (established 16384 bind 16384) Sep 4 15:47:22.729674 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Sep 4 15:47:22.729679 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Sep 4 15:47:22.729685 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Sep 4 15:47:22.729691 kernel: NET: Registered PF_XDP protocol family Sep 4 15:47:22.729748 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 Sep 4 15:47:22.729803 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Sep 4 15:47:22.729856 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Sep 4 15:47:22.729909 kernel: pci 0000:00:15.5: bridge 
window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Sep 4 15:47:22.729961 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Sep 4 15:47:22.730011 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Sep 4 15:47:22.730063 kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Sep 4 15:47:22.730115 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Sep 4 15:47:22.730166 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Sep 4 15:47:22.730220 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Sep 4 15:47:22.730272 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Sep 4 15:47:22.730322 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Sep 4 15:47:22.730376 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Sep 4 15:47:22.730427 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Sep 4 15:47:22.730478 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Sep 4 15:47:22.730536 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Sep 4 15:47:22.730588 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Sep 4 15:47:22.730643 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Sep 4 15:47:22.730694 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Sep 4 15:47:22.730756 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Sep 4 15:47:22.730810 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Sep 4 15:47:22.730862 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 
21] add_size 1000 Sep 4 15:47:22.730912 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000 Sep 4 15:47:22.730963 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref]: assigned Sep 4 15:47:22.731014 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref]: assigned Sep 4 15:47:22.731069 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.731119 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.731169 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.731219 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.731269 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.731318 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.731367 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.731419 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.731469 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.731545 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.731597 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.731661 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.731717 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.731916 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.732111 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.732168 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.732603 kernel: pci 0000:00:16.6: 
bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.732664 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.732720 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.732774 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.732829 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.732883 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.732936 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.732993 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.733047 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.733099 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.733152 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.733205 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.733259 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.733313 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.733368 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.733435 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.733490 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.733558 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.733613 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.733665 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.733719 kernel: pci 0000:00:18.5: 
bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.733771 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.733825 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.733880 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.733933 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.733986 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.734037 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.734089 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.734140 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.734191 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.734243 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.734297 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.734348 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.734401 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.734453 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.734514 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.735147 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.735213 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.735272 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.735325 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.735375 kernel: pci 0000:00:17.6: 
bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.735429 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.735479 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.735545 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.735598 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.735648 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.735698 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.735753 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.735803 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.735854 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.735907 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.735958 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.736010 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.736060 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.736112 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.736164 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.736215 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.736265 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.736320 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.736370 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.736431 kernel: pci 0000:00:15.6: 
bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.736487 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.737590 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.737651 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.737706 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.737776 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.737838 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:47:22.737896 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign Sep 4 15:47:22.737949 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Sep 4 15:47:22.738001 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] Sep 4 15:47:22.738051 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Sep 4 15:47:22.738100 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Sep 4 15:47:22.738150 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Sep 4 15:47:22.738209 kernel: pci 0000:03:00.0: ROM [mem 0xfd500000-0xfd50ffff pref]: assigned Sep 4 15:47:22.738265 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Sep 4 15:47:22.738315 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Sep 4 15:47:22.738365 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Sep 4 15:47:22.738414 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref] Sep 4 15:47:22.738466 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Sep 4 15:47:22.738526 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Sep 4 15:47:22.738591 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Sep 4 15:47:22.738643 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Sep 4 15:47:22.738695 kernel: pci 0000:00:15.2: PCI 
bridge to [bus 05] Sep 4 15:47:22.738749 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Sep 4 15:47:22.738803 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Sep 4 15:47:22.738853 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Sep 4 15:47:22.738904 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Sep 4 15:47:22.738954 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Sep 4 15:47:22.739004 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Sep 4 15:47:22.739054 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Sep 4 15:47:22.739104 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Sep 4 15:47:22.739155 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Sep 4 15:47:22.739208 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Sep 4 15:47:22.739258 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Sep 4 15:47:22.739308 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Sep 4 15:47:22.739358 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Sep 4 15:47:22.739408 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Sep 4 15:47:22.739459 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Sep 4 15:47:22.741828 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Sep 4 15:47:22.741890 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Sep 4 15:47:22.741942 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Sep 4 15:47:22.741998 kernel: pci 0000:0b:00.0: ROM [mem 0xfd400000-0xfd40ffff pref]: assigned Sep 4 15:47:22.742051 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Sep 4 15:47:22.742102 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Sep 4 15:47:22.742151 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Sep 4 15:47:22.742201 kernel: pci 0000:00:16.0: bridge window 
[mem 0xc0200000-0xc03fffff 64bit pref] Sep 4 15:47:22.742252 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Sep 4 15:47:22.742305 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Sep 4 15:47:22.742356 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Sep 4 15:47:22.742406 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Sep 4 15:47:22.742458 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Sep 4 15:47:22.742523 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Sep 4 15:47:22.742576 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Sep 4 15:47:22.742627 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Sep 4 15:47:22.742678 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Sep 4 15:47:22.742728 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Sep 4 15:47:22.742778 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Sep 4 15:47:22.742848 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Sep 4 15:47:22.742899 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Sep 4 15:47:22.742949 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Sep 4 15:47:22.743000 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Sep 4 15:47:22.743049 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Sep 4 15:47:22.743099 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Sep 4 15:47:22.743153 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Sep 4 15:47:22.743203 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Sep 4 15:47:22.743252 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Sep 4 15:47:22.743304 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Sep 4 15:47:22.743354 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Sep 4 15:47:22.743406 kernel: pci 0000:00:16.7: bridge window 
[mem 0xe5f00000-0xe5ffffff 64bit pref] Sep 4 15:47:22.743457 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Sep 4 15:47:22.743517 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Sep 4 15:47:22.743573 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Sep 4 15:47:22.743623 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Sep 4 15:47:22.743674 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Sep 4 15:47:22.743724 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Sep 4 15:47:22.743774 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Sep 4 15:47:22.743824 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Sep 4 15:47:22.743876 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Sep 4 15:47:22.743926 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Sep 4 15:47:22.743975 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Sep 4 15:47:22.744025 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Sep 4 15:47:22.744078 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Sep 4 15:47:22.744127 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Sep 4 15:47:22.744177 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Sep 4 15:47:22.744228 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Sep 4 15:47:22.744278 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Sep 4 15:47:22.744327 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Sep 4 15:47:22.744378 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Sep 4 15:47:22.744428 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Sep 4 15:47:22.744479 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Sep 4 15:47:22.744543 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Sep 4 15:47:22.744594 kernel: pci 0000:00:17.6: bridge window [mem 
0xfbb00000-0xfbbfffff] Sep 4 15:47:22.744644 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Sep 4 15:47:22.744696 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Sep 4 15:47:22.744750 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Sep 4 15:47:22.744800 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Sep 4 15:47:22.744854 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Sep 4 15:47:22.744904 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Sep 4 15:47:22.744956 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Sep 4 15:47:22.745016 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Sep 4 15:47:22.745068 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Sep 4 15:47:22.745117 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Sep 4 15:47:22.745167 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Sep 4 15:47:22.745218 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Sep 4 15:47:22.745268 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Sep 4 15:47:22.745321 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Sep 4 15:47:22.745370 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Sep 4 15:47:22.745422 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Sep 4 15:47:22.745472 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Sep 4 15:47:22.745532 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Sep 4 15:47:22.745591 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Sep 4 15:47:22.745646 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Sep 4 15:47:22.745696 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Sep 4 15:47:22.745752 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Sep 4 15:47:22.745802 kernel: pci 0000:00:18.5: bridge window [mem 
0xfbe00000-0xfbefffff] Sep 4 15:47:22.745852 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Sep 4 15:47:22.745904 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Sep 4 15:47:22.745954 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Sep 4 15:47:22.746004 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Sep 4 15:47:22.746058 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Sep 4 15:47:22.746108 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Sep 4 15:47:22.746158 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Sep 4 15:47:22.746209 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] Sep 4 15:47:22.746254 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] Sep 4 15:47:22.746297 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] Sep 4 15:47:22.746341 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] Sep 4 15:47:22.746384 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] Sep 4 15:47:22.746443 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] Sep 4 15:47:22.746515 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] Sep 4 15:47:22.746565 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] Sep 4 15:47:22.746611 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] Sep 4 15:47:22.746656 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] Sep 4 15:47:22.746701 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] Sep 4 15:47:22.746746 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] Sep 4 15:47:22.746794 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] Sep 4 15:47:22.746845 kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff] Sep 4 15:47:22.746891 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] Sep 4 15:47:22.746937 kernel: 
pci_bus 0000:03: resource 2 [mem 0xc0000000-0xc01fffff 64bit pref] Sep 4 15:47:22.746989 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] Sep 4 15:47:22.747036 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] Sep 4 15:47:22.747081 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] Sep 4 15:47:22.747144 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] Sep 4 15:47:22.747190 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] Sep 4 15:47:22.747235 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] Sep 4 15:47:22.747284 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] Sep 4 15:47:22.747329 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] Sep 4 15:47:22.747378 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] Sep 4 15:47:22.747427 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] Sep 4 15:47:22.747479 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] Sep 4 15:47:22.747556 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] Sep 4 15:47:22.747607 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] Sep 4 15:47:22.747653 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] Sep 4 15:47:22.747702 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] Sep 4 15:47:22.747753 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] Sep 4 15:47:22.747806 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] Sep 4 15:47:22.747851 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] Sep 4 15:47:22.747896 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] Sep 4 15:47:22.747945 kernel: pci_bus 0000:0c: resource 0 [io 0x9000-0x9fff] Sep 4 15:47:22.747991 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] Sep 4 15:47:22.748039 kernel: pci_bus 0000:0c: resource 2 [mem 
0xe7700000-0xe77fffff 64bit pref] Sep 4 15:47:22.748090 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] Sep 4 15:47:22.748135 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] Sep 4 15:47:22.748180 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] Sep 4 15:47:22.748230 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] Sep 4 15:47:22.748275 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] Sep 4 15:47:22.748325 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] Sep 4 15:47:22.748373 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] Sep 4 15:47:22.748422 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] Sep 4 15:47:22.748469 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] Sep 4 15:47:22.748535 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] Sep 4 15:47:22.748583 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] Sep 4 15:47:22.748633 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] Sep 4 15:47:22.748682 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] Sep 4 15:47:22.748734 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] Sep 4 15:47:22.748781 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] Sep 4 15:47:22.748826 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] Sep 4 15:47:22.748876 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] Sep 4 15:47:22.750522 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] Sep 4 15:47:22.750582 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] Sep 4 15:47:22.750639 kernel: pci_bus 0000:15: resource 0 [io 0xe000-0xefff] Sep 4 15:47:22.750687 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] Sep 4 15:47:22.750739 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] Sep 4 
15:47:22.750799 kernel: pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] Sep 4 15:47:22.750846 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] Sep 4 15:47:22.750898 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] Sep 4 15:47:22.750948 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] Sep 4 15:47:22.751000 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] Sep 4 15:47:22.751045 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] Sep 4 15:47:22.751097 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] Sep 4 15:47:22.751145 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] Sep 4 15:47:22.751194 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] Sep 4 15:47:22.751242 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] Sep 4 15:47:22.751293 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Sep 4 15:47:22.751340 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] Sep 4 15:47:22.751385 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] Sep 4 15:47:22.751438 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] Sep 4 15:47:22.751484 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] Sep 4 15:47:22.751560 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] Sep 4 15:47:22.751614 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] Sep 4 15:47:22.751660 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] Sep 4 15:47:22.751710 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] Sep 4 15:47:22.751756 kernel: pci_bus 0000:1e: resource 2 [mem 0xe6d00000-0xe6dfffff 64bit pref] Sep 4 15:47:22.751806 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] Sep 4 15:47:22.751851 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] Sep 4 15:47:22.751906 
kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] Sep 4 15:47:22.751952 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] Sep 4 15:47:22.752002 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] Sep 4 15:47:22.752048 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] Sep 4 15:47:22.752098 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] Sep 4 15:47:22.752144 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] Sep 4 15:47:22.752202 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Sep 4 15:47:22.752214 kernel: PCI: CLS 32 bytes, default 64 Sep 4 15:47:22.752220 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Sep 4 15:47:22.752226 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Sep 4 15:47:22.752232 kernel: clocksource: Switched to clocksource tsc Sep 4 15:47:22.752238 kernel: Initialise system trusted keyrings Sep 4 15:47:22.752244 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Sep 4 15:47:22.752250 kernel: Key type asymmetric registered Sep 4 15:47:22.752256 kernel: Asymmetric key parser 'x509' registered Sep 4 15:47:22.752263 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Sep 4 15:47:22.752269 kernel: io scheduler mq-deadline registered Sep 4 15:47:22.752274 kernel: io scheduler kyber registered Sep 4 15:47:22.752280 kernel: io scheduler bfq registered Sep 4 15:47:22.752333 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 Sep 4 15:47:22.752386 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:47:22.752439 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 Sep 4 15:47:22.752490 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ 
Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:47:22.752587 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 Sep 4 15:47:22.752659 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:47:22.752716 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 Sep 4 15:47:22.752767 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:47:22.752819 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 Sep 4 15:47:22.752869 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:47:22.752921 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 Sep 4 15:47:22.752974 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:47:22.753025 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 Sep 4 15:47:22.753074 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:47:22.753126 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 Sep 4 15:47:22.753176 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:47:22.753227 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 Sep 4 15:47:22.753278 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:47:22.753332 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 Sep 4 15:47:22.753381 kernel: pcieport 0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- 
HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:47:22.753431 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 Sep 4 15:47:22.753481 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:47:22.753550 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 Sep 4 15:47:22.753610 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:47:22.753664 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 Sep 4 15:47:22.753714 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:47:22.753774 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 Sep 4 15:47:22.753824 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:47:22.753877 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 Sep 4 15:47:22.753926 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:47:22.753979 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 Sep 4 15:47:22.754029 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:47:22.754080 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 Sep 4 15:47:22.754133 kernel: pcieport 0000:00:17.0: pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:47:22.754184 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 Sep 4 15:47:22.754234 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- 
PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:47:22.754285 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42 Sep 4 15:47:22.754335 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:47:22.754386 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43 Sep 4 15:47:22.754437 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:47:22.754491 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44 Sep 4 15:47:22.754551 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:47:22.754603 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45 Sep 4 15:47:22.754653 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:47:22.754704 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46 Sep 4 15:47:22.754754 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:47:22.754805 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47 Sep 4 15:47:22.754855 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:47:22.754909 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48 Sep 4 15:47:22.754959 kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:47:22.755010 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49 Sep 4 15:47:22.755060 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- 
AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:47:22.755112 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50 Sep 4 15:47:22.755162 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:47:22.755213 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51 Sep 4 15:47:22.755266 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:47:22.755317 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52 Sep 4 15:47:22.755367 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:47:22.755418 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53 Sep 4 15:47:22.755468 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:47:22.756113 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54 Sep 4 15:47:22.756178 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:47:22.756233 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55 Sep 4 15:47:22.756289 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:47:22.756301 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Sep 4 15:47:22.756308 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 4 15:47:22.756315 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Sep 4 15:47:22.756321 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 Sep 4 15:47:22.756327 kernel: serio: i8042 
KBD port at 0x60,0x64 irq 1 Sep 4 15:47:22.756333 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Sep 4 15:47:22.756388 kernel: rtc_cmos 00:01: registered as rtc0 Sep 4 15:47:22.756437 kernel: rtc_cmos 00:01: setting system clock to 2025-09-04T15:47:22 UTC (1757000842) Sep 4 15:47:22.756482 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram Sep 4 15:47:22.756491 kernel: intel_pstate: CPU model not supported Sep 4 15:47:22.756506 kernel: NET: Registered PF_INET6 protocol family Sep 4 15:47:22.756513 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Sep 4 15:47:22.756519 kernel: Segment Routing with IPv6 Sep 4 15:47:22.756527 kernel: In-situ OAM (IOAM) with IPv6 Sep 4 15:47:22.756534 kernel: NET: Registered PF_PACKET protocol family Sep 4 15:47:22.756541 kernel: Key type dns_resolver registered Sep 4 15:47:22.756547 kernel: IPI shorthand broadcast: enabled Sep 4 15:47:22.756554 kernel: sched_clock: Marking stable (2676338583, 170422768)->(2860411830, -13650479) Sep 4 15:47:22.756560 kernel: registered taskstats version 1 Sep 4 15:47:22.756566 kernel: Loading compiled-in X.509 certificates Sep 4 15:47:22.756573 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.44-flatcar: 1106dff6b31a2cb943a47c73d0d8dff07e2a7490' Sep 4 15:47:22.756579 kernel: Demotion targets for Node 0: null Sep 4 15:47:22.756585 kernel: Key type .fscrypt registered Sep 4 15:47:22.756591 kernel: Key type fscrypt-provisioning registered Sep 4 15:47:22.756599 kernel: ima: No TPM chip found, activating TPM-bypass! Sep 4 15:47:22.756605 kernel: ima: Allocated hash algorithm: sha1 Sep 4 15:47:22.756611 kernel: ima: No architecture policies found Sep 4 15:47:22.756617 kernel: clk: Disabling unused clocks Sep 4 15:47:22.756624 kernel: Warning: unable to open an initial console. 
Sep 4 15:47:22.756630 kernel: Freeing unused kernel image (initmem) memory: 54076K
Sep 4 15:47:22.756636 kernel: Write protecting the kernel read-only data: 24576k
Sep 4 15:47:22.756642 kernel: Freeing unused kernel image (rodata/data gap) memory: 252K
Sep 4 15:47:22.756650 kernel: Run /init as init process
Sep 4 15:47:22.756656 kernel: with arguments:
Sep 4 15:47:22.756662 kernel: /init
Sep 4 15:47:22.756669 kernel: with environment:
Sep 4 15:47:22.756675 kernel: HOME=/
Sep 4 15:47:22.756681 kernel: TERM=linux
Sep 4 15:47:22.756687 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Sep 4 15:47:22.756694 systemd[1]: Successfully made /usr/ read-only.
Sep 4 15:47:22.756703 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 4 15:47:22.756711 systemd[1]: Detected virtualization vmware.
Sep 4 15:47:22.756718 systemd[1]: Detected architecture x86-64.
Sep 4 15:47:22.756724 systemd[1]: Running in initrd.
Sep 4 15:47:22.756730 systemd[1]: No hostname configured, using default hostname.
Sep 4 15:47:22.756743 systemd[1]: Hostname set to .
Sep 4 15:47:22.756750 systemd[1]: Initializing machine ID from random generator.
Sep 4 15:47:22.756757 systemd[1]: Queued start job for default target initrd.target.
Sep 4 15:47:22.756765 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 4 15:47:22.756772 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 4 15:47:22.756779 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 4 15:47:22.756786 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 4 15:47:22.756792 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 4 15:47:22.756799 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 4 15:47:22.756806 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 4 15:47:22.756814 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 4 15:47:22.756820 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 4 15:47:22.756827 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 4 15:47:22.756833 systemd[1]: Reached target paths.target - Path Units.
Sep 4 15:47:22.756840 systemd[1]: Reached target slices.target - Slice Units.
Sep 4 15:47:22.756846 systemd[1]: Reached target swap.target - Swaps.
Sep 4 15:47:22.756852 systemd[1]: Reached target timers.target - Timer Units.
Sep 4 15:47:22.756859 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 4 15:47:22.756865 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 4 15:47:22.756873 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 4 15:47:22.756879 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Sep 4 15:47:22.756886 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 4 15:47:22.756893 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 4 15:47:22.756899 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 4 15:47:22.756906 systemd[1]: Reached target sockets.target - Socket Units.
Sep 4 15:47:22.756912 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 4 15:47:22.756918 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 4 15:47:22.756926 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Sep 4 15:47:22.756933 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Sep 4 15:47:22.756940 systemd[1]: Starting systemd-fsck-usr.service...
Sep 4 15:47:22.756946 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 4 15:47:22.756953 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 4 15:47:22.756959 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 4 15:47:22.756966 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Sep 4 15:47:22.756989 systemd-journald[244]: Collecting audit messages is disabled.
Sep 4 15:47:22.757007 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 4 15:47:22.757014 systemd[1]: Finished systemd-fsck-usr.service.
Sep 4 15:47:22.757020 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 4 15:47:22.757027 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 4 15:47:22.757033 kernel: Bridge firewalling registered
Sep 4 15:47:22.757040 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 4 15:47:22.757047 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 4 15:47:22.757053 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 4 15:47:22.757060 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 4 15:47:22.757068 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 4 15:47:22.757074 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 4 15:47:22.757082 systemd-journald[244]: Journal started
Sep 4 15:47:22.757096 systemd-journald[244]: Runtime Journal (/run/log/journal/ded34418c2e243ac80e30ee4678f1d2a) is 4.8M, max 38.8M, 34M free.
Sep 4 15:47:22.706009 systemd-modules-load[246]: Inserted module 'overlay'
Sep 4 15:47:22.737884 systemd-modules-load[246]: Inserted module 'br_netfilter'
Sep 4 15:47:22.758508 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 4 15:47:22.764828 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 4 15:47:22.765414 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 4 15:47:22.766714 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 4 15:47:22.770649 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 4 15:47:22.773146 systemd-tmpfiles[275]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Sep 4 15:47:22.773784 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Sep 4 15:47:22.774652 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 4 15:47:22.776332 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 4 15:47:22.782880 dracut-cmdline[281]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=36c924095f449c8931a6685ec70d72df97f8ad57d1c78208ae0ead8cae8f5127
Sep 4 15:47:22.801089 systemd-resolved[283]: Positive Trust Anchors:
Sep 4 15:47:22.801296 systemd-resolved[283]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 4 15:47:22.801465 systemd-resolved[283]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 4 15:47:22.803685 systemd-resolved[283]: Defaulting to hostname 'linux'.
Sep 4 15:47:22.804371 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 4 15:47:22.804517 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 4 15:47:22.835538 kernel: SCSI subsystem initialized
Sep 4 15:47:22.852511 kernel: Loading iSCSI transport class v2.0-870.
Sep 4 15:47:22.860507 kernel: iscsi: registered transport (tcp)
Sep 4 15:47:22.882539 kernel: iscsi: registered transport (qla4xxx)
Sep 4 15:47:22.882581 kernel: QLogic iSCSI HBA Driver
Sep 4 15:47:22.893700 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 4 15:47:22.912471 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 4 15:47:22.913570 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 4 15:47:22.935836 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Sep 4 15:47:22.936792 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Sep 4 15:47:22.975523 kernel: raid6: avx2x4 gen() 46926 MB/s
Sep 4 15:47:22.990514 kernel: raid6: avx2x2 gen() 52631 MB/s
Sep 4 15:47:23.007706 kernel: raid6: avx2x1 gen() 44600 MB/s
Sep 4 15:47:23.007748 kernel: raid6: using algorithm avx2x2 gen() 52631 MB/s
Sep 4 15:47:23.025747 kernel: raid6: .... xor() 31835 MB/s, rmw enabled
Sep 4 15:47:23.025798 kernel: raid6: using avx2x2 recovery algorithm
Sep 4 15:47:23.039514 kernel: xor: automatically using best checksumming function avx
Sep 4 15:47:23.147517 kernel: Btrfs loaded, zoned=no, fsverity=no
Sep 4 15:47:23.151320 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Sep 4 15:47:23.152457 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 4 15:47:23.173670 systemd-udevd[492]: Using default interface naming scheme 'v255'.
Sep 4 15:47:23.177383 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 4 15:47:23.178542 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Sep 4 15:47:23.201207 dracut-pre-trigger[496]: rd.md=0: removing MD RAID activation
Sep 4 15:47:23.215985 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 4 15:47:23.217096 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 4 15:47:23.291345 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 4 15:47:23.292841 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Sep 4 15:47:23.354515 kernel: VMware PVSCSI driver - version 1.0.7.0-k
Sep 4 15:47:23.356515 kernel: vmw_pvscsi: using 64bit dma
Sep 4 15:47:23.357528 kernel: vmw_pvscsi: max_id: 16
Sep 4 15:47:23.357548 kernel: vmw_pvscsi: setting ring_pages to 8
Sep 4 15:47:23.361812 kernel: VMware vmxnet3 virtual NIC driver - version 1.9.0.0-k-NAPI
Sep 4 15:47:23.361836 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2
Sep 4 15:47:23.363506 kernel: vmw_pvscsi: enabling reqCallThreshold
Sep 4 15:47:23.363523 kernel: vmw_pvscsi: driver-based request coalescing enabled
Sep 4 15:47:23.363531 kernel: vmw_pvscsi: using MSI-X
Sep 4 15:47:23.369562 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps
Sep 4 15:47:23.369701 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254
Sep 4 15:47:23.375511 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0
Sep 4 15:47:23.375643 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6
Sep 4 15:47:23.399048 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0
Sep 4 15:47:23.401535 kernel: cryptd: max_cpu_qlen set to 1000
Sep 4 15:47:23.401953 (udev-worker)[543]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte.
Sep 4 15:47:23.407974 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 4 15:47:23.408223 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 4 15:47:23.408692 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Sep 4 15:47:23.410075 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 4 15:47:23.411510 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB)
Sep 4 15:47:23.411619 kernel: sd 0:0:0:0: [sda] Write Protect is off
Sep 4 15:47:23.413797 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00
Sep 4 15:47:23.413883 kernel: sd 0:0:0:0: [sda] Cache data unavailable
Sep 4 15:47:23.413947 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through
Sep 4 15:47:23.415505 kernel: libata version 3.00 loaded.
Sep 4 15:47:23.419729 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input2
Sep 4 15:47:23.419752 kernel: AES CTR mode by8 optimization enabled
Sep 4 15:47:23.438690 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 4 15:47:23.468516 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 4 15:47:23.469528 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Sep 4 15:47:23.475513 kernel: ata_piix 0000:00:07.1: version 2.13
Sep 4 15:47:23.476512 kernel: scsi host1: ata_piix
Sep 4 15:47:23.478978 kernel: scsi host2: ata_piix
Sep 4 15:47:23.479061 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14 lpm-pol 0
Sep 4 15:47:23.479070 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15 lpm-pol 0
Sep 4 15:47:23.594548 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM.
Sep 4 15:47:23.600048 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT.
Sep 4 15:47:23.605557 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM.
Sep 4 15:47:23.610021 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A.
Sep 4 15:47:23.610315 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A.
Sep 4 15:47:23.611142 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Sep 4 15:47:23.648515 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33
Sep 4 15:47:23.661563 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5
Sep 4 15:47:23.675522 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 4 15:47:23.687583 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray
Sep 4 15:47:23.687766 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Sep 4 15:47:23.704511 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0
Sep 4 15:47:23.715517 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 4 15:47:24.131734 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Sep 4 15:47:24.132287 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 4 15:47:24.132492 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 4 15:47:24.132721 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 4 15:47:24.133441 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Sep 4 15:47:24.162204 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Sep 4 15:47:24.746523 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 4 15:47:24.746702 disk-uuid[641]: The operation has completed successfully.
Sep 4 15:47:24.909908 systemd[1]: disk-uuid.service: Deactivated successfully.
Sep 4 15:47:24.910168 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Sep 4 15:47:24.926482 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Sep 4 15:47:24.937662 sh[673]: Success
Sep 4 15:47:24.953522 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Sep 4 15:47:24.953570 kernel: device-mapper: uevent: version 1.0.3
Sep 4 15:47:24.956066 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Sep 4 15:47:24.962517 kernel: device-mapper: verity: sha256 using shash "sha256-avx2"
Sep 4 15:47:25.036062 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Sep 4 15:47:25.037551 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Sep 4 15:47:25.046965 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Sep 4 15:47:25.199511 kernel: BTRFS: device fsid 03d586f6-54f4-4e78-a040-c693154b15e4 devid 1 transid 38 /dev/mapper/usr (254:0) scanned by mount (685)
Sep 4 15:47:25.211584 kernel: BTRFS info (device dm-0): first mount of filesystem 03d586f6-54f4-4e78-a040-c693154b15e4
Sep 4 15:47:25.211609 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Sep 4 15:47:25.340981 kernel: BTRFS info (device dm-0): enabling ssd optimizations
Sep 4 15:47:25.341032 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Sep 4 15:47:25.341047 kernel: BTRFS info (device dm-0): enabling free space tree
Sep 4 15:47:25.682308 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Sep 4 15:47:25.682868 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Sep 4 15:47:25.683560 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments...
Sep 4 15:47:25.685558 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Sep 4 15:47:25.935524 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (708)
Sep 4 15:47:25.961286 kernel: BTRFS info (device sda6): first mount of filesystem acf6a5d7-9c2b-468d-9430-e8b3ed6a78f4
Sep 4 15:47:25.961328 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Sep 4 15:47:26.046145 kernel: BTRFS info (device sda6): enabling ssd optimizations
Sep 4 15:47:26.046218 kernel: BTRFS info (device sda6): enabling free space tree
Sep 4 15:47:26.050518 kernel: BTRFS info (device sda6): last unmount of filesystem acf6a5d7-9c2b-468d-9430-e8b3ed6a78f4
Sep 4 15:47:26.051180 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Sep 4 15:47:26.052678 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Sep 4 15:47:26.231595 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments.
Sep 4 15:47:26.232890 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Sep 4 15:47:26.303807 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 4 15:47:26.304967 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 4 15:47:26.334848 systemd-networkd[860]: lo: Link UP
Sep 4 15:47:26.335046 systemd-networkd[860]: lo: Gained carrier
Sep 4 15:47:26.335840 systemd-networkd[860]: Enumeration completed
Sep 4 15:47:26.336295 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 4 15:47:26.336445 systemd[1]: Reached target network.target - Network.
Sep 4 15:47:26.336902 systemd-networkd[860]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network.
Sep 4 15:47:26.339768 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated
Sep 4 15:47:26.339903 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps
Sep 4 15:47:26.340388 systemd-networkd[860]: ens192: Link UP
Sep 4 15:47:26.340394 systemd-networkd[860]: ens192: Gained carrier
Sep 4 15:47:26.439519 ignition[727]: Ignition 2.22.0
Sep 4 15:47:26.439810 ignition[727]: Stage: fetch-offline
Sep 4 15:47:26.439842 ignition[727]: no configs at "/usr/lib/ignition/base.d"
Sep 4 15:47:26.439849 ignition[727]: no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Sep 4 15:47:26.439912 ignition[727]: parsed url from cmdline: ""
Sep 4 15:47:26.439915 ignition[727]: no config URL provided
Sep 4 15:47:26.439918 ignition[727]: reading system config file "/usr/lib/ignition/user.ign"
Sep 4 15:47:26.439924 ignition[727]: no config at "/usr/lib/ignition/user.ign"
Sep 4 15:47:26.440409 ignition[727]: config successfully fetched
Sep 4 15:47:26.440433 ignition[727]: parsing config with SHA512: df500fa6bf5359924a62fd939504740e61780e24781400c69f9390a2c443c3de4005716f12125c4144cec50356cac622fcdb41412b3fb36b10ba74614a1574f1
Sep 4 15:47:26.443415 unknown[727]: fetched base config from "system"
Sep 4 15:47:26.443427 unknown[727]: fetched user config from "vmware"
Sep 4 15:47:26.444286 ignition[727]: fetch-offline: fetch-offline passed
Sep 4 15:47:26.444452 ignition[727]: Ignition finished successfully
Sep 4 15:47:26.445803 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 4 15:47:26.446074 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json).
Sep 4 15:47:26.446618 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Sep 4 15:47:26.468479 ignition[870]: Ignition 2.22.0
Sep 4 15:47:26.468490 ignition[870]: Stage: kargs
Sep 4 15:47:26.468585 ignition[870]: no configs at "/usr/lib/ignition/base.d"
Sep 4 15:47:26.468591 ignition[870]: no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Sep 4 15:47:26.469072 ignition[870]: kargs: kargs passed
Sep 4 15:47:26.469098 ignition[870]: Ignition finished successfully
Sep 4 15:47:26.470664 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Sep 4 15:47:26.471386 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Sep 4 15:47:26.489959 ignition[876]: Ignition 2.22.0
Sep 4 15:47:26.490318 ignition[876]: Stage: disks
Sep 4 15:47:26.490558 ignition[876]: no configs at "/usr/lib/ignition/base.d"
Sep 4 15:47:26.490678 ignition[876]: no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Sep 4 15:47:26.491404 ignition[876]: disks: disks passed
Sep 4 15:47:26.491535 ignition[876]: Ignition finished successfully
Sep 4 15:47:26.492444 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Sep 4 15:47:26.492837 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Sep 4 15:47:26.492993 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Sep 4 15:47:26.493190 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 4 15:47:26.493376 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 4 15:47:26.493570 systemd[1]: Reached target basic.target - Basic System.
Sep 4 15:47:26.494356 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Sep 4 15:47:27.346748 systemd-fsck[885]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
Sep 4 15:47:27.355205 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Sep 4 15:47:27.356416 systemd[1]: Mounting sysroot.mount - /sysroot...
Sep 4 15:47:27.379570 systemd-networkd[860]: ens192: Gained IPv6LL
Sep 4 15:47:27.499511 kernel: EXT4-fs (sda9): mounted filesystem b9579306-9cef-42ea-893b-17169f1ea8af r/w with ordered data mode. Quota mode: none.
Sep 4 15:47:27.500070 systemd[1]: Mounted sysroot.mount - /sysroot.
Sep 4 15:47:27.500555 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Sep 4 15:47:27.501726 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 4 15:47:27.503534 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Sep 4 15:47:27.503919 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Sep 4 15:47:27.503944 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Sep 4 15:47:27.503958 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 4 15:47:27.509606 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Sep 4 15:47:27.510327 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Sep 4 15:47:27.545521 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (893)
Sep 4 15:47:27.545560 kernel: BTRFS info (device sda6): first mount of filesystem acf6a5d7-9c2b-468d-9430-e8b3ed6a78f4
Sep 4 15:47:27.547578 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Sep 4 15:47:27.551757 kernel: BTRFS info (device sda6): enabling ssd optimizations
Sep 4 15:47:27.551784 kernel: BTRFS info (device sda6): enabling free space tree
Sep 4 15:47:27.552806 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 4 15:47:27.569970 initrd-setup-root[917]: cut: /sysroot/etc/passwd: No such file or directory
Sep 4 15:47:27.573184 initrd-setup-root[924]: cut: /sysroot/etc/group: No such file or directory
Sep 4 15:47:27.575362 initrd-setup-root[931]: cut: /sysroot/etc/shadow: No such file or directory
Sep 4 15:47:27.577343 initrd-setup-root[938]: cut: /sysroot/etc/gshadow: No such file or directory
Sep 4 15:47:27.662209 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Sep 4 15:47:27.663037 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Sep 4 15:47:27.663563 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Sep 4 15:47:27.675997 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Sep 4 15:47:27.677529 kernel: BTRFS info (device sda6): last unmount of filesystem acf6a5d7-9c2b-468d-9430-e8b3ed6a78f4
Sep 4 15:47:27.691335 ignition[1006]: INFO : Ignition 2.22.0
Sep 4 15:47:27.691335 ignition[1006]: INFO : Stage: mount
Sep 4 15:47:27.691676 ignition[1006]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 4 15:47:27.691676 ignition[1006]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Sep 4 15:47:27.691916 ignition[1006]: INFO : mount: mount passed
Sep 4 15:47:27.692673 ignition[1006]: INFO : Ignition finished successfully
Sep 4 15:47:27.692912 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Sep 4 15:47:27.694557 systemd[1]: Starting ignition-files.service - Ignition (files)...
Sep 4 15:47:27.704611 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Sep 4 15:47:28.501392 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 4 15:47:28.521514 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1017)
Sep 4 15:47:28.532310 kernel: BTRFS info (device sda6): first mount of filesystem acf6a5d7-9c2b-468d-9430-e8b3ed6a78f4
Sep 4 15:47:28.532357 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Sep 4 15:47:28.559545 kernel: BTRFS info (device sda6): enabling ssd optimizations
Sep 4 15:47:28.559601 kernel: BTRFS info (device sda6): enabling free space tree
Sep 4 15:47:28.560994 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 4 15:47:28.584508 ignition[1033]: INFO : Ignition 2.22.0
Sep 4 15:47:28.584508 ignition[1033]: INFO : Stage: files
Sep 4 15:47:28.585484 ignition[1033]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 4 15:47:28.585484 ignition[1033]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Sep 4 15:47:28.585484 ignition[1033]: DEBUG : files: compiled without relabeling support, skipping
Sep 4 15:47:28.586035 ignition[1033]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Sep 4 15:47:28.586179 ignition[1033]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Sep 4 15:47:28.588752 ignition[1033]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Sep 4 15:47:28.589085 ignition[1033]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Sep 4 15:47:28.589522 unknown[1033]: wrote ssh authorized keys file for user: core
Sep 4 15:47:28.589845 ignition[1033]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Sep 4 15:47:28.606909 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Sep 4 15:47:28.607468 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
Sep 4 15:47:28.755758 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Sep 4 15:47:29.315240 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Sep 4 15:47:29.315524 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Sep 4 15:47:29.315524 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Sep 4 15:47:29.315524 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Sep 4 15:47:29.315524 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Sep 4 15:47:29.315524 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 4 15:47:29.316284 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 4 15:47:29.316284 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 4 15:47:29.316284 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 4 15:47:29.317539 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Sep 4 15:47:29.317697 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 4 15:47:29.317697 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Sep 4 15:47:29.318054 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Sep 4 15:47:29.318054 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Sep 4 15:47:29.318446 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-x86-64.raw: attempt #1
Sep 4 15:47:29.889901 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Sep 4 15:47:30.291544 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Sep 4 15:47:30.291544 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Sep 4 15:47:30.292354 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Sep 4 15:47:30.292542 ignition[1033]: INFO : files: op(c): [started] processing unit "prepare-helm.service"
Sep 4 15:47:30.292886 ignition[1033]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 4 15:47:30.293296 ignition[1033]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 4 15:47:30.293296 ignition[1033]: INFO : files: op(c): [finished] processing unit "prepare-helm.service"
Sep 4 15:47:30.293296 ignition[1033]: INFO : files: op(e): [started] processing unit "coreos-metadata.service"
Sep 4 15:47:30.293726 ignition[1033]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Sep 4 15:47:30.293726 ignition[1033]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Sep 4 15:47:30.293726 ignition[1033]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service"
Sep 4 15:47:30.293726 ignition[1033]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service"
Sep 4 15:47:30.317430 ignition[1033]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service"
Sep 4 15:47:30.320206 ignition[1033]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Sep 4 15:47:30.320372 ignition[1033]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service"
Sep 4 15:47:30.320372 ignition[1033]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service"
Sep 4 15:47:30.320372 ignition[1033]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service"
Sep 4 15:47:30.321411 ignition[1033]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 4 15:47:30.321411 ignition[1033]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json"
Sep 4 15:47:30.321411 ignition[1033]: INFO : files: files passed
Sep 4 15:47:30.321411 ignition[1033]: INFO : Ignition finished successfully
Sep 4 15:47:30.321661 systemd[1]: Finished ignition-files.service - Ignition (files).
Sep 4 15:47:30.322605 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Sep 4 15:47:30.323589 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 4 15:47:30.328517 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 4 15:47:30.328596 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 4 15:47:30.331772 initrd-setup-root-after-ignition[1065]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 4 15:47:30.331772 initrd-setup-root-after-ignition[1065]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 4 15:47:30.332936 initrd-setup-root-after-ignition[1069]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 4 15:47:30.333709 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 4 15:47:30.334033 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 4 15:47:30.334607 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 4 15:47:30.352199 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 4 15:47:30.352263 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 4 15:47:30.352554 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 4 15:47:30.352823 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 4 15:47:30.353021 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 4 15:47:30.353474 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 4 15:47:30.368989 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 4 15:47:30.369941 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 4 15:47:30.382332 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 4 15:47:30.382516 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. 
Sep 4 15:47:30.382761 systemd[1]: Stopped target timers.target - Timer Units. Sep 4 15:47:30.382976 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 4 15:47:30.383047 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 4 15:47:30.383414 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 4 15:47:30.383697 systemd[1]: Stopped target basic.target - Basic System. Sep 4 15:47:30.383883 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 4 15:47:30.384087 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 4 15:47:30.384296 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 4 15:47:30.384532 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 4 15:47:30.384734 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 4 15:47:30.384948 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 4 15:47:30.385166 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 4 15:47:30.385384 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 4 15:47:30.385577 systemd[1]: Stopped target swap.target - Swaps. Sep 4 15:47:30.385758 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 4 15:47:30.385819 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 4 15:47:30.386077 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 4 15:47:30.386318 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 4 15:47:30.386534 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 4 15:47:30.386578 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 4 15:47:30.386756 systemd[1]: dracut-initqueue.service: Deactivated successfully. 
Sep 4 15:47:30.386814 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 4 15:47:30.387095 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 4 15:47:30.387163 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 4 15:47:30.387397 systemd[1]: Stopped target paths.target - Path Units. Sep 4 15:47:30.387559 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 4 15:47:30.391634 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 4 15:47:30.391817 systemd[1]: Stopped target slices.target - Slice Units. Sep 4 15:47:30.392042 systemd[1]: Stopped target sockets.target - Socket Units. Sep 4 15:47:30.392210 systemd[1]: iscsid.socket: Deactivated successfully. Sep 4 15:47:30.392258 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 4 15:47:30.392404 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 4 15:47:30.392450 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 4 15:47:30.392622 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 4 15:47:30.392687 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 4 15:47:30.392943 systemd[1]: ignition-files.service: Deactivated successfully. Sep 4 15:47:30.393003 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 4 15:47:30.394591 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 4 15:47:30.395081 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 4 15:47:30.395184 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 4 15:47:30.395248 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 4 15:47:30.395398 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 4 15:47:30.395455 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. 
Sep 4 15:47:30.398670 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 4 15:47:30.402540 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 4 15:47:30.411240 ignition[1090]: INFO : Ignition 2.22.0 Sep 4 15:47:30.411240 ignition[1090]: INFO : Stage: umount Sep 4 15:47:30.411603 ignition[1090]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 4 15:47:30.411603 ignition[1090]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Sep 4 15:47:30.412063 ignition[1090]: INFO : umount: umount passed Sep 4 15:47:30.412063 ignition[1090]: INFO : Ignition finished successfully Sep 4 15:47:30.413028 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 4 15:47:30.413212 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 4 15:47:30.413436 systemd[1]: Stopped target network.target - Network. Sep 4 15:47:30.413550 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 4 15:47:30.413578 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 4 15:47:30.413721 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 4 15:47:30.413743 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 4 15:47:30.413893 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 4 15:47:30.413914 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 4 15:47:30.414057 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 4 15:47:30.414082 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 4 15:47:30.414420 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 4 15:47:30.415183 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 4 15:47:30.420000 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 4 15:47:30.420085 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. 
Sep 4 15:47:30.422102 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 4 15:47:30.422266 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 4 15:47:30.422299 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 4 15:47:30.423330 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Sep 4 15:47:30.424542 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 4 15:47:30.424625 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 4 15:47:30.425654 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Sep 4 15:47:30.426208 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 4 15:47:30.426558 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 4 15:47:30.426710 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 4 15:47:30.426732 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 4 15:47:30.427598 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 4 15:47:30.427720 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 4 15:47:30.427752 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 4 15:47:30.428576 systemd[1]: afterburn-network-kargs.service: Deactivated successfully. Sep 4 15:47:30.428606 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Sep 4 15:47:30.428735 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 4 15:47:30.428760 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 4 15:47:30.429762 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 4 15:47:30.429913 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. 
Sep 4 15:47:30.430583 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 4 15:47:30.431739 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Sep 4 15:47:30.433707 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 4 15:47:30.433900 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 4 15:47:30.434234 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 4 15:47:30.434402 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 4 15:47:30.443924 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 4 15:47:30.444146 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 4 15:47:30.444475 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 4 15:47:30.444520 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 4 15:47:30.444652 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 4 15:47:30.444672 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 4 15:47:30.444783 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 4 15:47:30.444809 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 4 15:47:30.444981 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 4 15:47:30.445005 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 4 15:47:30.445332 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 4 15:47:30.445358 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 4 15:47:30.446597 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 4 15:47:30.446723 systemd[1]: systemd-network-generator.service: Deactivated successfully. 
Sep 4 15:47:30.446752 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Sep 4 15:47:30.447201 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 4 15:47:30.447233 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 4 15:47:30.447556 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Sep 4 15:47:30.447589 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 4 15:47:30.448836 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 4 15:47:30.448870 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 4 15:47:30.449404 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 4 15:47:30.449429 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 4 15:47:30.450030 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 4 15:47:30.456689 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 4 15:47:30.460213 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 4 15:47:30.460299 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 4 15:47:30.460716 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 4 15:47:30.461352 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 4 15:47:30.475079 systemd[1]: Switching root. Sep 4 15:47:30.513383 systemd-journald[244]: Journal stopped Sep 4 15:47:32.144206 systemd-journald[244]: Received SIGTERM from PID 1 (systemd). 
Sep 4 15:47:32.144345 kernel: SELinux: policy capability network_peer_controls=1 Sep 4 15:47:32.144356 kernel: SELinux: policy capability open_perms=1 Sep 4 15:47:32.144363 kernel: SELinux: policy capability extended_socket_class=1 Sep 4 15:47:32.144368 kernel: SELinux: policy capability always_check_network=0 Sep 4 15:47:32.144375 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 4 15:47:32.144382 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 4 15:47:32.144387 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 4 15:47:32.144393 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 4 15:47:32.144399 kernel: SELinux: policy capability userspace_initial_context=0 Sep 4 15:47:32.144405 kernel: audit: type=1403 audit(1757000851.418:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 4 15:47:32.144412 systemd[1]: Successfully loaded SELinux policy in 74.959ms. Sep 4 15:47:32.144420 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 4.291ms. Sep 4 15:47:32.144428 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 4 15:47:32.144435 systemd[1]: Detected virtualization vmware. Sep 4 15:47:32.144442 systemd[1]: Detected architecture x86-64. Sep 4 15:47:32.144450 systemd[1]: Detected first boot. Sep 4 15:47:32.144457 systemd[1]: Initializing machine ID from random generator. Sep 4 15:47:32.144464 zram_generator::config[1134]: No configuration found. 
Sep 4 15:47:32.144572 kernel: vmw_vmci 0000:00:07.7: Using capabilities 0xc Sep 4 15:47:32.149984 kernel: Guest personality initialized and is active Sep 4 15:47:32.149995 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Sep 4 15:47:32.150002 kernel: Initialized host personality Sep 4 15:47:32.150012 kernel: NET: Registered PF_VSOCK protocol family Sep 4 15:47:32.150019 systemd[1]: Populated /etc with preset unit settings. Sep 4 15:47:32.150029 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Sep 4 15:47:32.150036 systemd[1]: COREOS_CUSTOM_PUBLIC_IPV4=$(ip addr show ens192 | grep -v "inet 10." | grep -Po "inet \K[\d.]+")" > ${OUTPUT}" Sep 4 15:47:32.150043 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Sep 4 15:47:32.150050 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 4 15:47:32.150056 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 4 15:47:32.150064 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 4 15:47:32.150072 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 4 15:47:32.150079 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 4 15:47:32.150085 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 4 15:47:32.150092 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 4 15:47:32.150099 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 4 15:47:32.150106 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 4 15:47:32.150114 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. 
Sep 4 15:47:32.150121 systemd[1]: Created slice user.slice - User and Session Slice. Sep 4 15:47:32.150128 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 4 15:47:32.150137 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 4 15:47:32.150144 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Sep 4 15:47:32.150151 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 4 15:47:32.150158 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 4 15:47:32.150165 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 4 15:47:32.150173 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Sep 4 15:47:32.150180 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 4 15:47:32.150187 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 4 15:47:32.150194 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 4 15:47:32.150201 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 4 15:47:32.150208 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 4 15:47:32.150214 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 4 15:47:32.150224 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 4 15:47:32.150232 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 4 15:47:32.150239 systemd[1]: Reached target slices.target - Slice Units. Sep 4 15:47:32.150247 systemd[1]: Reached target swap.target - Swaps. Sep 4 15:47:32.150254 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. 
Sep 4 15:47:32.150261 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 4 15:47:32.150269 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Sep 4 15:47:32.150276 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 4 15:47:32.150283 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 4 15:47:32.150290 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 4 15:47:32.150297 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 4 15:47:32.150304 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 4 15:47:32.150311 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 4 15:47:32.150318 systemd[1]: Mounting media.mount - External Media Directory... Sep 4 15:47:32.150326 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 4 15:47:32.150333 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 4 15:47:32.150340 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 4 15:47:32.150347 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 4 15:47:32.150354 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 4 15:47:32.150361 systemd[1]: Reached target machines.target - Containers. Sep 4 15:47:32.150368 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Sep 4 15:47:32.150376 systemd[1]: Starting ignition-delete-config.service - Ignition (delete config)... Sep 4 15:47:32.150384 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 4 15:47:32.150391 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... 
Sep 4 15:47:32.150398 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 4 15:47:32.150405 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 4 15:47:32.150416 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 4 15:47:32.150424 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 4 15:47:32.150438 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 4 15:47:32.150447 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 4 15:47:32.150455 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 4 15:47:32.150462 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 4 15:47:32.154615 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 4 15:47:32.154636 systemd[1]: Stopped systemd-fsck-usr.service. Sep 4 15:47:32.154645 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 4 15:47:32.154653 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 4 15:47:32.154660 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 4 15:47:32.154668 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 4 15:47:32.154675 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 4 15:47:32.154685 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Sep 4 15:47:32.154692 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 4 15:47:32.154699 systemd[1]: verity-setup.service: Deactivated successfully. 
Sep 4 15:47:32.154706 systemd[1]: Stopped verity-setup.service. Sep 4 15:47:32.154714 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 4 15:47:32.154721 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 4 15:47:32.154728 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 4 15:47:32.154758 systemd-journald[1229]: Collecting audit messages is disabled. Sep 4 15:47:32.154777 kernel: fuse: init (API version 7.41) Sep 4 15:47:32.154785 systemd[1]: Mounted media.mount - External Media Directory. Sep 4 15:47:32.154792 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 4 15:47:32.154799 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 4 15:47:32.154807 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 4 15:47:32.154814 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 4 15:47:32.154822 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 4 15:47:32.154829 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 4 15:47:32.154836 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 4 15:47:32.154843 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 4 15:47:32.154849 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 4 15:47:32.154857 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 4 15:47:32.154864 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 4 15:47:32.154872 systemd-journald[1229]: Journal started Sep 4 15:47:32.154887 systemd-journald[1229]: Runtime Journal (/run/log/journal/6f1e6bbf17074e7391977ccd8bafc9d9) is 4.8M, max 38.8M, 34M free. Sep 4 15:47:31.986592 systemd[1]: Queued start job for default target multi-user.target. 
Sep 4 15:47:31.992996 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Sep 4 15:47:31.993295 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 4 15:47:32.155428 jq[1204]: true Sep 4 15:47:32.170807 systemd[1]: Started systemd-journald.service - Journal Service. Sep 4 15:47:32.156671 systemd[1]: modprobe@fuse.service: Deactivated successfully. Sep 4 15:47:32.156783 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 4 15:47:32.157053 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 4 15:47:32.157308 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 4 15:47:32.158885 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 4 15:47:32.161300 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 4 15:47:32.164549 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 4 15:47:32.167687 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Sep 4 15:47:32.167819 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 4 15:47:32.167838 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 4 15:47:32.170113 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Sep 4 15:47:32.179566 kernel: loop: module loaded Sep 4 15:47:32.174428 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 4 15:47:32.174621 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 4 15:47:32.176564 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 4 15:47:32.179806 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... 
Sep 4 15:47:32.179956 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 4 15:47:32.182394 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Sep 4 15:47:32.184617 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 4 15:47:32.191783 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 4 15:47:32.195640 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 4 15:47:32.196899 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 4 15:47:32.201390 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 4 15:47:32.202525 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Sep 4 15:47:32.202802 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 4 15:47:32.202955 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Sep 4 15:47:32.208599 jq[1250]: true Sep 4 15:47:32.215919 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 4 15:47:32.220472 systemd-journald[1229]: Time spent on flushing to /var/log/journal/6f1e6bbf17074e7391977ccd8bafc9d9 is 15.102ms for 1759 entries. Sep 4 15:47:32.220472 systemd-journald[1229]: System Journal (/var/log/journal/6f1e6bbf17074e7391977ccd8bafc9d9) is 8M, max 584.8M, 576.8M free. Sep 4 15:47:32.253703 systemd-journald[1229]: Received client request to flush runtime journal. Sep 4 15:47:32.253726 kernel: ACPI: bus type drm_connector registered Sep 4 15:47:32.253909 kernel: loop0: detected capacity change from 0 to 221472 Sep 4 15:47:32.233002 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. 
Sep 4 15:47:32.233394 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 4 15:47:32.235133 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Sep 4 15:47:32.235872 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 4 15:47:32.235982 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 4 15:47:32.256057 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 4 15:47:32.258183 systemd-tmpfiles[1267]: ACLs are not supported, ignoring. Sep 4 15:47:32.258198 systemd-tmpfiles[1267]: ACLs are not supported, ignoring. Sep 4 15:47:32.258597 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 4 15:47:32.264015 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 4 15:47:32.267472 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 4 15:47:32.277317 ignition[1276]: Ignition 2.22.0 Sep 4 15:47:32.277712 ignition[1276]: deleting config from guestinfo properties Sep 4 15:47:32.296451 ignition[1276]: Successfully deleted config Sep 4 15:47:32.298311 systemd[1]: Finished ignition-delete-config.service - Ignition (delete config). Sep 4 15:47:32.305540 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Sep 4 15:47:32.329921 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 4 15:47:32.347515 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 4 15:47:32.420793 kernel: loop1: detected capacity change from 0 to 2960 Sep 4 15:47:32.429599 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 4 15:47:32.430653 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 4 15:47:32.454125 systemd-tmpfiles[1305]: ACLs are not supported, ignoring. 
Sep 4 15:47:32.454137 systemd-tmpfiles[1305]: ACLs are not supported, ignoring.
Sep 4 15:47:32.459581 kernel: loop2: detected capacity change from 0 to 128016
Sep 4 15:47:32.460710 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 4 15:47:32.606532 kernel: loop3: detected capacity change from 0 to 110984
Sep 4 15:47:32.655515 kernel: loop4: detected capacity change from 0 to 221472
Sep 4 15:47:32.725509 kernel: loop5: detected capacity change from 0 to 2960
Sep 4 15:47:32.738513 kernel: loop6: detected capacity change from 0 to 128016
Sep 4 15:47:32.821516 kernel: loop7: detected capacity change from 0 to 110984
Sep 4 15:47:32.842666 (sd-merge)[1311]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-vmware'.
Sep 4 15:47:32.844169 (sd-merge)[1311]: Merged extensions into '/usr'.
Sep 4 15:47:32.848087 systemd[1]: Reload requested from client PID 1266 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 4 15:47:32.848099 systemd[1]: Reloading...
Sep 4 15:47:32.890967 zram_generator::config[1334]: No configuration found.
Sep 4 15:47:32.978988 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Sep 4 15:47:33.023876 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 4 15:47:33.024019 systemd[1]: Reloading finished in 175 ms.
Sep 4 15:47:33.039337 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 4 15:47:33.039656 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 4 15:47:33.044398 systemd[1]: Starting ensure-sysext.service...
Sep 4 15:47:33.047322 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 4 15:47:33.048448 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 4 15:47:33.066553 systemd[1]: Reload requested from client PID 1394 ('systemctl') (unit ensure-sysext.service)...
Sep 4 15:47:33.066562 systemd[1]: Reloading...
Sep 4 15:47:33.071449 systemd-tmpfiles[1395]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Sep 4 15:47:33.071468 systemd-tmpfiles[1395]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Sep 4 15:47:33.075370 systemd-udevd[1396]: Using default interface naming scheme 'v255'.
Sep 4 15:47:33.076815 systemd-tmpfiles[1395]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 4 15:47:33.077048 systemd-tmpfiles[1395]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 4 15:47:33.077851 systemd-tmpfiles[1395]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 4 15:47:33.078074 systemd-tmpfiles[1395]: ACLs are not supported, ignoring.
Sep 4 15:47:33.078109 systemd-tmpfiles[1395]: ACLs are not supported, ignoring.
Sep 4 15:47:33.098670 systemd-tmpfiles[1395]: Detected autofs mount point /boot during canonicalization of boot.
Sep 4 15:47:33.098676 systemd-tmpfiles[1395]: Skipping /boot
Sep 4 15:47:33.104871 systemd-tmpfiles[1395]: Detected autofs mount point /boot during canonicalization of boot.
Sep 4 15:47:33.104921 systemd-tmpfiles[1395]: Skipping /boot
Sep 4 15:47:33.131517 zram_generator::config[1427]: No configuration found.
Sep 4 15:47:33.279364 kernel: mousedev: PS/2 mouse device common for all mice
Sep 4 15:47:33.291195 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Sep 4 15:47:33.312512 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3
Sep 4 15:47:33.318512 kernel: ACPI: button: Power Button [PWRF]
Sep 4 15:47:33.346509 ldconfig[1261]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 4 15:47:33.360986 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Sep 4 15:47:33.361061 systemd[1]: Reloading finished in 294 ms.
Sep 4 15:47:33.368871 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 4 15:47:33.369554 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Sep 4 15:47:33.369833 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 4 15:47:33.393592 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM.
Sep 4 15:47:33.398906 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 4 15:47:33.401622 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 4 15:47:33.404020 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 4 15:47:33.406646 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 4 15:47:33.409804 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 4 15:47:33.412390 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 4 15:47:33.418056 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 4 15:47:33.418278 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 4 15:47:33.419410 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Sep 4 15:47:33.419610 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 4 15:47:33.423688 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 4 15:47:33.427710 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 4 15:47:33.432349 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 4 15:47:33.436981 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 4 15:47:33.437129 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 4 15:47:33.438239 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 4 15:47:33.439564 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 4 15:47:33.444759 systemd[1]: Finished ensure-sysext.service.
Sep 4 15:47:33.445062 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 4 15:47:33.445183 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 4 15:47:33.451279 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Sep 4 15:47:33.453419 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 4 15:47:33.454960 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 4 15:47:33.455240 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 4 15:47:33.458701 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 4 15:47:33.458841 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 4 15:47:33.459061 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 4 15:47:33.460545 kernel: piix4_smbus 0000:00:07.3: SMBus Host Controller not enabled!
Sep 4 15:47:33.462486 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 4 15:47:33.472115 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Sep 4 15:47:33.481335 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Sep 4 15:47:33.482056 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Sep 4 15:47:33.485237 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Sep 4 15:47:33.488196 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Sep 4 15:47:33.488797 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Sep 4 15:47:33.503090 augenrules[1561]: No rules
Sep 4 15:47:33.504268 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 4 15:47:33.504681 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 4 15:47:33.507803 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Sep 4 15:47:33.511069 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Sep 4 15:47:33.551257 (udev-worker)[1433]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte.
Sep 4 15:47:33.567110 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 4 15:47:33.629956 systemd-networkd[1532]: lo: Link UP
Sep 4 15:47:33.629961 systemd-networkd[1532]: lo: Gained carrier
Sep 4 15:47:33.631749 systemd-networkd[1532]: Enumeration completed
Sep 4 15:47:33.631809 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 4 15:47:33.631964 systemd-networkd[1532]: ens192: Configuring with /etc/systemd/network/00-vmware.network.
Sep 4 15:47:33.635433 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated
Sep 4 15:47:33.635594 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps
Sep 4 15:47:33.634584 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Sep 4 15:47:33.635812 systemd-networkd[1532]: ens192: Link UP
Sep 4 15:47:33.635895 systemd-networkd[1532]: ens192: Gained carrier
Sep 4 15:47:33.637142 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Sep 4 15:47:33.649766 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Sep 4 15:47:33.674015 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Sep 4 15:47:33.674184 systemd[1]: Reached target time-set.target - System Time Set.
Sep 4 15:47:33.682436 systemd-resolved[1533]: Positive Trust Anchors:
Sep 4 15:47:33.682647 systemd-resolved[1533]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 4 15:47:33.682705 systemd-resolved[1533]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 4 15:47:33.686538 systemd-resolved[1533]: Defaulting to hostname 'linux'.
Sep 4 15:47:33.687647 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 4 15:47:33.687806 systemd[1]: Reached target network.target - Network.
Sep 4 15:47:33.687896 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 4 15:47:33.700787 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 4 15:47:33.701326 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 4 15:47:33.701570 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Sep 4 15:47:33.701752 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Sep 4 15:47:33.701910 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
Sep 4 15:47:33.702100 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Sep 4 15:47:33.702247 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Sep 4 15:47:33.702362 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Sep 4 15:47:33.702473 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Sep 4 15:47:33.702488 systemd[1]: Reached target paths.target - Path Units.
Sep 4 15:47:33.702610 systemd[1]: Reached target timers.target - Timer Units.
Sep 4 15:47:33.703421 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Sep 4 15:47:33.704702 systemd[1]: Starting docker.socket - Docker Socket for the API...
Sep 4 15:47:33.706019 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Sep 4 15:47:33.706220 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Sep 4 15:47:33.706347 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Sep 4 15:47:33.715755 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Sep 4 15:47:33.721857 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Sep 4 15:47:33.722407 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Sep 4 15:47:33.722923 systemd[1]: Reached target sockets.target - Socket Units.
Sep 4 15:47:33.723027 systemd[1]: Reached target basic.target - Basic System.
Sep 4 15:47:33.723150 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Sep 4 15:47:33.723168 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Sep 4 15:47:33.723905 systemd[1]: Starting containerd.service - containerd container runtime...
Sep 4 15:47:33.726576 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Sep 4 15:47:33.729073 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Sep 4 15:47:33.731909 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Sep 4 15:47:33.733196 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Sep 4 15:47:33.733317 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Sep 4 15:47:33.736954 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
Sep 4 15:47:33.737863 jq[1607]: false
Sep 4 15:47:33.738264 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Sep 4 15:47:33.740701 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Sep 4 15:47:33.742538 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Sep 4 15:47:33.745449 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Sep 4 15:47:33.751226 systemd[1]: Starting systemd-logind.service - User Login Management...
Sep 4 15:47:33.752004 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Sep 4 15:47:33.752423 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Sep 4 15:47:33.754372 systemd[1]: Starting update-engine.service - Update Engine...
Sep 4 15:47:33.757054 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Sep 4 15:47:33.759657 systemd[1]: Starting vgauthd.service - VGAuth Service for open-vm-tools...
Sep 4 15:47:33.762356 google_oslogin_nss_cache[1609]: oslogin_cache_refresh[1609]: Refreshing passwd entry cache
Sep 4 15:47:33.762358 oslogin_cache_refresh[1609]: Refreshing passwd entry cache
Sep 4 15:47:33.763116 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Sep 4 15:47:33.763388 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Sep 4 15:47:33.763519 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Sep 4 15:47:33.765224 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Sep 4 15:47:33.767166 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Sep 4 15:47:33.773508 extend-filesystems[1608]: Found /dev/sda6
Sep 4 15:47:33.772683 oslogin_cache_refresh[1609]: Failure getting users, quitting
Sep 4 15:47:33.773784 google_oslogin_nss_cache[1609]: oslogin_cache_refresh[1609]: Failure getting users, quitting
Sep 4 15:47:33.773784 google_oslogin_nss_cache[1609]: oslogin_cache_refresh[1609]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Sep 4 15:47:33.773784 google_oslogin_nss_cache[1609]: oslogin_cache_refresh[1609]: Refreshing group entry cache
Sep 4 15:47:33.772694 oslogin_cache_refresh[1609]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Sep 4 15:47:33.772725 oslogin_cache_refresh[1609]: Refreshing group entry cache
Sep 4 15:47:33.780623 extend-filesystems[1608]: Found /dev/sda9
Sep 4 15:47:33.777963 oslogin_cache_refresh[1609]: Failure getting groups, quitting
Sep 4 15:47:33.780803 google_oslogin_nss_cache[1609]: oslogin_cache_refresh[1609]: Failure getting groups, quitting
Sep 4 15:47:33.780803 google_oslogin_nss_cache[1609]: oslogin_cache_refresh[1609]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Sep 4 15:47:33.777174 systemd[1]: motdgen.service: Deactivated successfully.
Sep 4 15:47:33.777970 oslogin_cache_refresh[1609]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Sep 4 15:47:33.777342 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Sep 4 15:47:33.779645 systemd[1]: google-oslogin-cache.service: Deactivated successfully.
Sep 4 15:47:33.780083 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh.
Sep 4 15:47:33.782512 extend-filesystems[1608]: Checking size of /dev/sda9
Sep 4 15:47:33.782856 jq[1623]: true
Sep 4 15:47:33.789805 update_engine[1621]: I20250904 15:47:33.789463 1621 main.cc:92] Flatcar Update Engine starting
Sep 4 15:47:33.804054 (ntainerd)[1644]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Sep 4 15:47:33.805513 extend-filesystems[1608]: Old size kept for /dev/sda9
Sep 4 15:47:33.805363 dbus-daemon[1605]: [system] SELinux support is enabled
Sep 4 15:47:33.806014 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Sep 4 15:47:33.808721 systemd[1]: extend-filesystems.service: Deactivated successfully.
Sep 4 15:47:33.808863 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Sep 4 15:47:33.809151 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Sep 4 15:47:33.809165 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Sep 4 15:47:33.809533 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Sep 4 15:47:33.809543 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Sep 4 15:47:33.814732 update_engine[1621]: I20250904 15:47:33.814627 1621 update_check_scheduler.cc:74] Next update check in 8m11s
Sep 4 15:47:33.815126 systemd[1]: Started update-engine.service - Update Engine.
Sep 4 15:47:33.815807 tar[1630]: linux-amd64/helm
Sep 4 15:47:33.819502 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Sep 4 15:47:33.824816 systemd[1]: Started vgauthd.service - VGAuth Service for open-vm-tools.
Sep 4 15:47:33.830953 jq[1642]: true
Sep 4 15:47:33.831216 systemd[1]: Starting vmtoolsd.service - Service for virtual machines hosted on VMware...
Sep 4 15:47:33.880204 unknown[1656]: Pref_Init: Using '/etc/vmware-tools/vgauth.conf' as preferences filepath
Sep 4 15:47:33.880957 unknown[1656]: Core dump limit set to -1
Sep 4 15:47:33.893592 bash[1674]: Updated "/home/core/.ssh/authorized_keys"
Sep 4 15:47:33.895832 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Sep 4 15:47:33.896810 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Sep 4 15:47:33.908146 systemd-logind[1618]: Watching system buttons on /dev/input/event2 (Power Button)
Sep 4 15:47:33.908167 systemd-logind[1618]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Sep 4 15:47:33.908720 systemd-logind[1618]: New seat seat0.
Sep 4 15:47:33.909732 systemd[1]: Started systemd-logind.service - User Login Management.
Sep 4 15:47:33.914842 locksmithd[1655]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Sep 4 15:47:33.922614 systemd[1]: Started vmtoolsd.service - Service for virtual machines hosted on VMware.
Sep 4 15:49:06.299284 systemd-resolved[1533]: Clock change detected. Flushing caches.
Sep 4 15:49:06.299571 systemd-timesyncd[1539]: Contacted time server 104.131.155.175:123 (0.flatcar.pool.ntp.org).
Sep 4 15:49:06.299858 systemd-timesyncd[1539]: Initial clock synchronization to Thu 2025-09-04 15:49:06.299251 UTC.
Sep 4 15:49:06.434804 containerd[1644]: time="2025-09-04T15:49:06Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Sep 4 15:49:06.435152 containerd[1644]: time="2025-09-04T15:49:06.435136129Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5
Sep 4 15:49:06.452132 containerd[1644]: time="2025-09-04T15:49:06.451556700Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="8.134µs"
Sep 4 15:49:06.452132 containerd[1644]: time="2025-09-04T15:49:06.451587788Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Sep 4 15:49:06.452132 containerd[1644]: time="2025-09-04T15:49:06.451605122Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Sep 4 15:49:06.452132 containerd[1644]: time="2025-09-04T15:49:06.451711755Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Sep 4 15:49:06.452132 containerd[1644]: time="2025-09-04T15:49:06.451724671Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Sep 4 15:49:06.452132 containerd[1644]: time="2025-09-04T15:49:06.451743265Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 4 15:49:06.452132 containerd[1644]: time="2025-09-04T15:49:06.451782716Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 4 15:49:06.452132 containerd[1644]: time="2025-09-04T15:49:06.451790436Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 4 15:49:06.452132 containerd[1644]: time="2025-09-04T15:49:06.451971652Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 4 15:49:06.452132 containerd[1644]: time="2025-09-04T15:49:06.451985146Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 4 15:49:06.452132 containerd[1644]: time="2025-09-04T15:49:06.452000208Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 4 15:49:06.452132 containerd[1644]: time="2025-09-04T15:49:06.452008357Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Sep 4 15:49:06.452364 containerd[1644]: time="2025-09-04T15:49:06.452072716Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Sep 4 15:49:06.459363 containerd[1644]: time="2025-09-04T15:49:06.459139431Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 4 15:49:06.459363 containerd[1644]: time="2025-09-04T15:49:06.459194201Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 4 15:49:06.459363 containerd[1644]: time="2025-09-04T15:49:06.459205088Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Sep 4 15:49:06.459363 containerd[1644]: time="2025-09-04T15:49:06.459234078Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Sep 4 15:49:06.462810 containerd[1644]: time="2025-09-04T15:49:06.462681504Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Sep 4 15:49:06.462810 containerd[1644]: time="2025-09-04T15:49:06.462740468Z" level=info msg="metadata content store policy set" policy=shared
Sep 4 15:49:06.463673 sshd_keygen[1636]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Sep 4 15:49:06.466852 containerd[1644]: time="2025-09-04T15:49:06.466831562Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Sep 4 15:49:06.466929 containerd[1644]: time="2025-09-04T15:49:06.466920900Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Sep 4 15:49:06.466966 containerd[1644]: time="2025-09-04T15:49:06.466959215Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Sep 4 15:49:06.466999 containerd[1644]: time="2025-09-04T15:49:06.466992378Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Sep 4 15:49:06.467041 containerd[1644]: time="2025-09-04T15:49:06.467033385Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Sep 4 15:49:06.467072 containerd[1644]: time="2025-09-04T15:49:06.467065841Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Sep 4 15:49:06.467105 containerd[1644]: time="2025-09-04T15:49:06.467098034Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Sep 4 15:49:06.467174 containerd[1644]: time="2025-09-04T15:49:06.467165866Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Sep 4 15:49:06.467209 containerd[1644]: time="2025-09-04T15:49:06.467202302Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Sep 4 15:49:06.467240 containerd[1644]: time="2025-09-04T15:49:06.467233481Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Sep 4 15:49:06.467273 containerd[1644]: time="2025-09-04T15:49:06.467266772Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Sep 4 15:49:06.468136 containerd[1644]: time="2025-09-04T15:49:06.467829498Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Sep 4 15:49:06.468136 containerd[1644]: time="2025-09-04T15:49:06.467910694Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Sep 4 15:49:06.468136 containerd[1644]: time="2025-09-04T15:49:06.467924303Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Sep 4 15:49:06.468136 containerd[1644]: time="2025-09-04T15:49:06.467935873Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Sep 4 15:49:06.468136 containerd[1644]: time="2025-09-04T15:49:06.467943352Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Sep 4 15:49:06.468136 containerd[1644]: time="2025-09-04T15:49:06.467950347Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Sep 4 15:49:06.468136 containerd[1644]: time="2025-09-04T15:49:06.467956402Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Sep 4 15:49:06.468136 containerd[1644]: time="2025-09-04T15:49:06.467962992Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Sep 4 15:49:06.468136 containerd[1644]: time="2025-09-04T15:49:06.467968945Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Sep 4 15:49:06.468136 containerd[1644]: time="2025-09-04T15:49:06.467976219Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Sep 4 15:49:06.468136 containerd[1644]: time="2025-09-04T15:49:06.467982298Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Sep 4 15:49:06.468136 containerd[1644]: time="2025-09-04T15:49:06.467988070Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Sep 4 15:49:06.468136 containerd[1644]: time="2025-09-04T15:49:06.468026072Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Sep 4 15:49:06.468136 containerd[1644]: time="2025-09-04T15:49:06.468034181Z" level=info msg="Start snapshots syncer"
Sep 4 15:49:06.468136 containerd[1644]: time="2025-09-04T15:49:06.468050808Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Sep 4 15:49:06.468484 containerd[1644]: time="2025-09-04T15:49:06.468227046Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Sep 4 15:49:06.468484 containerd[1644]: time="2025-09-04T15:49:06.468255689Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Sep 4 15:49:06.468607 containerd[1644]: time="2025-09-04T15:49:06.468294233Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Sep 4 15:49:06.468607 containerd[1644]: time="2025-09-04T15:49:06.468343924Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Sep 4 15:49:06.468607 containerd[1644]: time="2025-09-04T15:49:06.468356099Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Sep 4 15:49:06.468607 containerd[1644]: time="2025-09-04T15:49:06.468362156Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Sep 4 15:49:06.468607 containerd[1644]: time="2025-09-04T15:49:06.468368689Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Sep 4 15:49:06.468607 containerd[1644]: time="2025-09-04T15:49:06.468375442Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Sep 4 15:49:06.468607 containerd[1644]: time="2025-09-04T15:49:06.468381749Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Sep 4 15:49:06.468607 containerd[1644]: time="2025-09-04T15:49:06.468387683Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Sep 4 15:49:06.468607 containerd[1644]: time="2025-09-04T15:49:06.468399859Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Sep 4 15:49:06.468607 containerd[1644]: time="2025-09-04T15:49:06.468410171Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Sep 4 15:49:06.468607 containerd[1644]: time="2025-09-04T15:49:06.468419132Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Sep 4 15:49:06.468607 containerd[1644]: time="2025-09-04T15:49:06.468435607Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Sep 4 15:49:06.468607 containerd[1644]: time="2025-09-04T15:49:06.468445034Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Sep 4 15:49:06.468607 containerd[1644]: time="2025-09-04T15:49:06.468450046Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Sep 4 15:49:06.470796 containerd[1644]: time="2025-09-04T15:49:06.468455024Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Sep 4 15:49:06.470796 containerd[1644]: time="2025-09-04T15:49:06.468459647Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Sep 4 15:49:06.470796 containerd[1644]: time="2025-09-04T15:49:06.468465019Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Sep 4 15:49:06.470796 containerd[1644]: time="2025-09-04T15:49:06.468470680Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Sep 4 15:49:06.470796 containerd[1644]: time="2025-09-04T15:49:06.468479170Z" level=info msg="runtime interface created"
Sep 4 15:49:06.470796 containerd[1644]: time="2025-09-04T15:49:06.468482052Z" level=info msg="created NRI interface"
Sep 4 15:49:06.470796 containerd[1644]: time="2025-09-04T15:49:06.468487691Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Sep 4 15:49:06.470796 containerd[1644]: time="2025-09-04T15:49:06.468493994Z" level=info msg="Connect containerd service"
Sep 4 15:49:06.470796 containerd[1644]: time="2025-09-04T15:49:06.468508252Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Sep 4 15:49:06.470796 containerd[1644]:
time="2025-09-04T15:49:06.468870256Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 4 15:49:06.494243 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 4 15:49:06.497076 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 4 15:49:06.518090 systemd[1]: issuegen.service: Deactivated successfully. Sep 4 15:49:06.518280 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 4 15:49:06.522438 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 4 15:49:06.547185 tar[1630]: linux-amd64/LICENSE Sep 4 15:49:06.547185 tar[1630]: linux-amd64/README.md Sep 4 15:49:06.552363 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 4 15:49:06.553151 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 4 15:49:06.555270 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 4 15:49:06.558233 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Sep 4 15:49:06.558446 systemd[1]: Reached target getty.target - Login Prompts. 
Sep 4 15:49:06.571224 containerd[1644]: time="2025-09-04T15:49:06.571131675Z" level=info msg="Start subscribing containerd event"
Sep 4 15:49:06.571224 containerd[1644]: time="2025-09-04T15:49:06.571163554Z" level=info msg="Start recovering state"
Sep 4 15:49:06.571319 containerd[1644]: time="2025-09-04T15:49:06.571238420Z" level=info msg="Start event monitor"
Sep 4 15:49:06.571319 containerd[1644]: time="2025-09-04T15:49:06.571250771Z" level=info msg="Start cni network conf syncer for default"
Sep 4 15:49:06.571319 containerd[1644]: time="2025-09-04T15:49:06.571255440Z" level=info msg="Start streaming server"
Sep 4 15:49:06.571319 containerd[1644]: time="2025-09-04T15:49:06.571262242Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Sep 4 15:49:06.571319 containerd[1644]: time="2025-09-04T15:49:06.571267996Z" level=info msg="runtime interface starting up..."
Sep 4 15:49:06.571319 containerd[1644]: time="2025-09-04T15:49:06.571271384Z" level=info msg="starting plugins..."
Sep 4 15:49:06.571319 containerd[1644]: time="2025-09-04T15:49:06.571280178Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
Sep 4 15:49:06.571465 containerd[1644]: time="2025-09-04T15:49:06.571454690Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Sep 4 15:49:06.571760 containerd[1644]: time="2025-09-04T15:49:06.571529088Z" level=info msg=serving... address=/run/containerd/containerd.sock
Sep 4 15:49:06.571634 systemd[1]: Started containerd.service - containerd container runtime.
Sep 4 15:49:06.572320 containerd[1644]: time="2025-09-04T15:49:06.572140954Z" level=info msg="containerd successfully booted in 0.137739s"
Sep 4 15:49:07.110296 systemd-networkd[1532]: ens192: Gained IPv6LL
Sep 4 15:49:07.111620 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Sep 4 15:49:07.112441 systemd[1]: Reached target network-online.target - Network is Online.
Sep 4 15:49:07.113728 systemd[1]: Starting coreos-metadata.service - VMware metadata agent...
Sep 4 15:49:07.116272 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 4 15:49:07.123182 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Sep 4 15:49:07.146095 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Sep 4 15:49:07.155153 systemd[1]: coreos-metadata.service: Deactivated successfully.
Sep 4 15:49:07.155301 systemd[1]: Finished coreos-metadata.service - VMware metadata agent.
Sep 4 15:49:07.155969 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Sep 4 15:49:07.886079 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 4 15:49:07.886587 systemd[1]: Reached target multi-user.target - Multi-User System.
Sep 4 15:49:07.887342 systemd[1]: Startup finished in 2.710s (kernel) + 8.803s (initrd) + 4.234s (userspace) = 15.749s.
Sep 4 15:49:07.903169 (kubelet)[1803]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 4 15:49:07.929857 login[1765]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
Sep 4 15:49:07.932644 login[1767]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
Sep 4 15:49:07.935043 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Sep 4 15:49:07.936083 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Sep 4 15:49:07.942045 systemd-logind[1618]: New session 2 of user core.
Sep 4 15:49:07.948531 systemd-logind[1618]: New session 1 of user core.
Sep 4 15:49:07.952648 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Sep 4 15:49:07.955232 systemd[1]: Starting user@500.service - User Manager for UID 500...
Sep 4 15:49:07.970634 (systemd)[1810]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Sep 4 15:49:07.972541 systemd-logind[1618]: New session c1 of user core.
Sep 4 15:49:08.061696 systemd[1810]: Queued start job for default target default.target.
Sep 4 15:49:08.067910 systemd[1810]: Created slice app.slice - User Application Slice.
Sep 4 15:49:08.067927 systemd[1810]: Reached target paths.target - Paths.
Sep 4 15:49:08.067954 systemd[1810]: Reached target timers.target - Timers.
Sep 4 15:49:08.068917 systemd[1810]: Starting dbus.socket - D-Bus User Message Bus Socket...
Sep 4 15:49:08.078963 systemd[1810]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Sep 4 15:49:08.079079 systemd[1810]: Reached target sockets.target - Sockets.
Sep 4 15:49:08.079162 systemd[1810]: Reached target basic.target - Basic System.
Sep 4 15:49:08.079237 systemd[1810]: Reached target default.target - Main User Target.
Sep 4 15:49:08.079260 systemd[1]: Started user@500.service - User Manager for UID 500.
Sep 4 15:49:08.079882 systemd[1810]: Startup finished in 102ms.
Sep 4 15:49:08.087195 systemd[1]: Started session-1.scope - Session 1 of User core.
Sep 4 15:49:08.087793 systemd[1]: Started session-2.scope - Session 2 of User core.
Sep 4 15:49:08.378135 kubelet[1803]: E0904 15:49:08.378035 1803 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 4 15:49:08.379484 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 4 15:49:08.379571 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 4 15:49:08.379955 systemd[1]: kubelet.service: Consumed 610ms CPU time, 263.8M memory peak.
Sep 4 15:49:18.401698 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Sep 4 15:49:18.402930 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 4 15:49:19.085160 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 4 15:49:19.092377 (kubelet)[1857]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 4 15:49:19.146842 kubelet[1857]: E0904 15:49:19.146804 1857 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 4 15:49:19.149399 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 4 15:49:19.149556 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 4 15:49:19.149935 systemd[1]: kubelet.service: Consumed 104ms CPU time, 108.8M memory peak.
Sep 4 15:49:29.151657 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Sep 4 15:49:29.152931 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 4 15:49:29.340717 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 4 15:49:29.343092 (kubelet)[1872]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 4 15:49:29.382886 kubelet[1872]: E0904 15:49:29.382855 1872 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 4 15:49:29.384372 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 4 15:49:29.384461 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 4 15:49:29.384669 systemd[1]: kubelet.service: Consumed 107ms CPU time, 110.4M memory peak.
Sep 4 15:49:36.310526 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Sep 4 15:49:36.311477 systemd[1]: Started sshd@0-139.178.70.104:22-139.178.89.65:38134.service - OpenSSH per-connection server daemon (139.178.89.65:38134).
Sep 4 15:49:36.356393 sshd[1880]: Accepted publickey for core from 139.178.89.65 port 38134 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs
Sep 4 15:49:36.357155 sshd-session[1880]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:49:36.360272 systemd-logind[1618]: New session 3 of user core.
Sep 4 15:49:36.370273 systemd[1]: Started session-3.scope - Session 3 of User core.
Sep 4 15:49:36.422806 systemd[1]: Started sshd@1-139.178.70.104:22-139.178.89.65:38138.service - OpenSSH per-connection server daemon (139.178.89.65:38138).
Sep 4 15:49:36.460191 sshd[1886]: Accepted publickey for core from 139.178.89.65 port 38138 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs
Sep 4 15:49:36.461068 sshd-session[1886]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:49:36.464033 systemd-logind[1618]: New session 4 of user core.
Sep 4 15:49:36.466204 systemd[1]: Started session-4.scope - Session 4 of User core.
Sep 4 15:49:36.515168 sshd[1889]: Connection closed by 139.178.89.65 port 38138
Sep 4 15:49:36.515856 sshd-session[1886]: pam_unix(sshd:session): session closed for user core
Sep 4 15:49:36.520154 systemd[1]: sshd@1-139.178.70.104:22-139.178.89.65:38138.service: Deactivated successfully.
Sep 4 15:49:36.520939 systemd[1]: session-4.scope: Deactivated successfully.
Sep 4 15:49:36.521417 systemd-logind[1618]: Session 4 logged out. Waiting for processes to exit.
Sep 4 15:49:36.522494 systemd[1]: Started sshd@2-139.178.70.104:22-139.178.89.65:38144.service - OpenSSH per-connection server daemon (139.178.89.65:38144).
Sep 4 15:49:36.523235 systemd-logind[1618]: Removed session 4.
Sep 4 15:49:36.553392 sshd[1895]: Accepted publickey for core from 139.178.89.65 port 38144 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs
Sep 4 15:49:36.554194 sshd-session[1895]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:49:36.556636 systemd-logind[1618]: New session 5 of user core.
Sep 4 15:49:36.565238 systemd[1]: Started session-5.scope - Session 5 of User core.
Sep 4 15:49:36.611648 sshd[1898]: Connection closed by 139.178.89.65 port 38144
Sep 4 15:49:36.611964 sshd-session[1895]: pam_unix(sshd:session): session closed for user core
Sep 4 15:49:36.618904 systemd[1]: sshd@2-139.178.70.104:22-139.178.89.65:38144.service: Deactivated successfully.
Sep 4 15:49:36.619955 systemd[1]: session-5.scope: Deactivated successfully.
Sep 4 15:49:36.620969 systemd-logind[1618]: Session 5 logged out. Waiting for processes to exit.
Sep 4 15:49:36.621683 systemd[1]: Started sshd@3-139.178.70.104:22-139.178.89.65:38146.service - OpenSSH per-connection server daemon (139.178.89.65:38146).
Sep 4 15:49:36.623296 systemd-logind[1618]: Removed session 5.
Sep 4 15:49:36.651988 sshd[1904]: Accepted publickey for core from 139.178.89.65 port 38146 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs
Sep 4 15:49:36.652822 sshd-session[1904]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:49:36.656154 systemd-logind[1618]: New session 6 of user core.
Sep 4 15:49:36.665281 systemd[1]: Started session-6.scope - Session 6 of User core.
Sep 4 15:49:36.712837 sshd[1907]: Connection closed by 139.178.89.65 port 38146
Sep 4 15:49:36.713111 sshd-session[1904]: pam_unix(sshd:session): session closed for user core
Sep 4 15:49:36.721021 systemd[1]: sshd@3-139.178.70.104:22-139.178.89.65:38146.service: Deactivated successfully.
Sep 4 15:49:36.721954 systemd[1]: session-6.scope: Deactivated successfully.
Sep 4 15:49:36.722434 systemd-logind[1618]: Session 6 logged out. Waiting for processes to exit.
Sep 4 15:49:36.723500 systemd[1]: Started sshd@4-139.178.70.104:22-139.178.89.65:38162.service - OpenSSH per-connection server daemon (139.178.89.65:38162).
Sep 4 15:49:36.725266 systemd-logind[1618]: Removed session 6.
Sep 4 15:49:36.752154 sshd[1913]: Accepted publickey for core from 139.178.89.65 port 38162 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs
Sep 4 15:49:36.752862 sshd-session[1913]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:49:36.755364 systemd-logind[1618]: New session 7 of user core.
Sep 4 15:49:36.765195 systemd[1]: Started session-7.scope - Session 7 of User core.
Sep 4 15:49:36.872905 sudo[1917]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Sep 4 15:49:36.873138 sudo[1917]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 4 15:49:36.884585 sudo[1917]: pam_unix(sudo:session): session closed for user root
Sep 4 15:49:36.885548 sshd[1916]: Connection closed by 139.178.89.65 port 38162
Sep 4 15:49:36.886401 sshd-session[1913]: pam_unix(sshd:session): session closed for user core
Sep 4 15:49:36.893780 systemd[1]: sshd@4-139.178.70.104:22-139.178.89.65:38162.service: Deactivated successfully.
Sep 4 15:49:36.895024 systemd[1]: session-7.scope: Deactivated successfully.
Sep 4 15:49:36.895938 systemd-logind[1618]: Session 7 logged out. Waiting for processes to exit.
Sep 4 15:49:36.898494 systemd[1]: Started sshd@5-139.178.70.104:22-139.178.89.65:38164.service - OpenSSH per-connection server daemon (139.178.89.65:38164).
Sep 4 15:49:36.899311 systemd-logind[1618]: Removed session 7.
Sep 4 15:49:36.931569 sshd[1923]: Accepted publickey for core from 139.178.89.65 port 38164 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs
Sep 4 15:49:36.932422 sshd-session[1923]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:49:36.936158 systemd-logind[1618]: New session 8 of user core.
Sep 4 15:49:36.945217 systemd[1]: Started session-8.scope - Session 8 of User core.
Sep 4 15:49:36.996036 sudo[1928]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Sep 4 15:49:36.996256 sudo[1928]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 4 15:49:37.000651 sudo[1928]: pam_unix(sudo:session): session closed for user root
Sep 4 15:49:37.004553 sudo[1927]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Sep 4 15:49:37.004925 sudo[1927]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 4 15:49:37.012828 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 4 15:49:37.039808 augenrules[1950]: No rules
Sep 4 15:49:37.040465 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 4 15:49:37.040735 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 4 15:49:37.041560 sudo[1927]: pam_unix(sudo:session): session closed for user root
Sep 4 15:49:37.042490 sshd[1926]: Connection closed by 139.178.89.65 port 38164
Sep 4 15:49:37.043598 sshd-session[1923]: pam_unix(sshd:session): session closed for user core
Sep 4 15:49:37.048228 systemd[1]: sshd@5-139.178.70.104:22-139.178.89.65:38164.service: Deactivated successfully.
Sep 4 15:49:37.049153 systemd[1]: session-8.scope: Deactivated successfully.
Sep 4 15:49:37.049687 systemd-logind[1618]: Session 8 logged out. Waiting for processes to exit.
Sep 4 15:49:37.050926 systemd[1]: Started sshd@6-139.178.70.104:22-139.178.89.65:38174.service - OpenSSH per-connection server daemon (139.178.89.65:38174).
Sep 4 15:49:37.052358 systemd-logind[1618]: Removed session 8.
Sep 4 15:49:37.083402 sshd[1959]: Accepted publickey for core from 139.178.89.65 port 38174 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs
Sep 4 15:49:37.084281 sshd-session[1959]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:49:37.087150 systemd-logind[1618]: New session 9 of user core.
Sep 4 15:49:37.098293 systemd[1]: Started session-9.scope - Session 9 of User core.
Sep 4 15:49:37.146634 sudo[1963]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Sep 4 15:49:37.147494 sudo[1963]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 4 15:49:37.473502 systemd[1]: Starting docker.service - Docker Application Container Engine...
Sep 4 15:49:37.482461 (dockerd)[1981]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Sep 4 15:49:37.777583 dockerd[1981]: time="2025-09-04T15:49:37.777511966Z" level=info msg="Starting up"
Sep 4 15:49:37.778375 dockerd[1981]: time="2025-09-04T15:49:37.778360418Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Sep 4 15:49:37.786761 dockerd[1981]: time="2025-09-04T15:49:37.786724757Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s
Sep 4 15:49:37.808922 dockerd[1981]: time="2025-09-04T15:49:37.808899454Z" level=info msg="Loading containers: start."
Sep 4 15:49:37.817134 kernel: Initializing XFRM netlink socket
Sep 4 15:49:37.958632 systemd-networkd[1532]: docker0: Link UP
Sep 4 15:49:37.959597 dockerd[1981]: time="2025-09-04T15:49:37.959575210Z" level=info msg="Loading containers: done."
Sep 4 15:49:37.968052 dockerd[1981]: time="2025-09-04T15:49:37.968022626Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Sep 4 15:49:37.968159 dockerd[1981]: time="2025-09-04T15:49:37.968081582Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4
Sep 4 15:49:37.968159 dockerd[1981]: time="2025-09-04T15:49:37.968138499Z" level=info msg="Initializing buildkit"
Sep 4 15:49:37.977710 dockerd[1981]: time="2025-09-04T15:49:37.977687933Z" level=info msg="Completed buildkit initialization"
Sep 4 15:49:37.984496 dockerd[1981]: time="2025-09-04T15:49:37.984471772Z" level=info msg="Daemon has completed initialization"
Sep 4 15:49:37.984542 dockerd[1981]: time="2025-09-04T15:49:37.984515117Z" level=info msg="API listen on /run/docker.sock"
Sep 4 15:49:37.984646 systemd[1]: Started docker.service - Docker Application Container Engine.
Sep 4 15:49:38.736978 containerd[1644]: time="2025-09-04T15:49:38.736953825Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.12\""
Sep 4 15:49:39.266030 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1234109353.mount: Deactivated successfully.
Sep 4 15:49:39.401890 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Sep 4 15:49:39.404217 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 4 15:49:39.759896 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 4 15:49:39.763387 (kubelet)[2250]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 4 15:49:39.799959 kubelet[2250]: E0904 15:49:39.799912 2250 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 4 15:49:39.800801 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 4 15:49:39.800887 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 4 15:49:39.801192 systemd[1]: kubelet.service: Consumed 103ms CPU time, 107.6M memory peak. Sep 4 15:49:40.430929 containerd[1644]: time="2025-09-04T15:49:40.430905826Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:49:40.431789 containerd[1644]: time="2025-09-04T15:49:40.431775991Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.12: active requests=0, bytes read=28079631" Sep 4 15:49:40.432167 containerd[1644]: time="2025-09-04T15:49:40.432147752Z" level=info msg="ImageCreate event name:\"sha256:b1963c5b49c1722b8f408deaf83aafca7f48f47fed0ed14e5c10e93cc55974a7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:49:40.434123 containerd[1644]: time="2025-09-04T15:49:40.433631829Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:e9011c3bee8c06ecabd7816e119dca4e448c92f7a78acd891de3d2db1dc6c234\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:49:40.434295 containerd[1644]: time="2025-09-04T15:49:40.434111124Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.12\" with image id 
\"sha256:b1963c5b49c1722b8f408deaf83aafca7f48f47fed0ed14e5c10e93cc55974a7\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.12\", repo digest \"registry.k8s.io/kube-apiserver@sha256:e9011c3bee8c06ecabd7816e119dca4e448c92f7a78acd891de3d2db1dc6c234\", size \"28076431\" in 1.697136066s" Sep 4 15:49:40.434340 containerd[1644]: time="2025-09-04T15:49:40.434332190Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.12\" returns image reference \"sha256:b1963c5b49c1722b8f408deaf83aafca7f48f47fed0ed14e5c10e93cc55974a7\"" Sep 4 15:49:40.434718 containerd[1644]: time="2025-09-04T15:49:40.434709779Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.12\"" Sep 4 15:49:41.807779 containerd[1644]: time="2025-09-04T15:49:41.807739891Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:49:41.814058 containerd[1644]: time="2025-09-04T15:49:41.814041444Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.12: active requests=0, bytes read=24714681" Sep 4 15:49:41.819433 containerd[1644]: time="2025-09-04T15:49:41.819404074Z" level=info msg="ImageCreate event name:\"sha256:200c1a99a6f2b9d3b0a6e9b7362663513589341e0e58bc3b953a373efa735dfd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:49:41.826845 containerd[1644]: time="2025-09-04T15:49:41.826793161Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d2862f94d87320267fddbd55db26556a267aa802e51d6b60f25786b4c428afc8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:49:41.827600 containerd[1644]: time="2025-09-04T15:49:41.827499591Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.12\" with image id \"sha256:200c1a99a6f2b9d3b0a6e9b7362663513589341e0e58bc3b953a373efa735dfd\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.12\", repo digest 
\"registry.k8s.io/kube-controller-manager@sha256:d2862f94d87320267fddbd55db26556a267aa802e51d6b60f25786b4c428afc8\", size \"26317875\" in 1.392731785s" Sep 4 15:49:41.827600 containerd[1644]: time="2025-09-04T15:49:41.827520544Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.12\" returns image reference \"sha256:200c1a99a6f2b9d3b0a6e9b7362663513589341e0e58bc3b953a373efa735dfd\"" Sep 4 15:49:41.828013 containerd[1644]: time="2025-09-04T15:49:41.827956206Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.12\"" Sep 4 15:49:43.038280 containerd[1644]: time="2025-09-04T15:49:43.037814682Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:49:43.038744 containerd[1644]: time="2025-09-04T15:49:43.038733054Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.12: active requests=0, bytes read=18782427" Sep 4 15:49:43.039012 containerd[1644]: time="2025-09-04T15:49:43.039001271Z" level=info msg="ImageCreate event name:\"sha256:bcdd9599681a9460a5539177a986dbdaf880ac56eeb117ab94adb8f37889efba\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:49:43.040408 containerd[1644]: time="2025-09-04T15:49:43.040396296Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:152943b7e30244f4415fd0a5860a2dccd91660fe983d30a28a10edb0cc8f6756\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:49:43.040944 containerd[1644]: time="2025-09-04T15:49:43.040933240Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.12\" with image id \"sha256:bcdd9599681a9460a5539177a986dbdaf880ac56eeb117ab94adb8f37889efba\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.12\", repo digest \"registry.k8s.io/kube-scheduler@sha256:152943b7e30244f4415fd0a5860a2dccd91660fe983d30a28a10edb0cc8f6756\", size \"20385639\" in 1.212866039s" Sep 4 15:49:43.040990 
containerd[1644]: time="2025-09-04T15:49:43.040982597Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.12\" returns image reference \"sha256:bcdd9599681a9460a5539177a986dbdaf880ac56eeb117ab94adb8f37889efba\"" Sep 4 15:49:43.041370 containerd[1644]: time="2025-09-04T15:49:43.041352647Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.12\"" Sep 4 15:49:43.971023 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2633882695.mount: Deactivated successfully. Sep 4 15:49:44.295793 containerd[1644]: time="2025-09-04T15:49:44.295300881Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:49:44.300452 containerd[1644]: time="2025-09-04T15:49:44.300428206Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.12: active requests=0, bytes read=30384255" Sep 4 15:49:44.306360 containerd[1644]: time="2025-09-04T15:49:44.306330263Z" level=info msg="ImageCreate event name:\"sha256:507cc52f5f78c0cff25e904c76c18e6bfc90982e9cc2aa4dcb19033f21c3f679\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:49:44.314507 containerd[1644]: time="2025-09-04T15:49:44.314480635Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:90aa6b5f4065937521ff8438bc705317485d0be3f8b00a07145e697d92cc2cc6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:49:44.314897 containerd[1644]: time="2025-09-04T15:49:44.314767400Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.12\" with image id \"sha256:507cc52f5f78c0cff25e904c76c18e6bfc90982e9cc2aa4dcb19033f21c3f679\", repo tag \"registry.k8s.io/kube-proxy:v1.31.12\", repo digest \"registry.k8s.io/kube-proxy@sha256:90aa6b5f4065937521ff8438bc705317485d0be3f8b00a07145e697d92cc2cc6\", size \"30383274\" in 1.273346077s" Sep 4 15:49:44.314897 containerd[1644]: time="2025-09-04T15:49:44.314791123Z" level=info msg="PullImage 
\"registry.k8s.io/kube-proxy:v1.31.12\" returns image reference \"sha256:507cc52f5f78c0cff25e904c76c18e6bfc90982e9cc2aa4dcb19033f21c3f679\""
Sep 4 15:49:44.315147 containerd[1644]: time="2025-09-04T15:49:44.315130499Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Sep 4 15:49:44.872428 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1476657259.mount: Deactivated successfully.
Sep 4 15:49:45.567883 containerd[1644]: time="2025-09-04T15:49:45.567831544Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:49:45.568351 containerd[1644]: time="2025-09-04T15:49:45.568338518Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241"
Sep 4 15:49:45.568824 containerd[1644]: time="2025-09-04T15:49:45.568612165Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:49:45.569920 containerd[1644]: time="2025-09-04T15:49:45.569908682Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:49:45.570488 containerd[1644]: time="2025-09-04T15:49:45.570472215Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.255319972s"
Sep 4 15:49:45.570519 containerd[1644]: time="2025-09-04T15:49:45.570489721Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\""
Sep 4 15:49:45.570941 containerd[1644]: time="2025-09-04T15:49:45.570928595Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Sep 4 15:49:46.093735 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1715962204.mount: Deactivated successfully.
Sep 4 15:49:46.096048 containerd[1644]: time="2025-09-04T15:49:46.095722407Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 4 15:49:46.096296 containerd[1644]: time="2025-09-04T15:49:46.096286611Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138"
Sep 4 15:49:46.096685 containerd[1644]: time="2025-09-04T15:49:46.096674006Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 4 15:49:46.097635 containerd[1644]: time="2025-09-04T15:49:46.097616563Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 4 15:49:46.098045 containerd[1644]: time="2025-09-04T15:49:46.098032554Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 527.089177ms"
Sep 4 15:49:46.098100 containerd[1644]: time="2025-09-04T15:49:46.098092124Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
Sep 4 15:49:46.098464 containerd[1644]: time="2025-09-04T15:49:46.098454750Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\""
Sep 4 15:49:46.689876 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2534964941.mount: Deactivated successfully.
Sep 4 15:49:48.760750 containerd[1644]: time="2025-09-04T15:49:48.760294027Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:49:48.760750 containerd[1644]: time="2025-09-04T15:49:48.760706515Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56910709"
Sep 4 15:49:48.761009 containerd[1644]: time="2025-09-04T15:49:48.760906622Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:49:48.762614 containerd[1644]: time="2025-09-04T15:49:48.762602721Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:49:48.763729 containerd[1644]: time="2025-09-04T15:49:48.763717557Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 2.665201804s"
Sep 4 15:49:48.763796 containerd[1644]: time="2025-09-04T15:49:48.763786193Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\""
Sep 4 15:49:49.901850 systemd[1]: kubelet.service: Scheduled
restart job, restart counter is at 4.
Sep 4 15:49:49.903154 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 4 15:49:50.213381 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 4 15:49:50.221287 (kubelet)[2414]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 4 15:49:50.277453 kubelet[2414]: E0904 15:49:50.277421 2414 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 4 15:49:50.279311 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 4 15:49:50.279394 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 4 15:49:50.280316 systemd[1]: kubelet.service: Consumed 93ms CPU time, 109.2M memory peak.
Sep 4 15:49:51.151952 update_engine[1621]: I20250904 15:49:51.151153 1621 update_attempter.cc:509] Updating boot flags...
Sep 4 15:49:51.298974 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 4 15:49:51.299089 systemd[1]: kubelet.service: Consumed 93ms CPU time, 109.2M memory peak.
Sep 4 15:49:51.301313 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 4 15:49:51.322765 systemd[1]: Reload requested from client PID 2448 ('systemctl') (unit session-9.scope)...
Sep 4 15:49:51.322774 systemd[1]: Reloading...
Sep 4 15:49:51.397139 zram_generator::config[2495]: No configuration found.
Sep 4 15:49:51.469098 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Sep 4 15:49:51.537291 systemd[1]: Reloading finished in 214 ms.
Sep 4 15:49:51.574979 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Sep 4 15:49:51.575040 systemd[1]: kubelet.service: Failed with result 'signal'.
Sep 4 15:49:51.575243 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 4 15:49:51.576420 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 4 15:49:51.911997 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 4 15:49:51.916357 (kubelet)[2559]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 4 15:49:51.948620 kubelet[2559]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 4 15:49:51.948620 kubelet[2559]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Sep 4 15:49:51.948620 kubelet[2559]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 4 15:49:51.948849 kubelet[2559]: I0904 15:49:51.948666 2559 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 4 15:49:52.496138 kubelet[2559]: I0904 15:49:52.495812 2559 server.go:491] "Kubelet version" kubeletVersion="v1.31.8"
Sep 4 15:49:52.496138 kubelet[2559]: I0904 15:49:52.495837 2559 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 4 15:49:52.496138 kubelet[2559]: I0904 15:49:52.496005 2559 server.go:934] "Client rotation is on, will bootstrap in background"
Sep 4 15:49:52.523965 kubelet[2559]: I0904 15:49:52.523948 2559 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 4 15:49:52.525074 kubelet[2559]: E0904 15:49:52.524804 2559 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://139.178.70.104:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.70.104:6443: connect: connection refused" logger="UnhandledError"
Sep 4 15:49:52.538048 kubelet[2559]: I0904 15:49:52.538026 2559 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 4 15:49:52.541650 kubelet[2559]: I0904 15:49:52.541584 2559 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 4 15:49:52.543595 kubelet[2559]: I0904 15:49:52.543553 2559 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 4 15:49:52.544060 kubelet[2559]: I0904 15:49:52.543759 2559 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 4 15:49:52.544060 kubelet[2559]: I0904 15:49:52.543787 2559 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 4 15:49:52.544060 kubelet[2559]: I0904 15:49:52.543940 2559 topology_manager.go:138] "Creating topology manager with none policy"
Sep 4 15:49:52.544060 kubelet[2559]: I0904 15:49:52.543947 2559 container_manager_linux.go:300] "Creating device plugin manager"
Sep 4 15:49:52.544421 kubelet[2559]: I0904 15:49:52.544414 2559 state_mem.go:36] "Initialized new in-memory state store"
Sep 4 15:49:52.548035 kubelet[2559]: I0904 15:49:52.548010 2559 kubelet.go:408] "Attempting to sync node with API server"
Sep 4 15:49:52.548160 kubelet[2559]: I0904 15:49:52.548154 2559 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 4 15:49:52.549364 kubelet[2559]: I0904 15:49:52.549349 2559 kubelet.go:314] "Adding apiserver pod source"
Sep 4 15:49:52.549456 kubelet[2559]: I0904 15:49:52.549449 2559 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 4 15:49:52.552933 kubelet[2559]: W0904 15:49:52.552895 2559 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.104:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused
Sep 4 15:49:52.553012 kubelet[2559]: E0904 15:49:52.552936 2559 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.104:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.104:6443: connect: connection refused" logger="UnhandledError"
Sep 4 15:49:52.554136 kubelet[2559]: W0904 15:49:52.553739 2559 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.104:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused
Sep 4 15:49:52.554136 kubelet[2559]: E0904 15:49:52.553766 2559 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.104:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.104:6443: connect: connection refused" logger="UnhandledError"
Sep 4 15:49:52.554136 kubelet[2559]: I0904 15:49:52.553828 2559 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 4 15:49:52.556196 kubelet[2559]: I0904 15:49:52.556071 2559 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 4 15:49:52.556551 kubelet[2559]: W0904 15:49:52.556542 2559 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 4 15:49:52.556969 kubelet[2559]: I0904 15:49:52.556961 2559 server.go:1274] "Started kubelet"
Sep 4 15:49:52.557906 kubelet[2559]: I0904 15:49:52.557529 2559 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Sep 4 15:49:52.559003 kubelet[2559]: I0904 15:49:52.558941 2559 server.go:449] "Adding debug handlers to kubelet server"
Sep 4 15:49:52.562073 kubelet[2559]: I0904 15:49:52.562050 2559 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 4 15:49:52.562241 kubelet[2559]: I0904 15:49:52.562232 2559 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 4 15:49:52.562308 kubelet[2559]: I0904 15:49:52.562238 2559 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 4 15:49:52.567787 kubelet[2559]: E0904 15:49:52.565103 2559 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.104:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.104:6443: connect: connection
refused\" event="&Event{ObjectMeta:{localhost.18621f17ff5d7b9f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-09-04 15:49:52.556940191 +0000 UTC m=+0.637777550,LastTimestamp:2025-09-04 15:49:52.556940191 +0000 UTC m=+0.637777550,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
Sep 4 15:49:52.567787 kubelet[2559]: I0904 15:49:52.567739 2559 volume_manager.go:289] "Starting Kubelet Volume Manager"
Sep 4 15:49:52.567927 kubelet[2559]: I0904 15:49:52.562312 2559 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 4 15:49:52.567953 kubelet[2559]: I0904 15:49:52.567945 2559 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Sep 4 15:49:52.568319 kubelet[2559]: I0904 15:49:52.567993 2559 reconciler.go:26] "Reconciler: start to sync state"
Sep 4 15:49:52.568456 kubelet[2559]: W0904 15:49:52.568413 2559 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.104:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused
Sep 4 15:49:52.568456 kubelet[2559]: E0904 15:49:52.568444 2559 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.104:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.104:6443: connect: connection refused" logger="UnhandledError"
Sep 4 15:49:52.568522 kubelet[2559]: E0904 15:49:52.568480 2559 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 4 15:49:52.568704 kubelet[2559]: E0904 15:49:52.568676 2559 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.104:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.104:6443: connect: connection refused" interval="200ms"
Sep 4 15:49:52.575670 kubelet[2559]: I0904 15:49:52.575571 2559 factory.go:221] Registration of the systemd container factory successfully
Sep 4 15:49:52.575670 kubelet[2559]: I0904 15:49:52.575640 2559 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 4 15:49:52.576663 kubelet[2559]: E0904 15:49:52.576462 2559 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 4 15:49:52.576795 kubelet[2559]: I0904 15:49:52.576786 2559 factory.go:221] Registration of the containerd container factory successfully
Sep 4 15:49:52.588635 kubelet[2559]: I0904 15:49:52.588616 2559 cpu_manager.go:214] "Starting CPU manager" policy="none"
Sep 4 15:49:52.588635 kubelet[2559]: I0904 15:49:52.588629 2559 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Sep 4 15:49:52.588635 kubelet[2559]: I0904 15:49:52.588638 2559 state_mem.go:36] "Initialized new in-memory state store"
Sep 4 15:49:52.589420 kubelet[2559]: I0904 15:49:52.589389 2559 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 4 15:49:52.590446 kubelet[2559]: I0904 15:49:52.590190 2559 policy_none.go:49] "None policy: Start"
Sep 4 15:49:52.590575 kubelet[2559]: I0904 15:49:52.590562 2559 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 4 15:49:52.590882 kubelet[2559]: I0904 15:49:52.590862 2559 status_manager.go:217] "Starting to sync pod status with apiserver"
Sep 4 15:49:52.590953 kubelet[2559]: I0904 15:49:52.590946 2559 kubelet.go:2321] "Starting kubelet main sync loop"
Sep 4 15:49:52.591249 kubelet[2559]: E0904 15:49:52.591187 2559 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 4 15:49:52.591249 kubelet[2559]: I0904 15:49:52.591053 2559 memory_manager.go:170] "Starting memorymanager" policy="None"
Sep 4 15:49:52.591249 kubelet[2559]: I0904 15:49:52.591213 2559 state_mem.go:35] "Initializing new in-memory state store"
Sep 4 15:49:52.595546 kubelet[2559]: W0904 15:49:52.595384 2559 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.104:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused
Sep 4 15:49:52.595546 kubelet[2559]: E0904 15:49:52.595412 2559 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.104:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.104:6443: connect: connection refused" logger="UnhandledError"
Sep 4 15:49:52.598938 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Sep 4 15:49:52.608750 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Sep 4 15:49:52.611357 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Sep 4 15:49:52.618015 kubelet[2559]: I0904 15:49:52.617955 2559 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Sep 4 15:49:52.618522 kubelet[2559]: I0904 15:49:52.618251 2559 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 4 15:49:52.618522 kubelet[2559]: I0904 15:49:52.618262 2559 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 4 15:49:52.618522 kubelet[2559]: I0904 15:49:52.618485 2559 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 4 15:49:52.619537 kubelet[2559]: E0904 15:49:52.619524 2559 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found"
Sep 4 15:49:52.700643 systemd[1]: Created slice kubepods-burstable-pod44e749fee0a39cea49d1cbc530d7723b.slice - libcontainer container kubepods-burstable-pod44e749fee0a39cea49d1cbc530d7723b.slice.
Sep 4 15:49:52.720268 systemd[1]: Created slice kubepods-burstable-podfec3f691a145cb26ff55e4af388500b7.slice - libcontainer container kubepods-burstable-podfec3f691a145cb26ff55e4af388500b7.slice.
Sep 4 15:49:52.723131 kubelet[2559]: I0904 15:49:52.723094 2559 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Sep 4 15:49:52.723407 kubelet[2559]: E0904 15:49:52.723393 2559 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.104:6443/api/v1/nodes\": dial tcp 139.178.70.104:6443: connect: connection refused" node="localhost"
Sep 4 15:49:52.727064 systemd[1]: Created slice kubepods-burstable-pod5dc878868de11c6196259ae42039f4ff.slice - libcontainer container kubepods-burstable-pod5dc878868de11c6196259ae42039f4ff.slice.
Sep 4 15:49:52.769728 kubelet[2559]: I0904 15:49:52.769549 2559 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/44e749fee0a39cea49d1cbc530d7723b-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"44e749fee0a39cea49d1cbc530d7723b\") " pod="kube-system/kube-apiserver-localhost"
Sep 4 15:49:52.769728 kubelet[2559]: I0904 15:49:52.769590 2559 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/44e749fee0a39cea49d1cbc530d7723b-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"44e749fee0a39cea49d1cbc530d7723b\") " pod="kube-system/kube-apiserver-localhost"
Sep 4 15:49:52.769728 kubelet[2559]: I0904 15:49:52.769610 2559 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost"
Sep 4 15:49:52.769728 kubelet[2559]: I0904 15:49:52.769630 2559 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost"
Sep 4 15:49:52.769728 kubelet[2559]: I0904 15:49:52.769646 2559 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/44e749fee0a39cea49d1cbc530d7723b-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"44e749fee0a39cea49d1cbc530d7723b\") " pod="kube-system/kube-apiserver-localhost"
Sep 4 15:49:52.769860 kubelet[2559]: I0904 15:49:52.769661 2559 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost"
Sep 4 15:49:52.769860 kubelet[2559]: I0904 15:49:52.769706 2559 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost"
Sep 4 15:49:52.769860 kubelet[2559]: I0904 15:49:52.769721 2559 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost"
Sep 4 15:49:52.769860 kubelet[2559]: I0904 15:49:52.769736 2559 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5dc878868de11c6196259ae42039f4ff-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"5dc878868de11c6196259ae42039f4ff\") " pod="kube-system/kube-scheduler-localhost"
Sep 4 15:49:52.772027 kubelet[2559]: E0904 15:49:52.772001 2559 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.104:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.104:6443: connect: connection refused" interval="400ms"
Sep 4 15:49:52.924525 kubelet[2559]: I0904 15:49:52.924488 2559 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Sep 4 15:49:52.924893 kubelet[2559]: E0904 15:49:52.924878 2559 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.104:6443/api/v1/nodes\": dial tcp 139.178.70.104:6443: connect: connection refused" node="localhost"
Sep 4 15:49:53.018919 containerd[1644]: time="2025-09-04T15:49:53.018886168Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:44e749fee0a39cea49d1cbc530d7723b,Namespace:kube-system,Attempt:0,}"
Sep 4 15:49:53.031202 containerd[1644]: time="2025-09-04T15:49:53.030835632Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:fec3f691a145cb26ff55e4af388500b7,Namespace:kube-system,Attempt:0,}"
Sep 4 15:49:53.034906 containerd[1644]: time="2025-09-04T15:49:53.034875087Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:5dc878868de11c6196259ae42039f4ff,Namespace:kube-system,Attempt:0,}"
Sep 4 15:49:53.144839 containerd[1644]: time="2025-09-04T15:49:53.144620832Z" level=info msg="connecting to shim 193a2747dfe2702e138038dbdd73d1262c84f8f9237a5219731e429bc41be64e" address="unix:///run/containerd/s/9c153e442701a7985da901aad8ab6d1031a03be9d17a2bddd7950ffe0919aa5e" namespace=k8s.io protocol=ttrpc version=3
Sep 4 15:49:53.151241 containerd[1644]: time="2025-09-04T15:49:53.151172739Z" level=info msg="connecting to shim c655c3007cdf11667af3a10d85a073b5b9446197cc133e0d0197f9e19dd7891e" address="unix:///run/containerd/s/6ff4bc2e98c247a8a8e0bb7ee4e31c8bdb2297091216c5dadb46948ac56d584d" namespace=k8s.io protocol=ttrpc version=3
Sep 4 15:49:53.161815 containerd[1644]: time="2025-09-04T15:49:53.161505883Z" level=info msg="connecting to shim d1a726b3dc535408cb301136cf243b72a29614f2e23baed8fe6242ad71ffd50b" address="unix:///run/containerd/s/515ef9991eeae8249debfb18db8028cf6e527142f4f0fd0f7caec505daab2f03" namespace=k8s.io
protocol=ttrpc version=3 Sep 4 15:49:53.172513 kubelet[2559]: E0904 15:49:53.172486 2559 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.104:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.104:6443: connect: connection refused" interval="800ms" Sep 4 15:49:53.324288 systemd[1]: Started cri-containerd-193a2747dfe2702e138038dbdd73d1262c84f8f9237a5219731e429bc41be64e.scope - libcontainer container 193a2747dfe2702e138038dbdd73d1262c84f8f9237a5219731e429bc41be64e. Sep 4 15:49:53.326227 systemd[1]: Started cri-containerd-c655c3007cdf11667af3a10d85a073b5b9446197cc133e0d0197f9e19dd7891e.scope - libcontainer container c655c3007cdf11667af3a10d85a073b5b9446197cc133e0d0197f9e19dd7891e. Sep 4 15:49:53.327898 systemd[1]: Started cri-containerd-d1a726b3dc535408cb301136cf243b72a29614f2e23baed8fe6242ad71ffd50b.scope - libcontainer container d1a726b3dc535408cb301136cf243b72a29614f2e23baed8fe6242ad71ffd50b. 
Sep 4 15:49:53.335411 kubelet[2559]: I0904 15:49:53.335074 2559 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Sep 4 15:49:53.335411 kubelet[2559]: E0904 15:49:53.335374 2559 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.104:6443/api/v1/nodes\": dial tcp 139.178.70.104:6443: connect: connection refused" node="localhost"
Sep 4 15:49:53.395033 kubelet[2559]: W0904 15:49:53.394998 2559 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.104:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused
Sep 4 15:49:53.395033 kubelet[2559]: E0904 15:49:53.395035 2559 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.104:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.104:6443: connect: connection refused" logger="UnhandledError"
Sep 4 15:49:53.398735 containerd[1644]: time="2025-09-04T15:49:53.398714577Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:44e749fee0a39cea49d1cbc530d7723b,Namespace:kube-system,Attempt:0,} returns sandbox id \"c655c3007cdf11667af3a10d85a073b5b9446197cc133e0d0197f9e19dd7891e\""
Sep 4 15:49:53.402054 containerd[1644]: time="2025-09-04T15:49:53.402026323Z" level=info msg="CreateContainer within sandbox \"c655c3007cdf11667af3a10d85a073b5b9446197cc133e0d0197f9e19dd7891e\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Sep 4 15:49:53.422493 containerd[1644]: time="2025-09-04T15:49:53.422469633Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:5dc878868de11c6196259ae42039f4ff,Namespace:kube-system,Attempt:0,} returns sandbox id \"193a2747dfe2702e138038dbdd73d1262c84f8f9237a5219731e429bc41be64e\""
Sep 4 15:49:53.424094 containerd[1644]: time="2025-09-04T15:49:53.424072296Z" level=info msg="CreateContainer within sandbox \"193a2747dfe2702e138038dbdd73d1262c84f8f9237a5219731e429bc41be64e\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Sep 4 15:49:53.445996 containerd[1644]: time="2025-09-04T15:49:53.445960157Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:fec3f691a145cb26ff55e4af388500b7,Namespace:kube-system,Attempt:0,} returns sandbox id \"d1a726b3dc535408cb301136cf243b72a29614f2e23baed8fe6242ad71ffd50b\""
Sep 4 15:49:53.447368 containerd[1644]: time="2025-09-04T15:49:53.447349320Z" level=info msg="CreateContainer within sandbox \"d1a726b3dc535408cb301136cf243b72a29614f2e23baed8fe6242ad71ffd50b\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Sep 4 15:49:53.506905 containerd[1644]: time="2025-09-04T15:49:53.506876054Z" level=info msg="Container e3969df529fe21335dd80478ae77e903880a6039d5c118fc5dcfe3e8739d229e: CDI devices from CRI Config.CDIDevices: []"
Sep 4 15:49:53.508991 containerd[1644]: time="2025-09-04T15:49:53.508720372Z" level=info msg="Container a390900af61ab3836d2250d6516766d484547da6a9167758e50d99e863d1062e: CDI devices from CRI Config.CDIDevices: []"
Sep 4 15:49:53.509417 containerd[1644]: time="2025-09-04T15:49:53.509404003Z" level=info msg="Container c6795915587de7718fae8b66ea3d6402b93874fb92d0c43c9c1f3a6b0389cb58: CDI devices from CRI Config.CDIDevices: []"
Sep 4 15:49:53.519090 containerd[1644]: time="2025-09-04T15:49:53.519063807Z" level=info msg="CreateContainer within sandbox \"d1a726b3dc535408cb301136cf243b72a29614f2e23baed8fe6242ad71ffd50b\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"a390900af61ab3836d2250d6516766d484547da6a9167758e50d99e863d1062e\""
Sep 4 15:49:53.519721 containerd[1644]: time="2025-09-04T15:49:53.519708606Z" level=info msg="StartContainer for \"a390900af61ab3836d2250d6516766d484547da6a9167758e50d99e863d1062e\""
Sep 4 15:49:53.521615 containerd[1644]: time="2025-09-04T15:49:53.521597438Z" level=info msg="CreateContainer within sandbox \"c655c3007cdf11667af3a10d85a073b5b9446197cc133e0d0197f9e19dd7891e\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"e3969df529fe21335dd80478ae77e903880a6039d5c118fc5dcfe3e8739d229e\""
Sep 4 15:49:53.522059 containerd[1644]: time="2025-09-04T15:49:53.522033661Z" level=info msg="CreateContainer within sandbox \"193a2747dfe2702e138038dbdd73d1262c84f8f9237a5219731e429bc41be64e\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"c6795915587de7718fae8b66ea3d6402b93874fb92d0c43c9c1f3a6b0389cb58\""
Sep 4 15:49:53.522396 containerd[1644]: time="2025-09-04T15:49:53.522321738Z" level=info msg="connecting to shim a390900af61ab3836d2250d6516766d484547da6a9167758e50d99e863d1062e" address="unix:///run/containerd/s/515ef9991eeae8249debfb18db8028cf6e527142f4f0fd0f7caec505daab2f03" protocol=ttrpc version=3
Sep 4 15:49:53.523140 kubelet[2559]: W0904 15:49:53.522556 2559 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.104:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused
Sep 4 15:49:53.523561 containerd[1644]: time="2025-09-04T15:49:53.522660908Z" level=info msg="StartContainer for \"e3969df529fe21335dd80478ae77e903880a6039d5c118fc5dcfe3e8739d229e\""
Sep 4 15:49:53.523561 containerd[1644]: time="2025-09-04T15:49:53.523247285Z" level=info msg="connecting to shim e3969df529fe21335dd80478ae77e903880a6039d5c118fc5dcfe3e8739d229e" address="unix:///run/containerd/s/6ff4bc2e98c247a8a8e0bb7ee4e31c8bdb2297091216c5dadb46948ac56d584d" protocol=ttrpc version=3
Sep 4 15:49:53.523561 containerd[1644]: time="2025-09-04T15:49:53.523411168Z" level=info msg="StartContainer for \"c6795915587de7718fae8b66ea3d6402b93874fb92d0c43c9c1f3a6b0389cb58\""
Sep 4 15:49:53.523625 kubelet[2559]: E0904 15:49:53.523147 2559 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.104:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.104:6443: connect: connection refused" logger="UnhandledError"
Sep 4 15:49:53.524579 containerd[1644]: time="2025-09-04T15:49:53.524566795Z" level=info msg="connecting to shim c6795915587de7718fae8b66ea3d6402b93874fb92d0c43c9c1f3a6b0389cb58" address="unix:///run/containerd/s/9c153e442701a7985da901aad8ab6d1031a03be9d17a2bddd7950ffe0919aa5e" protocol=ttrpc version=3
Sep 4 15:49:53.540305 systemd[1]: Started cri-containerd-c6795915587de7718fae8b66ea3d6402b93874fb92d0c43c9c1f3a6b0389cb58.scope - libcontainer container c6795915587de7718fae8b66ea3d6402b93874fb92d0c43c9c1f3a6b0389cb58.
Sep 4 15:49:53.548254 systemd[1]: Started cri-containerd-a390900af61ab3836d2250d6516766d484547da6a9167758e50d99e863d1062e.scope - libcontainer container a390900af61ab3836d2250d6516766d484547da6a9167758e50d99e863d1062e.
Sep 4 15:49:53.550313 systemd[1]: Started cri-containerd-e3969df529fe21335dd80478ae77e903880a6039d5c118fc5dcfe3e8739d229e.scope - libcontainer container e3969df529fe21335dd80478ae77e903880a6039d5c118fc5dcfe3e8739d229e.
Sep 4 15:49:53.593733 containerd[1644]: time="2025-09-04T15:49:53.593638147Z" level=info msg="StartContainer for \"e3969df529fe21335dd80478ae77e903880a6039d5c118fc5dcfe3e8739d229e\" returns successfully"
Sep 4 15:49:53.617254 containerd[1644]: time="2025-09-04T15:49:53.617182670Z" level=info msg="StartContainer for \"c6795915587de7718fae8b66ea3d6402b93874fb92d0c43c9c1f3a6b0389cb58\" returns successfully"
Sep 4 15:49:53.619868 containerd[1644]: time="2025-09-04T15:49:53.619839543Z" level=info msg="StartContainer for \"a390900af61ab3836d2250d6516766d484547da6a9167758e50d99e863d1062e\" returns successfully"
Sep 4 15:49:53.804661 kubelet[2559]: W0904 15:49:53.804613 2559 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.104:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused
Sep 4 15:49:53.804661 kubelet[2559]: E0904 15:49:53.804661 2559 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.104:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.104:6443: connect: connection refused" logger="UnhandledError"
Sep 4 15:49:53.973615 kubelet[2559]: E0904 15:49:53.973579 2559 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.104:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.104:6443: connect: connection refused" interval="1.6s"
Sep 4 15:49:54.039207 kubelet[2559]: W0904 15:49:54.039166 2559 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.104:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused
Sep 4 15:49:54.039288 kubelet[2559]: E0904 15:49:54.039212 2559 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.104:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.104:6443: connect: connection refused" logger="UnhandledError"
Sep 4 15:49:54.136608 kubelet[2559]: I0904 15:49:54.136585 2559 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Sep 4 15:49:54.828809 kubelet[2559]: I0904 15:49:54.828677 2559 kubelet_node_status.go:75] "Successfully registered node" node="localhost"
Sep 4 15:49:54.828809 kubelet[2559]: E0904 15:49:54.828706 2559 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found"
Sep 4 15:49:54.837973 kubelet[2559]: E0904 15:49:54.836596 2559 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 4 15:49:54.938255 kubelet[2559]: E0904 15:49:54.938220 2559 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 4 15:49:55.038938 kubelet[2559]: E0904 15:49:55.038872 2559 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 4 15:49:55.139490 kubelet[2559]: E0904 15:49:55.139412 2559 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 4 15:49:55.240170 kubelet[2559]: E0904 15:49:55.240139 2559 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 4 15:49:55.340598 kubelet[2559]: E0904 15:49:55.340563 2559 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 4 15:49:55.441157 kubelet[2559]: E0904 15:49:55.441086 2559 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 4 15:49:55.541823 kubelet[2559]: E0904 15:49:55.541788 2559 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 4 15:49:55.642778 kubelet[2559]: E0904 15:49:55.642734 2559 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 4 15:49:56.555450 kubelet[2559]: I0904 15:49:56.555419 2559 apiserver.go:52] "Watching apiserver"
Sep 4 15:49:56.568951 kubelet[2559]: I0904 15:49:56.568921 2559 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Sep 4 15:49:57.056298 systemd[1]: Reload requested from client PID 2833 ('systemctl') (unit session-9.scope)...
Sep 4 15:49:57.056312 systemd[1]: Reloading...
Sep 4 15:49:57.114163 zram_generator::config[2879]: No configuration found.
Sep 4 15:49:57.192167 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Sep 4 15:49:57.268353 systemd[1]: Reloading finished in 211 ms.
Sep 4 15:49:57.286860 kubelet[2559]: I0904 15:49:57.286825 2559 dynamic_cafile_content.go:174] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 4 15:49:57.286999 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 4 15:49:57.300317 systemd[1]: kubelet.service: Deactivated successfully.
Sep 4 15:49:57.300519 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 4 15:49:57.300554 systemd[1]: kubelet.service: Consumed 790ms CPU time, 125.4M memory peak.
Sep 4 15:49:57.302292 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 4 15:49:57.918103 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 4 15:49:57.928751 (kubelet)[2944]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 4 15:49:57.985228 kubelet[2944]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 4 15:49:57.985446 kubelet[2944]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Sep 4 15:49:57.985475 kubelet[2944]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 4 15:49:57.985562 kubelet[2944]: I0904 15:49:57.985545 2944 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 4 15:49:57.989447 kubelet[2944]: I0904 15:49:57.989433 2944 server.go:491] "Kubelet version" kubeletVersion="v1.31.8"
Sep 4 15:49:57.989549 kubelet[2944]: I0904 15:49:57.989543 2944 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 4 15:49:57.989717 kubelet[2944]: I0904 15:49:57.989710 2944 server.go:934] "Client rotation is on, will bootstrap in background"
Sep 4 15:49:57.990524 kubelet[2944]: I0904 15:49:57.990516 2944 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Sep 4 15:49:57.995437 kubelet[2944]: I0904 15:49:57.995419 2944 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 4 15:49:57.997535 kubelet[2944]: I0904 15:49:57.997525 2944 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 4 15:49:57.999858 kubelet[2944]: I0904 15:49:57.999842 2944 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 4 15:49:57.999932 kubelet[2944]: I0904 15:49:57.999908 2944 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 4 15:49:57.999980 kubelet[2944]: I0904 15:49:57.999967 2944 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 4 15:49:58.000168 kubelet[2944]: I0904 15:49:57.999981 2944 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 4 15:49:58.000241 kubelet[2944]: I0904 15:49:58.000174 2944 topology_manager.go:138] "Creating topology manager with none policy"
Sep 4 15:49:58.000241 kubelet[2944]: I0904 15:49:58.000180 2944 container_manager_linux.go:300] "Creating device plugin manager"
Sep 4 15:49:58.000241 kubelet[2944]: I0904 15:49:58.000197 2944 state_mem.go:36] "Initialized new in-memory state store"
Sep 4 15:49:58.000290 kubelet[2944]: I0904 15:49:58.000255 2944 kubelet.go:408] "Attempting to sync node with API server"
Sep 4 15:49:58.000290 kubelet[2944]: I0904 15:49:58.000262 2944 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 4 15:49:58.000290 kubelet[2944]: I0904 15:49:58.000279 2944 kubelet.go:314] "Adding apiserver pod source"
Sep 4 15:49:58.000290 kubelet[2944]: I0904 15:49:58.000285 2944 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 4 15:49:58.003136 kubelet[2944]: I0904 15:49:58.001704 2944 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 4 15:49:58.003136 kubelet[2944]: I0904 15:49:58.002030 2944 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 4 15:49:58.003136 kubelet[2944]: I0904 15:49:58.002328 2944 server.go:1274] "Started kubelet"
Sep 4 15:49:58.003976 kubelet[2944]: I0904 15:49:58.003952 2944 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Sep 4 15:49:58.004125 kubelet[2944]: I0904 15:49:58.004011 2944 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 4 15:49:58.004264 kubelet[2944]: I0904 15:49:58.004253 2944 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 4 15:49:58.005015 kubelet[2944]: I0904 15:49:58.005006 2944 server.go:449] "Adding debug handlers to kubelet server"
Sep 4 15:49:58.005931 kubelet[2944]: I0904 15:49:58.005920 2944 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 4 15:49:58.007516 kubelet[2944]: I0904 15:49:58.007505 2944 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 4 15:49:58.017638 kubelet[2944]: I0904 15:49:58.017616 2944 volume_manager.go:289] "Starting Kubelet Volume Manager"
Sep 4 15:49:58.017766 kubelet[2944]: E0904 15:49:58.017752 2944 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 4 15:49:58.018271 kubelet[2944]: I0904 15:49:58.018263 2944 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Sep 4 15:49:58.019371 kubelet[2944]: I0904 15:49:58.019354 2944 reconciler.go:26] "Reconciler: start to sync state"
Sep 4 15:49:58.020396 kubelet[2944]: I0904 15:49:58.020294 2944 factory.go:221] Registration of the systemd container factory successfully
Sep 4 15:49:58.020541 kubelet[2944]: I0904 15:49:58.020520 2944 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 4 15:49:58.022136 kubelet[2944]: I0904 15:49:58.022125 2944 factory.go:221] Registration of the containerd container factory successfully
Sep 4 15:49:58.023135 kubelet[2944]: I0904 15:49:58.023094 2944 kubelet_network_linux.go:50] "Initialized iptables rules."
protocol="IPv4"
Sep 4 15:49:58.024860 kubelet[2944]: I0904 15:49:58.024844 2944 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 4 15:49:58.024860 kubelet[2944]: I0904 15:49:58.024859 2944 status_manager.go:217] "Starting to sync pod status with apiserver"
Sep 4 15:49:58.024935 kubelet[2944]: I0904 15:49:58.024870 2944 kubelet.go:2321] "Starting kubelet main sync loop"
Sep 4 15:49:58.024935 kubelet[2944]: E0904 15:49:58.024894 2944 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 4 15:49:58.059308 kubelet[2944]: I0904 15:49:58.059243 2944 cpu_manager.go:214] "Starting CPU manager" policy="none"
Sep 4 15:49:58.059308 kubelet[2944]: I0904 15:49:58.059255 2944 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Sep 4 15:49:58.059308 kubelet[2944]: I0904 15:49:58.059266 2944 state_mem.go:36] "Initialized new in-memory state store"
Sep 4 15:49:58.059531 kubelet[2944]: I0904 15:49:58.059367 2944 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Sep 4 15:49:58.059531 kubelet[2944]: I0904 15:49:58.059374 2944 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Sep 4 15:49:58.059531 kubelet[2944]: I0904 15:49:58.059387 2944 policy_none.go:49] "None policy: Start"
Sep 4 15:49:58.059761 kubelet[2944]: I0904 15:49:58.059737 2944 memory_manager.go:170] "Starting memorymanager" policy="None"
Sep 4 15:49:58.059761 kubelet[2944]: I0904 15:49:58.059752 2944 state_mem.go:35] "Initializing new in-memory state store"
Sep 4 15:49:58.059898 kubelet[2944]: I0904 15:49:58.059886 2944 state_mem.go:75] "Updated machine memory state"
Sep 4 15:49:58.063393 kubelet[2944]: I0904 15:49:58.062787 2944 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Sep 4 15:49:58.063393 kubelet[2944]: I0904 15:49:58.062886 2944 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 4 15:49:58.063393 kubelet[2944]: I0904 15:49:58.062892 2944 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 4 15:49:58.065436 kubelet[2944]: I0904 15:49:58.065111 2944 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 4 15:49:58.164106 kubelet[2944]: I0904 15:49:58.164083 2944 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Sep 4 15:49:58.168252 kubelet[2944]: I0904 15:49:58.168181 2944 kubelet_node_status.go:111] "Node was previously registered" node="localhost"
Sep 4 15:49:58.168328 kubelet[2944]: I0904 15:49:58.168263 2944 kubelet_node_status.go:75] "Successfully registered node" node="localhost"
Sep 4 15:49:58.220086 kubelet[2944]: I0904 15:49:58.220061 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5dc878868de11c6196259ae42039f4ff-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"5dc878868de11c6196259ae42039f4ff\") " pod="kube-system/kube-scheduler-localhost"
Sep 4 15:49:58.220086 kubelet[2944]: I0904 15:49:58.220085 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/44e749fee0a39cea49d1cbc530d7723b-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"44e749fee0a39cea49d1cbc530d7723b\") " pod="kube-system/kube-apiserver-localhost"
Sep 4 15:49:58.220249 kubelet[2944]: I0904 15:49:58.220100 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/44e749fee0a39cea49d1cbc530d7723b-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"44e749fee0a39cea49d1cbc530d7723b\") " pod="kube-system/kube-apiserver-localhost"
Sep 4 15:49:58.220249 kubelet[2944]: I0904 15:49:58.220110 2944 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost"
Sep 4 15:49:58.220249 kubelet[2944]: I0904 15:49:58.220136 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost"
Sep 4 15:49:58.220249 kubelet[2944]: I0904 15:49:58.220150 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/44e749fee0a39cea49d1cbc530d7723b-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"44e749fee0a39cea49d1cbc530d7723b\") " pod="kube-system/kube-apiserver-localhost"
Sep 4 15:49:58.220249 kubelet[2944]: I0904 15:49:58.220164 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost"
Sep 4 15:49:58.220335 kubelet[2944]: I0904 15:49:58.220173 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost"
Sep 4 15:49:58.220335 kubelet[2944]: I0904 15:49:58.220182 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost"
Sep 4 15:49:59.001593 kubelet[2944]: I0904 15:49:59.001570 2944 apiserver.go:52] "Watching apiserver"
Sep 4 15:49:59.020024 kubelet[2944]: I0904 15:49:59.019997 2944 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Sep 4 15:49:59.055473 kubelet[2944]: I0904 15:49:59.055409 2944 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.055393557 podStartE2EDuration="1.055393557s" podCreationTimestamp="2025-09-04 15:49:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 15:49:59.055276583 +0000 UTC m=+1.119655577" watchObservedRunningTime="2025-09-04 15:49:59.055393557 +0000 UTC m=+1.119772547"
Sep 4 15:49:59.063132 kubelet[2944]: I0904 15:49:59.063045 2944 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.063033161 podStartE2EDuration="1.063033161s" podCreationTimestamp="2025-09-04 15:49:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 15:49:59.059070751 +0000 UTC m=+1.123449748" watchObservedRunningTime="2025-09-04 15:49:59.063033161 +0000 UTC m=+1.127412158"
Sep 4 15:49:59.066915 kubelet[2944]: I0904 15:49:59.066878 2944 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.066867535 podStartE2EDuration="1.066867535s" podCreationTimestamp="2025-09-04 15:49:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 15:49:59.063328536 +0000 UTC m=+1.127707530" watchObservedRunningTime="2025-09-04 15:49:59.066867535 +0000 UTC m=+1.131246528"
Sep 4 15:50:02.250383 kubelet[2944]: I0904 15:50:02.250359 2944 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Sep 4 15:50:02.250882 containerd[1644]: time="2025-09-04T15:50:02.250822101Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Sep 4 15:50:02.251112 kubelet[2944]: I0904 15:50:02.250929 2944 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Sep 4 15:50:03.045736 systemd[1]: Created slice kubepods-besteffort-podd0b0e982_427f_4f85_bfc2_dcb3eacbe934.slice - libcontainer container kubepods-besteffort-podd0b0e982_427f_4f85_bfc2_dcb3eacbe934.slice.
Sep 4 15:50:03.050647 kubelet[2944]: I0904 15:50:03.050561 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/d0b0e982-427f-4f85-bfc2-dcb3eacbe934-kube-proxy\") pod \"kube-proxy-46x6b\" (UID: \"d0b0e982-427f-4f85-bfc2-dcb3eacbe934\") " pod="kube-system/kube-proxy-46x6b"
Sep 4 15:50:03.050647 kubelet[2944]: I0904 15:50:03.050586 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d0b0e982-427f-4f85-bfc2-dcb3eacbe934-xtables-lock\") pod \"kube-proxy-46x6b\" (UID: \"d0b0e982-427f-4f85-bfc2-dcb3eacbe934\") " pod="kube-system/kube-proxy-46x6b"
Sep 4 15:50:03.050647 kubelet[2944]: I0904 15:50:03.050597 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s72ll\" (UniqueName: \"kubernetes.io/projected/d0b0e982-427f-4f85-bfc2-dcb3eacbe934-kube-api-access-s72ll\") pod \"kube-proxy-46x6b\" (UID: \"d0b0e982-427f-4f85-bfc2-dcb3eacbe934\") " pod="kube-system/kube-proxy-46x6b"
Sep 4 15:50:03.050647 kubelet[2944]: I0904 15:50:03.050608 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d0b0e982-427f-4f85-bfc2-dcb3eacbe934-lib-modules\") pod \"kube-proxy-46x6b\" (UID: \"d0b0e982-427f-4f85-bfc2-dcb3eacbe934\") " pod="kube-system/kube-proxy-46x6b"
Sep 4 15:50:03.320546 systemd[1]: Created slice kubepods-besteffort-pod2e64fef4_a130_4692_9b18_4ddacb28b8d5.slice - libcontainer container kubepods-besteffort-pod2e64fef4_a130_4692_9b18_4ddacb28b8d5.slice.
Sep 4 15:50:03.352332 kubelet[2944]: I0904 15:50:03.352265 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z9nf\" (UniqueName: \"kubernetes.io/projected/2e64fef4-a130-4692-9b18-4ddacb28b8d5-kube-api-access-7z9nf\") pod \"tigera-operator-58fc44c59b-2c62x\" (UID: \"2e64fef4-a130-4692-9b18-4ddacb28b8d5\") " pod="tigera-operator/tigera-operator-58fc44c59b-2c62x"
Sep 4 15:50:03.352332 kubelet[2944]: I0904 15:50:03.352293 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/2e64fef4-a130-4692-9b18-4ddacb28b8d5-var-lib-calico\") pod \"tigera-operator-58fc44c59b-2c62x\" (UID: \"2e64fef4-a130-4692-9b18-4ddacb28b8d5\") " pod="tigera-operator/tigera-operator-58fc44c59b-2c62x"
Sep 4 15:50:03.354859 containerd[1644]: time="2025-09-04T15:50:03.354837121Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-46x6b,Uid:d0b0e982-427f-4f85-bfc2-dcb3eacbe934,Namespace:kube-system,Attempt:0,}"
Sep 4 15:50:03.434397 containerd[1644]: time="2025-09-04T15:50:03.434349162Z" level=info msg="connecting to shim 67752016fffc970c649dcdac1922da40d5d9dcd5d3fd79023bed89de9a3d13fd" address="unix:///run/containerd/s/5558ea0f868032de826b8ee84f80095aaf82b56b996fe11aaa8c4415f690b8a8" namespace=k8s.io protocol=ttrpc version=3
Sep 4 15:50:03.454371 systemd[1]: Started cri-containerd-67752016fffc970c649dcdac1922da40d5d9dcd5d3fd79023bed89de9a3d13fd.scope - libcontainer container 67752016fffc970c649dcdac1922da40d5d9dcd5d3fd79023bed89de9a3d13fd.
Sep 4 15:50:03.595729 containerd[1644]: time="2025-09-04T15:50:03.595614501Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-46x6b,Uid:d0b0e982-427f-4f85-bfc2-dcb3eacbe934,Namespace:kube-system,Attempt:0,} returns sandbox id \"67752016fffc970c649dcdac1922da40d5d9dcd5d3fd79023bed89de9a3d13fd\""
Sep 4 15:50:03.598918 containerd[1644]: time="2025-09-04T15:50:03.598816557Z" level=info msg="CreateContainer within sandbox \"67752016fffc970c649dcdac1922da40d5d9dcd5d3fd79023bed89de9a3d13fd\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Sep 4 15:50:03.623955 containerd[1644]: time="2025-09-04T15:50:03.623921439Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-2c62x,Uid:2e64fef4-a130-4692-9b18-4ddacb28b8d5,Namespace:tigera-operator,Attempt:0,}"
Sep 4 15:50:03.643215 containerd[1644]: time="2025-09-04T15:50:03.643186516Z" level=info msg="Container deaf702696743732c5d7ac688c51805bfa4cff2d38158ddd85fbcd35ef4a8f92: CDI devices from CRI Config.CDIDevices: []"
Sep 4 15:50:03.649141 containerd[1644]: time="2025-09-04T15:50:03.648950714Z" level=info msg="CreateContainer within sandbox \"67752016fffc970c649dcdac1922da40d5d9dcd5d3fd79023bed89de9a3d13fd\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"deaf702696743732c5d7ac688c51805bfa4cff2d38158ddd85fbcd35ef4a8f92\""
Sep 4 15:50:03.650499 containerd[1644]: time="2025-09-04T15:50:03.649675022Z" level=info msg="StartContainer for \"deaf702696743732c5d7ac688c51805bfa4cff2d38158ddd85fbcd35ef4a8f92\""
Sep 4 15:50:03.652593 containerd[1644]: time="2025-09-04T15:50:03.652539251Z" level=info msg="connecting to shim deaf702696743732c5d7ac688c51805bfa4cff2d38158ddd85fbcd35ef4a8f92" address="unix:///run/containerd/s/5558ea0f868032de826b8ee84f80095aaf82b56b996fe11aaa8c4415f690b8a8" protocol=ttrpc version=3
Sep 4 15:50:03.658129 containerd[1644]: time="2025-09-04T15:50:03.657931043Z" level=info msg="connecting to shim 3670364e2f9b3c161c314256ecb4378e41fc556695cc93000a5c00bc3619feb2" address="unix:///run/containerd/s/0511c5ab6ac79699d82b5894c0801abdb7cd31342ac48daf7e35ee6f6040c914" namespace=k8s.io protocol=ttrpc version=3
Sep 4 15:50:03.673384 systemd[1]: Started cri-containerd-deaf702696743732c5d7ac688c51805bfa4cff2d38158ddd85fbcd35ef4a8f92.scope - libcontainer container deaf702696743732c5d7ac688c51805bfa4cff2d38158ddd85fbcd35ef4a8f92.
Sep 4 15:50:03.681252 systemd[1]: Started cri-containerd-3670364e2f9b3c161c314256ecb4378e41fc556695cc93000a5c00bc3619feb2.scope - libcontainer container 3670364e2f9b3c161c314256ecb4378e41fc556695cc93000a5c00bc3619feb2.
Sep 4 15:50:03.722892 containerd[1644]: time="2025-09-04T15:50:03.722867910Z" level=info msg="StartContainer for \"deaf702696743732c5d7ac688c51805bfa4cff2d38158ddd85fbcd35ef4a8f92\" returns successfully"
Sep 4 15:50:03.739306 containerd[1644]: time="2025-09-04T15:50:03.739280422Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-2c62x,Uid:2e64fef4-a130-4692-9b18-4ddacb28b8d5,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"3670364e2f9b3c161c314256ecb4378e41fc556695cc93000a5c00bc3619feb2\""
Sep 4 15:50:03.740579 containerd[1644]: time="2025-09-04T15:50:03.740563931Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\""
Sep 4 15:50:04.068051 kubelet[2944]: I0904 15:50:04.067873 2944 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-46x6b" podStartSLOduration=1.067859971 podStartE2EDuration="1.067859971s" podCreationTimestamp="2025-09-04 15:50:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 15:50:04.067796899 +0000 UTC m=+6.132175901" watchObservedRunningTime="2025-09-04 15:50:04.067859971 +0000 UTC m=+6.132238973"
Sep 4 15:50:04.161651 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2161228325.mount: Deactivated successfully.
Sep 4 15:50:06.288454 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount897108735.mount: Deactivated successfully.
Sep 4 15:50:06.857537 containerd[1644]: time="2025-09-04T15:50:06.857464315Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:50:06.857973 containerd[1644]: time="2025-09-04T15:50:06.857947091Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609"
Sep 4 15:50:06.858820 containerd[1644]: time="2025-09-04T15:50:06.858192957Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:50:06.859289 containerd[1644]: time="2025-09-04T15:50:06.859271657Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:50:06.859735 containerd[1644]: time="2025-09-04T15:50:06.859720993Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 3.11914047s"
Sep 4 15:50:06.859784 containerd[1644]: time="2025-09-04T15:50:06.859775288Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\""
Sep 4 15:50:06.861242 containerd[1644]: time="2025-09-04T15:50:06.861228812Z" level=info msg="CreateContainer within sandbox \"3670364e2f9b3c161c314256ecb4378e41fc556695cc93000a5c00bc3619feb2\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Sep 4 15:50:06.866481 containerd[1644]: time="2025-09-04T15:50:06.866455393Z" level=info msg="Container 9e611ef48e03f51299325ed9111632d95624ef07db202cc6df5f1335eccb56e6: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:50:06.869734 containerd[1644]: time="2025-09-04T15:50:06.869712404Z" level=info msg="CreateContainer within sandbox \"3670364e2f9b3c161c314256ecb4378e41fc556695cc93000a5c00bc3619feb2\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"9e611ef48e03f51299325ed9111632d95624ef07db202cc6df5f1335eccb56e6\"" Sep 4 15:50:06.870540 containerd[1644]: time="2025-09-04T15:50:06.870523335Z" level=info msg="StartContainer for \"9e611ef48e03f51299325ed9111632d95624ef07db202cc6df5f1335eccb56e6\"" Sep 4 15:50:06.871827 containerd[1644]: time="2025-09-04T15:50:06.871809483Z" level=info msg="connecting to shim 9e611ef48e03f51299325ed9111632d95624ef07db202cc6df5f1335eccb56e6" address="unix:///run/containerd/s/0511c5ab6ac79699d82b5894c0801abdb7cd31342ac48daf7e35ee6f6040c914" protocol=ttrpc version=3 Sep 4 15:50:06.888220 systemd[1]: Started cri-containerd-9e611ef48e03f51299325ed9111632d95624ef07db202cc6df5f1335eccb56e6.scope - libcontainer container 9e611ef48e03f51299325ed9111632d95624ef07db202cc6df5f1335eccb56e6. 
Sep 4 15:50:06.906192 containerd[1644]: time="2025-09-04T15:50:06.906165235Z" level=info msg="StartContainer for \"9e611ef48e03f51299325ed9111632d95624ef07db202cc6df5f1335eccb56e6\" returns successfully" Sep 4 15:50:08.106718 kubelet[2944]: I0904 15:50:08.105679 2944 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-58fc44c59b-2c62x" podStartSLOduration=1.98557395 podStartE2EDuration="5.10566907s" podCreationTimestamp="2025-09-04 15:50:03 +0000 UTC" firstStartedPulling="2025-09-04 15:50:03.740147933 +0000 UTC m=+5.804526929" lastFinishedPulling="2025-09-04 15:50:06.860243058 +0000 UTC m=+8.924622049" observedRunningTime="2025-09-04 15:50:07.067428446 +0000 UTC m=+9.131807444" watchObservedRunningTime="2025-09-04 15:50:08.10566907 +0000 UTC m=+10.170048074" Sep 4 15:50:12.486788 sudo[1963]: pam_unix(sudo:session): session closed for user root Sep 4 15:50:12.488488 sshd[1962]: Connection closed by 139.178.89.65 port 38174 Sep 4 15:50:12.488436 sshd-session[1959]: pam_unix(sshd:session): session closed for user core Sep 4 15:50:12.491391 systemd[1]: sshd@6-139.178.70.104:22-139.178.89.65:38174.service: Deactivated successfully. Sep 4 15:50:12.494917 systemd[1]: session-9.scope: Deactivated successfully. Sep 4 15:50:12.496576 systemd[1]: session-9.scope: Consumed 3.347s CPU time, 150M memory peak. Sep 4 15:50:12.503017 systemd-logind[1618]: Session 9 logged out. Waiting for processes to exit. Sep 4 15:50:12.504339 systemd-logind[1618]: Removed session 9. Sep 4 15:50:15.178462 systemd[1]: Created slice kubepods-besteffort-pod2d1d5a03_31a8_4652_81e7_2f149495b1f9.slice - libcontainer container kubepods-besteffort-pod2d1d5a03_31a8_4652_81e7_2f149495b1f9.slice. 
Sep 4 15:50:15.261029 kubelet[2944]: I0904 15:50:15.261002 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d1d5a03-31a8-4652-81e7-2f149495b1f9-tigera-ca-bundle\") pod \"calico-typha-c8598bfbf-mvt6z\" (UID: \"2d1d5a03-31a8-4652-81e7-2f149495b1f9\") " pod="calico-system/calico-typha-c8598bfbf-mvt6z" Sep 4 15:50:15.261280 kubelet[2944]: I0904 15:50:15.261046 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/2d1d5a03-31a8-4652-81e7-2f149495b1f9-typha-certs\") pod \"calico-typha-c8598bfbf-mvt6z\" (UID: \"2d1d5a03-31a8-4652-81e7-2f149495b1f9\") " pod="calico-system/calico-typha-c8598bfbf-mvt6z" Sep 4 15:50:15.261280 kubelet[2944]: I0904 15:50:15.261058 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df72n\" (UniqueName: \"kubernetes.io/projected/2d1d5a03-31a8-4652-81e7-2f149495b1f9-kube-api-access-df72n\") pod \"calico-typha-c8598bfbf-mvt6z\" (UID: \"2d1d5a03-31a8-4652-81e7-2f149495b1f9\") " pod="calico-system/calico-typha-c8598bfbf-mvt6z" Sep 4 15:50:15.477140 systemd[1]: Created slice kubepods-besteffort-pod63b9fc28_8fe7_4e48_9c71_528960082fe7.slice - libcontainer container kubepods-besteffort-pod63b9fc28_8fe7_4e48_9c71_528960082fe7.slice. 
Sep 4 15:50:15.484659 containerd[1644]: time="2025-09-04T15:50:15.484632755Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-c8598bfbf-mvt6z,Uid:2d1d5a03-31a8-4652-81e7-2f149495b1f9,Namespace:calico-system,Attempt:0,}" Sep 4 15:50:15.562449 kubelet[2944]: I0904 15:50:15.562404 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/63b9fc28-8fe7-4e48-9c71-528960082fe7-flexvol-driver-host\") pod \"calico-node-dvtfk\" (UID: \"63b9fc28-8fe7-4e48-9c71-528960082fe7\") " pod="calico-system/calico-node-dvtfk" Sep 4 15:50:15.562573 kubelet[2944]: I0904 15:50:15.562478 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/63b9fc28-8fe7-4e48-9c71-528960082fe7-cni-bin-dir\") pod \"calico-node-dvtfk\" (UID: \"63b9fc28-8fe7-4e48-9c71-528960082fe7\") " pod="calico-system/calico-node-dvtfk" Sep 4 15:50:15.562573 kubelet[2944]: I0904 15:50:15.562495 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/63b9fc28-8fe7-4e48-9c71-528960082fe7-var-run-calico\") pod \"calico-node-dvtfk\" (UID: \"63b9fc28-8fe7-4e48-9c71-528960082fe7\") " pod="calico-system/calico-node-dvtfk" Sep 4 15:50:15.562573 kubelet[2944]: I0904 15:50:15.562536 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/63b9fc28-8fe7-4e48-9c71-528960082fe7-cni-log-dir\") pod \"calico-node-dvtfk\" (UID: \"63b9fc28-8fe7-4e48-9c71-528960082fe7\") " pod="calico-system/calico-node-dvtfk" Sep 4 15:50:15.562573 kubelet[2944]: I0904 15:50:15.562553 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: 
\"kubernetes.io/host-path/63b9fc28-8fe7-4e48-9c71-528960082fe7-cni-net-dir\") pod \"calico-node-dvtfk\" (UID: \"63b9fc28-8fe7-4e48-9c71-528960082fe7\") " pod="calico-system/calico-node-dvtfk" Sep 4 15:50:15.562699 kubelet[2944]: I0904 15:50:15.562673 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/63b9fc28-8fe7-4e48-9c71-528960082fe7-policysync\") pod \"calico-node-dvtfk\" (UID: \"63b9fc28-8fe7-4e48-9c71-528960082fe7\") " pod="calico-system/calico-node-dvtfk" Sep 4 15:50:15.562699 kubelet[2944]: I0904 15:50:15.562688 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/63b9fc28-8fe7-4e48-9c71-528960082fe7-lib-modules\") pod \"calico-node-dvtfk\" (UID: \"63b9fc28-8fe7-4e48-9c71-528960082fe7\") " pod="calico-system/calico-node-dvtfk" Sep 4 15:50:15.562746 kubelet[2944]: I0904 15:50:15.562700 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/63b9fc28-8fe7-4e48-9c71-528960082fe7-node-certs\") pod \"calico-node-dvtfk\" (UID: \"63b9fc28-8fe7-4e48-9c71-528960082fe7\") " pod="calico-system/calico-node-dvtfk" Sep 4 15:50:15.562770 kubelet[2944]: I0904 15:50:15.562711 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/63b9fc28-8fe7-4e48-9c71-528960082fe7-tigera-ca-bundle\") pod \"calico-node-dvtfk\" (UID: \"63b9fc28-8fe7-4e48-9c71-528960082fe7\") " pod="calico-system/calico-node-dvtfk" Sep 4 15:50:15.562797 kubelet[2944]: I0904 15:50:15.562769 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/63b9fc28-8fe7-4e48-9c71-528960082fe7-xtables-lock\") pod 
\"calico-node-dvtfk\" (UID: \"63b9fc28-8fe7-4e48-9c71-528960082fe7\") " pod="calico-system/calico-node-dvtfk" Sep 4 15:50:15.562797 kubelet[2944]: I0904 15:50:15.562782 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqjb6\" (UniqueName: \"kubernetes.io/projected/63b9fc28-8fe7-4e48-9c71-528960082fe7-kube-api-access-wqjb6\") pod \"calico-node-dvtfk\" (UID: \"63b9fc28-8fe7-4e48-9c71-528960082fe7\") " pod="calico-system/calico-node-dvtfk" Sep 4 15:50:15.562842 kubelet[2944]: I0904 15:50:15.562803 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/63b9fc28-8fe7-4e48-9c71-528960082fe7-var-lib-calico\") pod \"calico-node-dvtfk\" (UID: \"63b9fc28-8fe7-4e48-9c71-528960082fe7\") " pod="calico-system/calico-node-dvtfk" Sep 4 15:50:15.596945 containerd[1644]: time="2025-09-04T15:50:15.596874499Z" level=info msg="connecting to shim 407b56d631cdbcf757f2cf2e7dee8311bf23dd0bf662edb3cfe0bbebca8bc31f" address="unix:///run/containerd/s/66e27912d7bb9735e3a06c82c051d99aae3f3db3f20ef9cc67b459887f5a6e26" namespace=k8s.io protocol=ttrpc version=3 Sep 4 15:50:15.622228 systemd[1]: Started cri-containerd-407b56d631cdbcf757f2cf2e7dee8311bf23dd0bf662edb3cfe0bbebca8bc31f.scope - libcontainer container 407b56d631cdbcf757f2cf2e7dee8311bf23dd0bf662edb3cfe0bbebca8bc31f. 
Sep 4 15:50:15.671338 kubelet[2944]: E0904 15:50:15.671297 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.671338 kubelet[2944]: W0904 15:50:15.671310 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.671338 kubelet[2944]: E0904 15:50:15.671322 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:15.765277 kubelet[2944]: E0904 15:50:15.765205 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.765277 kubelet[2944]: W0904 15:50:15.765224 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.765277 kubelet[2944]: E0904 15:50:15.765240 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:15.798392 kubelet[2944]: E0904 15:50:15.798359 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.798392 kubelet[2944]: W0904 15:50:15.798376 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.798392 kubelet[2944]: E0904 15:50:15.798393 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:15.834890 containerd[1644]: time="2025-09-04T15:50:15.834853997Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-c8598bfbf-mvt6z,Uid:2d1d5a03-31a8-4652-81e7-2f149495b1f9,Namespace:calico-system,Attempt:0,} returns sandbox id \"407b56d631cdbcf757f2cf2e7dee8311bf23dd0bf662edb3cfe0bbebca8bc31f\"" Sep 4 15:50:15.836969 containerd[1644]: time="2025-09-04T15:50:15.836912874Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 4 15:50:15.844278 kubelet[2944]: E0904 15:50:15.843986 2944 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x2d7v" podUID="983a0613-198a-42b5-9c35-64b2a2869e6c" Sep 4 15:50:15.924635 kubelet[2944]: E0904 15:50:15.924611 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.924635 kubelet[2944]: W0904 15:50:15.924628 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.924782 kubelet[2944]: E0904 15:50:15.924643 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:15.924806 kubelet[2944]: E0904 15:50:15.924788 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.924806 kubelet[2944]: W0904 15:50:15.924794 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.924806 kubelet[2944]: E0904 15:50:15.924799 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:15.924924 kubelet[2944]: E0904 15:50:15.924916 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.924948 kubelet[2944]: W0904 15:50:15.924924 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.924948 kubelet[2944]: E0904 15:50:15.924931 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:15.925047 kubelet[2944]: E0904 15:50:15.925036 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.925047 kubelet[2944]: W0904 15:50:15.925044 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.930899 kubelet[2944]: E0904 15:50:15.925051 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:15.930899 kubelet[2944]: E0904 15:50:15.925220 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.930899 kubelet[2944]: W0904 15:50:15.925226 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.930899 kubelet[2944]: E0904 15:50:15.925230 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:15.930899 kubelet[2944]: E0904 15:50:15.925316 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.930899 kubelet[2944]: W0904 15:50:15.925320 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.930899 kubelet[2944]: E0904 15:50:15.925325 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:15.930899 kubelet[2944]: E0904 15:50:15.925426 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.930899 kubelet[2944]: W0904 15:50:15.925431 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.930899 kubelet[2944]: E0904 15:50:15.925436 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:15.931575 kubelet[2944]: E0904 15:50:15.925554 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.931575 kubelet[2944]: W0904 15:50:15.925559 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.931575 kubelet[2944]: E0904 15:50:15.925564 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:15.931575 kubelet[2944]: E0904 15:50:15.925692 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.931575 kubelet[2944]: W0904 15:50:15.925698 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.931575 kubelet[2944]: E0904 15:50:15.925703 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:15.931575 kubelet[2944]: E0904 15:50:15.925794 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.931575 kubelet[2944]: W0904 15:50:15.925798 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.931575 kubelet[2944]: E0904 15:50:15.925803 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:15.931575 kubelet[2944]: E0904 15:50:15.925888 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.931751 kubelet[2944]: W0904 15:50:15.925892 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.931751 kubelet[2944]: E0904 15:50:15.925897 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:15.931751 kubelet[2944]: E0904 15:50:15.925990 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.931751 kubelet[2944]: W0904 15:50:15.925994 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.931751 kubelet[2944]: E0904 15:50:15.925999 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:15.931751 kubelet[2944]: E0904 15:50:15.926088 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.931751 kubelet[2944]: W0904 15:50:15.926092 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.931751 kubelet[2944]: E0904 15:50:15.926096 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:15.931751 kubelet[2944]: E0904 15:50:15.926201 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.931751 kubelet[2944]: W0904 15:50:15.926209 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.931912 kubelet[2944]: E0904 15:50:15.926214 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:15.931912 kubelet[2944]: E0904 15:50:15.926298 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.931912 kubelet[2944]: W0904 15:50:15.926302 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.931912 kubelet[2944]: E0904 15:50:15.926306 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:15.931912 kubelet[2944]: E0904 15:50:15.926395 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.931912 kubelet[2944]: W0904 15:50:15.926400 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.931912 kubelet[2944]: E0904 15:50:15.926405 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:15.931912 kubelet[2944]: E0904 15:50:15.926515 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.931912 kubelet[2944]: W0904 15:50:15.926519 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.931912 kubelet[2944]: E0904 15:50:15.926524 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:15.936779 kubelet[2944]: E0904 15:50:15.926629 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.936779 kubelet[2944]: W0904 15:50:15.926633 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.936779 kubelet[2944]: E0904 15:50:15.926637 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:15.936779 kubelet[2944]: E0904 15:50:15.926730 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.936779 kubelet[2944]: W0904 15:50:15.926735 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.936779 kubelet[2944]: E0904 15:50:15.926740 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:15.936779 kubelet[2944]: E0904 15:50:15.927176 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.936779 kubelet[2944]: W0904 15:50:15.927182 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.936779 kubelet[2944]: E0904 15:50:15.927187 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:15.967356 kubelet[2944]: E0904 15:50:15.967326 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.967356 kubelet[2944]: W0904 15:50:15.967347 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.967541 kubelet[2944]: E0904 15:50:15.967366 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:15.967541 kubelet[2944]: I0904 15:50:15.967392 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/983a0613-198a-42b5-9c35-64b2a2869e6c-socket-dir\") pod \"csi-node-driver-x2d7v\" (UID: \"983a0613-198a-42b5-9c35-64b2a2869e6c\") " pod="calico-system/csi-node-driver-x2d7v" Sep 4 15:50:15.967722 kubelet[2944]: E0904 15:50:15.967708 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.967722 kubelet[2944]: W0904 15:50:15.967719 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.967778 kubelet[2944]: E0904 15:50:15.967739 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:15.967778 kubelet[2944]: I0904 15:50:15.967755 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/983a0613-198a-42b5-9c35-64b2a2869e6c-varrun\") pod \"csi-node-driver-x2d7v\" (UID: \"983a0613-198a-42b5-9c35-64b2a2869e6c\") " pod="calico-system/csi-node-driver-x2d7v" Sep 4 15:50:15.967995 kubelet[2944]: E0904 15:50:15.967981 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.967995 kubelet[2944]: W0904 15:50:15.967991 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.968055 kubelet[2944]: E0904 15:50:15.968045 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:15.968088 kubelet[2944]: I0904 15:50:15.968061 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/983a0613-198a-42b5-9c35-64b2a2869e6c-kubelet-dir\") pod \"csi-node-driver-x2d7v\" (UID: \"983a0613-198a-42b5-9c35-64b2a2869e6c\") " pod="calico-system/csi-node-driver-x2d7v" Sep 4 15:50:15.968333 kubelet[2944]: E0904 15:50:15.968313 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.968333 kubelet[2944]: W0904 15:50:15.968324 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.968395 kubelet[2944]: E0904 15:50:15.968344 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:15.968498 kubelet[2944]: E0904 15:50:15.968486 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.968498 kubelet[2944]: W0904 15:50:15.968494 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.968559 kubelet[2944]: E0904 15:50:15.968501 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:15.968559 kubelet[2944]: I0904 15:50:15.968542 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8k9z\" (UniqueName: \"kubernetes.io/projected/983a0613-198a-42b5-9c35-64b2a2869e6c-kube-api-access-b8k9z\") pod \"csi-node-driver-x2d7v\" (UID: \"983a0613-198a-42b5-9c35-64b2a2869e6c\") " pod="calico-system/csi-node-driver-x2d7v" Sep 4 15:50:15.968677 kubelet[2944]: E0904 15:50:15.968664 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.968677 kubelet[2944]: W0904 15:50:15.968673 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.968830 kubelet[2944]: E0904 15:50:15.968683 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:15.968830 kubelet[2944]: E0904 15:50:15.968811 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.968830 kubelet[2944]: W0904 15:50:15.968816 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.968830 kubelet[2944]: E0904 15:50:15.968830 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:15.969027 kubelet[2944]: E0904 15:50:15.968978 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.969027 kubelet[2944]: W0904 15:50:15.968983 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.969027 kubelet[2944]: E0904 15:50:15.968995 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:15.969306 kubelet[2944]: E0904 15:50:15.969250 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.969306 kubelet[2944]: W0904 15:50:15.969280 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.969306 kubelet[2944]: E0904 15:50:15.969302 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:15.969990 kubelet[2944]: E0904 15:50:15.969982 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.970065 kubelet[2944]: W0904 15:50:15.969994 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.970065 kubelet[2944]: E0904 15:50:15.970030 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:15.970265 kubelet[2944]: E0904 15:50:15.970249 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.970318 kubelet[2944]: W0904 15:50:15.970262 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.970476 kubelet[2944]: E0904 15:50:15.970456 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:15.970583 kubelet[2944]: E0904 15:50:15.970568 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.970583 kubelet[2944]: W0904 15:50:15.970579 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.970628 kubelet[2944]: E0904 15:50:15.970591 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:15.970743 kubelet[2944]: I0904 15:50:15.970705 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/983a0613-198a-42b5-9c35-64b2a2869e6c-registration-dir\") pod \"csi-node-driver-x2d7v\" (UID: \"983a0613-198a-42b5-9c35-64b2a2869e6c\") " pod="calico-system/csi-node-driver-x2d7v" Sep 4 15:50:15.971035 kubelet[2944]: E0904 15:50:15.970815 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.971035 kubelet[2944]: W0904 15:50:15.970825 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.971035 kubelet[2944]: E0904 15:50:15.970835 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:15.971365 kubelet[2944]: E0904 15:50:15.971258 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.971365 kubelet[2944]: W0904 15:50:15.971264 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.971365 kubelet[2944]: E0904 15:50:15.971271 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:15.972080 kubelet[2944]: E0904 15:50:15.971957 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:15.972080 kubelet[2944]: W0904 15:50:15.971966 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:15.972080 kubelet[2944]: E0904 15:50:15.971978 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:16.072490 kubelet[2944]: E0904 15:50:16.071820 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:16.072490 kubelet[2944]: W0904 15:50:16.071839 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:16.072490 kubelet[2944]: E0904 15:50:16.071854 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:16.072490 kubelet[2944]: E0904 15:50:16.072145 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:16.072490 kubelet[2944]: W0904 15:50:16.072152 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:16.072490 kubelet[2944]: E0904 15:50:16.072160 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:16.072490 kubelet[2944]: E0904 15:50:16.072434 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:16.072490 kubelet[2944]: W0904 15:50:16.072442 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:16.072490 kubelet[2944]: E0904 15:50:16.072461 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:16.074892 kubelet[2944]: E0904 15:50:16.073762 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:16.074892 kubelet[2944]: W0904 15:50:16.073774 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:16.074892 kubelet[2944]: E0904 15:50:16.073784 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:16.074892 kubelet[2944]: E0904 15:50:16.074402 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:16.074892 kubelet[2944]: W0904 15:50:16.074410 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:16.074892 kubelet[2944]: E0904 15:50:16.074431 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:16.075498 kubelet[2944]: E0904 15:50:16.074922 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:16.075498 kubelet[2944]: W0904 15:50:16.074930 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:16.075498 kubelet[2944]: E0904 15:50:16.075221 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:16.075933 kubelet[2944]: E0904 15:50:16.075917 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:16.075933 kubelet[2944]: W0904 15:50:16.075931 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:16.077247 kubelet[2944]: E0904 15:50:16.076640 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:16.077437 kubelet[2944]: E0904 15:50:16.077420 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:16.077437 kubelet[2944]: W0904 15:50:16.077431 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:16.077647 kubelet[2944]: E0904 15:50:16.077594 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:16.077812 kubelet[2944]: E0904 15:50:16.077799 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:16.077812 kubelet[2944]: W0904 15:50:16.077808 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:16.077973 kubelet[2944]: E0904 15:50:16.077831 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:16.078333 kubelet[2944]: E0904 15:50:16.078319 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:16.078333 kubelet[2944]: W0904 15:50:16.078329 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:16.078514 kubelet[2944]: E0904 15:50:16.078353 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:16.080153 kubelet[2944]: E0904 15:50:16.079638 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:16.080153 kubelet[2944]: W0904 15:50:16.079650 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:16.080585 kubelet[2944]: E0904 15:50:16.080554 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:16.081997 kubelet[2944]: E0904 15:50:16.081833 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:16.081997 kubelet[2944]: W0904 15:50:16.081843 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:16.082103 kubelet[2944]: E0904 15:50:16.082087 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:16.082169 kubelet[2944]: E0904 15:50:16.082144 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:16.082169 kubelet[2944]: W0904 15:50:16.082150 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:16.083201 kubelet[2944]: E0904 15:50:16.082385 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:16.083201 kubelet[2944]: W0904 15:50:16.082393 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:16.083201 kubelet[2944]: E0904 15:50:16.082704 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:16.083201 kubelet[2944]: E0904 15:50:16.082721 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:16.084358 kubelet[2944]: E0904 15:50:16.083695 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:16.084358 kubelet[2944]: W0904 15:50:16.083703 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:16.084358 kubelet[2944]: E0904 15:50:16.084150 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:16.084358 kubelet[2944]: W0904 15:50:16.084158 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:16.084358 kubelet[2944]: E0904 15:50:16.084279 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:16.084358 kubelet[2944]: E0904 15:50:16.084294 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:16.084554 kubelet[2944]: E0904 15:50:16.084418 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:16.084554 kubelet[2944]: W0904 15:50:16.084425 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:16.084674 kubelet[2944]: E0904 15:50:16.084662 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:16.085845 kubelet[2944]: E0904 15:50:16.084899 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:16.085845 kubelet[2944]: W0904 15:50:16.084908 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:16.085845 kubelet[2944]: E0904 15:50:16.085312 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:16.085845 kubelet[2944]: W0904 15:50:16.085318 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:16.085845 kubelet[2944]: E0904 15:50:16.085508 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:16.085845 kubelet[2944]: W0904 15:50:16.085515 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], 
error: executable file not found in $PATH, output: "" Sep 4 15:50:16.085845 kubelet[2944]: E0904 15:50:16.085667 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:16.085845 kubelet[2944]: E0904 15:50:16.085688 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:16.085845 kubelet[2944]: E0904 15:50:16.085759 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:16.085845 kubelet[2944]: E0904 15:50:16.085806 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:16.086261 kubelet[2944]: W0904 15:50:16.085814 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:16.086261 kubelet[2944]: E0904 15:50:16.086077 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:16.086261 kubelet[2944]: W0904 15:50:16.086083 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:16.086261 kubelet[2944]: E0904 15:50:16.086195 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:16.086261 kubelet[2944]: E0904 15:50:16.086217 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:16.086397 kubelet[2944]: E0904 15:50:16.086299 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:16.086397 kubelet[2944]: W0904 15:50:16.086306 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:16.086449 kubelet[2944]: E0904 15:50:16.086403 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:16.087613 kubelet[2944]: E0904 15:50:16.086978 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:16.087613 kubelet[2944]: W0904 15:50:16.086996 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:16.087613 kubelet[2944]: E0904 15:50:16.087091 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:16.087613 kubelet[2944]: E0904 15:50:16.087305 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:16.087613 kubelet[2944]: W0904 15:50:16.087321 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:16.087613 kubelet[2944]: E0904 15:50:16.087328 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:16.095857 kubelet[2944]: E0904 15:50:16.095739 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:16.095857 kubelet[2944]: W0904 15:50:16.095756 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:16.095857 kubelet[2944]: E0904 15:50:16.095778 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:16.097601 containerd[1644]: time="2025-09-04T15:50:16.097323830Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-dvtfk,Uid:63b9fc28-8fe7-4e48-9c71-528960082fe7,Namespace:calico-system,Attempt:0,}" Sep 4 15:50:16.113915 containerd[1644]: time="2025-09-04T15:50:16.113880912Z" level=info msg="connecting to shim e0730e2bff164b9d3ddd755678553c43e02bbe391c52907e738e1d048b32a383" address="unix:///run/containerd/s/b13b44512d017b7cb6da37d8a85f112dd9100662f4913ab5caff43dcb30a8824" namespace=k8s.io protocol=ttrpc version=3 Sep 4 15:50:16.138694 systemd[1]: Started cri-containerd-e0730e2bff164b9d3ddd755678553c43e02bbe391c52907e738e1d048b32a383.scope - libcontainer container e0730e2bff164b9d3ddd755678553c43e02bbe391c52907e738e1d048b32a383. Sep 4 15:50:16.160923 containerd[1644]: time="2025-09-04T15:50:16.160898646Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-dvtfk,Uid:63b9fc28-8fe7-4e48-9c71-528960082fe7,Namespace:calico-system,Attempt:0,} returns sandbox id \"e0730e2bff164b9d3ddd755678553c43e02bbe391c52907e738e1d048b32a383\"" Sep 4 15:50:17.025904 kubelet[2944]: E0904 15:50:17.025868 2944 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x2d7v" podUID="983a0613-198a-42b5-9c35-64b2a2869e6c" Sep 4 15:50:19.025106 kubelet[2944]: E0904 15:50:19.025044 2944 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x2d7v" podUID="983a0613-198a-42b5-9c35-64b2a2869e6c" Sep 4 15:50:21.025227 kubelet[2944]: E0904 15:50:21.025177 2944 pod_workers.go:1301] "Error syncing pod, 
skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x2d7v" podUID="983a0613-198a-42b5-9c35-64b2a2869e6c" Sep 4 15:50:23.026061 kubelet[2944]: E0904 15:50:23.025802 2944 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x2d7v" podUID="983a0613-198a-42b5-9c35-64b2a2869e6c" Sep 4 15:50:25.026146 kubelet[2944]: E0904 15:50:25.025894 2944 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x2d7v" podUID="983a0613-198a-42b5-9c35-64b2a2869e6c" Sep 4 15:50:27.026954 kubelet[2944]: E0904 15:50:27.025944 2944 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x2d7v" podUID="983a0613-198a-42b5-9c35-64b2a2869e6c" Sep 4 15:50:28.132134 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount208126633.mount: Deactivated successfully. 
Sep 4 15:50:28.810494 containerd[1644]: time="2025-09-04T15:50:28.810456498Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:50:28.811456 containerd[1644]: time="2025-09-04T15:50:28.811428986Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389"
Sep 4 15:50:28.811756 containerd[1644]: time="2025-09-04T15:50:28.811733781Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:50:28.813605 containerd[1644]: time="2025-09-04T15:50:28.813582818Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:50:28.814390 containerd[1644]: time="2025-09-04T15:50:28.813926867Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 12.976946091s"
Sep 4 15:50:28.814390 containerd[1644]: time="2025-09-04T15:50:28.813940856Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\""
Sep 4 15:50:28.816451 containerd[1644]: time="2025-09-04T15:50:28.815845149Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\""
Sep 4 15:50:28.825593 containerd[1644]: time="2025-09-04T15:50:28.825549175Z" level=info msg="CreateContainer within sandbox \"407b56d631cdbcf757f2cf2e7dee8311bf23dd0bf662edb3cfe0bbebca8bc31f\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Sep 4 15:50:28.869336 containerd[1644]: time="2025-09-04T15:50:28.868370793Z" level=info msg="Container a9eb842f93ea88bff33c66477e2e82eae4e85e44e39f6e0b9144331b2f0d75b2: CDI devices from CRI Config.CDIDevices: []"
Sep 4 15:50:28.880558 containerd[1644]: time="2025-09-04T15:50:28.880496264Z" level=info msg="CreateContainer within sandbox \"407b56d631cdbcf757f2cf2e7dee8311bf23dd0bf662edb3cfe0bbebca8bc31f\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"a9eb842f93ea88bff33c66477e2e82eae4e85e44e39f6e0b9144331b2f0d75b2\""
Sep 4 15:50:28.881663 containerd[1644]: time="2025-09-04T15:50:28.880859102Z" level=info msg="StartContainer for \"a9eb842f93ea88bff33c66477e2e82eae4e85e44e39f6e0b9144331b2f0d75b2\""
Sep 4 15:50:28.882325 containerd[1644]: time="2025-09-04T15:50:28.882288308Z" level=info msg="connecting to shim a9eb842f93ea88bff33c66477e2e82eae4e85e44e39f6e0b9144331b2f0d75b2" address="unix:///run/containerd/s/66e27912d7bb9735e3a06c82c051d99aae3f3db3f20ef9cc67b459887f5a6e26" protocol=ttrpc version=3
Sep 4 15:50:28.907274 systemd[1]: Started cri-containerd-a9eb842f93ea88bff33c66477e2e82eae4e85e44e39f6e0b9144331b2f0d75b2.scope - libcontainer container a9eb842f93ea88bff33c66477e2e82eae4e85e44e39f6e0b9144331b2f0d75b2.
Sep 4 15:50:28.969895 containerd[1644]: time="2025-09-04T15:50:28.969868535Z" level=info msg="StartContainer for \"a9eb842f93ea88bff33c66477e2e82eae4e85e44e39f6e0b9144331b2f0d75b2\" returns successfully"
Sep 4 15:50:29.025644 kubelet[2944]: E0904 15:50:29.025468 2944 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x2d7v" podUID="983a0613-198a-42b5-9c35-64b2a2869e6c"
Sep 4 15:50:29.212483 kubelet[2944]: E0904 15:50:29.212422 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 15:50:29.212483 kubelet[2944]: W0904 15:50:29.212439 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 15:50:29.212483 kubelet[2944]: E0904 15:50:29.212455 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 15:50:29.214843 kubelet[2944]: E0904 15:50:29.212831 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 15:50:29.214843 kubelet[2944]: W0904 15:50:29.212837 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 15:50:29.214843 kubelet[2944]: E0904 15:50:29.212843 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Sep 4 15:50:29.214843 kubelet[2944]: E0904 15:50:29.212931 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:29.214843 kubelet[2944]: W0904 15:50:29.212935 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:29.214843 kubelet[2944]: E0904 15:50:29.212940 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:29.214843 kubelet[2944]: E0904 15:50:29.213038 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:29.214843 kubelet[2944]: W0904 15:50:29.213043 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:29.214843 kubelet[2944]: E0904 15:50:29.213048 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:29.214843 kubelet[2944]: E0904 15:50:29.213165 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:29.215276 kubelet[2944]: W0904 15:50:29.213170 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:29.215276 kubelet[2944]: E0904 15:50:29.213174 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:29.215276 kubelet[2944]: E0904 15:50:29.213252 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:29.215276 kubelet[2944]: W0904 15:50:29.213256 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:29.215276 kubelet[2944]: E0904 15:50:29.213260 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:29.215276 kubelet[2944]: E0904 15:50:29.213343 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:29.215276 kubelet[2944]: W0904 15:50:29.213347 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:29.215276 kubelet[2944]: E0904 15:50:29.213352 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:29.215276 kubelet[2944]: E0904 15:50:29.213436 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:29.215276 kubelet[2944]: W0904 15:50:29.213441 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:29.215450 kubelet[2944]: E0904 15:50:29.213447 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:29.215450 kubelet[2944]: E0904 15:50:29.213556 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:29.215450 kubelet[2944]: W0904 15:50:29.213560 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:29.215450 kubelet[2944]: E0904 15:50:29.213566 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:29.215450 kubelet[2944]: E0904 15:50:29.213649 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:29.215450 kubelet[2944]: W0904 15:50:29.213654 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:29.215450 kubelet[2944]: E0904 15:50:29.213659 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:29.215450 kubelet[2944]: E0904 15:50:29.213736 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:29.215450 kubelet[2944]: W0904 15:50:29.213741 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:29.215450 kubelet[2944]: E0904 15:50:29.213745 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:29.215625 kubelet[2944]: E0904 15:50:29.213873 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:29.215625 kubelet[2944]: W0904 15:50:29.213877 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:29.215625 kubelet[2944]: E0904 15:50:29.213883 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:29.215625 kubelet[2944]: E0904 15:50:29.214009 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:29.215625 kubelet[2944]: W0904 15:50:29.214014 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:29.215625 kubelet[2944]: E0904 15:50:29.214020 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:29.215625 kubelet[2944]: E0904 15:50:29.214105 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:29.215625 kubelet[2944]: W0904 15:50:29.214110 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:29.215625 kubelet[2944]: E0904 15:50:29.214122 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:29.215625 kubelet[2944]: E0904 15:50:29.214213 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:29.215784 kubelet[2944]: W0904 15:50:29.214217 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:29.215784 kubelet[2944]: E0904 15:50:29.214221 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:29.261212 kubelet[2944]: E0904 15:50:29.261191 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:29.261212 kubelet[2944]: W0904 15:50:29.261207 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:29.261317 kubelet[2944]: E0904 15:50:29.261221 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:29.261351 kubelet[2944]: E0904 15:50:29.261341 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:29.261351 kubelet[2944]: W0904 15:50:29.261348 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:29.261430 kubelet[2944]: E0904 15:50:29.261355 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:29.261456 kubelet[2944]: E0904 15:50:29.261441 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:29.261456 kubelet[2944]: W0904 15:50:29.261445 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:29.261456 kubelet[2944]: E0904 15:50:29.261451 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:29.261620 kubelet[2944]: E0904 15:50:29.261611 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:29.261656 kubelet[2944]: W0904 15:50:29.261649 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:29.261699 kubelet[2944]: E0904 15:50:29.261692 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:29.261799 kubelet[2944]: E0904 15:50:29.261788 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:29.261799 kubelet[2944]: W0904 15:50:29.261796 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:29.261844 kubelet[2944]: E0904 15:50:29.261804 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:29.261890 kubelet[2944]: E0904 15:50:29.261880 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:29.261890 kubelet[2944]: W0904 15:50:29.261888 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:29.261943 kubelet[2944]: E0904 15:50:29.261894 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:29.261975 kubelet[2944]: E0904 15:50:29.261970 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:29.261975 kubelet[2944]: W0904 15:50:29.261974 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:29.262011 kubelet[2944]: E0904 15:50:29.261986 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:29.262136 kubelet[2944]: E0904 15:50:29.262112 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:29.262136 kubelet[2944]: W0904 15:50:29.262134 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:29.262181 kubelet[2944]: E0904 15:50:29.262148 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:29.262252 kubelet[2944]: E0904 15:50:29.262244 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:29.262392 kubelet[2944]: W0904 15:50:29.262377 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:29.262392 kubelet[2944]: E0904 15:50:29.262389 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:29.262469 kubelet[2944]: E0904 15:50:29.262458 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:29.262469 kubelet[2944]: W0904 15:50:29.262466 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:29.262540 kubelet[2944]: E0904 15:50:29.262485 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:29.262575 kubelet[2944]: E0904 15:50:29.262564 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:29.262575 kubelet[2944]: W0904 15:50:29.262572 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:29.262674 kubelet[2944]: E0904 15:50:29.262580 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:29.262726 kubelet[2944]: E0904 15:50:29.262719 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:29.262774 kubelet[2944]: W0904 15:50:29.262766 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:29.262827 kubelet[2944]: E0904 15:50:29.262819 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:29.263020 kubelet[2944]: E0904 15:50:29.262943 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:29.263020 kubelet[2944]: W0904 15:50:29.262950 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:29.263020 kubelet[2944]: E0904 15:50:29.262959 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:29.263122 kubelet[2944]: E0904 15:50:29.263109 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:29.263161 kubelet[2944]: W0904 15:50:29.263155 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:29.263204 kubelet[2944]: E0904 15:50:29.263197 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:29.263287 kubelet[2944]: E0904 15:50:29.263276 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:29.263287 kubelet[2944]: W0904 15:50:29.263285 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:29.263328 kubelet[2944]: E0904 15:50:29.263292 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:29.263397 kubelet[2944]: E0904 15:50:29.263388 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:29.263397 kubelet[2944]: W0904 15:50:29.263395 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:29.263439 kubelet[2944]: E0904 15:50:29.263400 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:29.263583 kubelet[2944]: E0904 15:50:29.263570 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:29.263583 kubelet[2944]: W0904 15:50:29.263579 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:29.263630 kubelet[2944]: E0904 15:50:29.263590 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:29.263685 kubelet[2944]: E0904 15:50:29.263676 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:29.263685 kubelet[2944]: W0904 15:50:29.263684 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:29.263726 kubelet[2944]: E0904 15:50:29.263689 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:30.134592 kubelet[2944]: I0904 15:50:30.133961 2944 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-c8598bfbf-mvt6z" podStartSLOduration=2.155884208 podStartE2EDuration="15.133947722s" podCreationTimestamp="2025-09-04 15:50:15 +0000 UTC" firstStartedPulling="2025-09-04 15:50:15.836685309 +0000 UTC m=+17.901064302" lastFinishedPulling="2025-09-04 15:50:28.814748812 +0000 UTC m=+30.879127816" observedRunningTime="2025-09-04 15:50:29.153191331 +0000 UTC m=+31.217570333" watchObservedRunningTime="2025-09-04 15:50:30.133947722 +0000 UTC m=+32.198326726" Sep 4 15:50:30.221837 kubelet[2944]: E0904 15:50:30.221785 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:30.221837 kubelet[2944]: W0904 15:50:30.221798 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:30.221837 kubelet[2944]: E0904 15:50:30.221811 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:30.222125 kubelet[2944]: E0904 15:50:30.222089 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:30.222125 kubelet[2944]: W0904 15:50:30.222096 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:30.222125 kubelet[2944]: E0904 15:50:30.222102 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:30.222377 kubelet[2944]: E0904 15:50:30.222356 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:30.222377 kubelet[2944]: W0904 15:50:30.222362 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:30.222458 kubelet[2944]: E0904 15:50:30.222368 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:30.222627 kubelet[2944]: E0904 15:50:30.222590 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:30.222627 kubelet[2944]: W0904 15:50:30.222598 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:30.222627 kubelet[2944]: E0904 15:50:30.222603 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:30.222816 kubelet[2944]: E0904 15:50:30.222786 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:30.222816 kubelet[2944]: W0904 15:50:30.222793 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:30.222816 kubelet[2944]: E0904 15:50:30.222798 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:30.223050 kubelet[2944]: E0904 15:50:30.223017 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:30.223050 kubelet[2944]: W0904 15:50:30.223023 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:30.223050 kubelet[2944]: E0904 15:50:30.223028 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:30.223286 kubelet[2944]: E0904 15:50:30.223240 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:30.223286 kubelet[2944]: W0904 15:50:30.223247 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:30.223415 kubelet[2944]: E0904 15:50:30.223306 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:30.223534 kubelet[2944]: E0904 15:50:30.223520 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:30.223599 kubelet[2944]: W0904 15:50:30.223526 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:30.223599 kubelet[2944]: E0904 15:50:30.223575 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:50:30.223750 kubelet[2944]: E0904 15:50:30.223745 2944 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:50:30.223809 kubelet[2944]: W0904 15:50:30.223781 2944 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:50:30.223809 kubelet[2944]: E0904 15:50:30.223789 2944 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:50:30.320981 containerd[1644]: time="2025-09-04T15:50:30.320933246Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:50:30.321431 containerd[1644]: time="2025-09-04T15:50:30.321333945Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 4 15:50:30.321773 containerd[1644]: time="2025-09-04T15:50:30.321703778Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:50:30.322720 containerd[1644]: time="2025-09-04T15:50:30.322705555Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:50:30.323136 containerd[1644]: time="2025-09-04T15:50:30.323103100Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.507233807s" Sep 4 15:50:30.323170 containerd[1644]: time="2025-09-04T15:50:30.323141486Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 4 15:50:30.325523 containerd[1644]: time="2025-09-04T15:50:30.325507645Z" level=info msg="CreateContainer within sandbox \"e0730e2bff164b9d3ddd755678553c43e02bbe391c52907e738e1d048b32a383\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 4 15:50:30.369836 containerd[1644]: time="2025-09-04T15:50:30.369735611Z" level=info msg="Container c6a5160cb99cdb2281c6603ecf34f8d5a454ad8d6f1eabd8b31f3d344f817bde: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:50:30.375112 containerd[1644]: time="2025-09-04T15:50:30.375081277Z" level=info msg="CreateContainer within sandbox \"e0730e2bff164b9d3ddd755678553c43e02bbe391c52907e738e1d048b32a383\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"c6a5160cb99cdb2281c6603ecf34f8d5a454ad8d6f1eabd8b31f3d344f817bde\"" Sep 4 15:50:30.375618 containerd[1644]: time="2025-09-04T15:50:30.375480716Z" level=info msg="StartContainer for \"c6a5160cb99cdb2281c6603ecf34f8d5a454ad8d6f1eabd8b31f3d344f817bde\"" Sep 4 15:50:30.392512 containerd[1644]: time="2025-09-04T15:50:30.392450764Z" level=info msg="connecting to shim c6a5160cb99cdb2281c6603ecf34f8d5a454ad8d6f1eabd8b31f3d344f817bde" address="unix:///run/containerd/s/b13b44512d017b7cb6da37d8a85f112dd9100662f4913ab5caff43dcb30a8824" protocol=ttrpc version=3 Sep 4 15:50:30.416268 systemd[1]: Started cri-containerd-c6a5160cb99cdb2281c6603ecf34f8d5a454ad8d6f1eabd8b31f3d344f817bde.scope - libcontainer container c6a5160cb99cdb2281c6603ecf34f8d5a454ad8d6f1eabd8b31f3d344f817bde. Sep 4 15:50:30.441312 containerd[1644]: time="2025-09-04T15:50:30.441254431Z" level=info msg="StartContainer for \"c6a5160cb99cdb2281c6603ecf34f8d5a454ad8d6f1eabd8b31f3d344f817bde\" returns successfully" Sep 4 15:50:30.448073 systemd[1]: cri-containerd-c6a5160cb99cdb2281c6603ecf34f8d5a454ad8d6f1eabd8b31f3d344f817bde.scope: Deactivated successfully. 
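The repeated kubelet errors above all come from the same failure mode: `driver-call.go` executes the FlexVolume driver binary (`/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds`), which does not exist yet, so the call returns empty output, and decoding `""` as JSON fails with Go's "unexpected end of JSON input". A minimal sketch of that decode step (in Python rather than the kubelet's Go, so the error text differs) showing why empty output fails and what a well-formed driver response parses to:

```python
import json

def parse_driver_output(output: str) -> dict:
    # Mirrors the kubelet's unmarshal step: a FlexVolume driver is
    # expected to print a JSON status object on stdout.
    return json.loads(output)

# A missing executable produces no output at all, so the decode fails --
# this is the "unexpected end of JSON input" flood in the log (Go's
# encoding/json wording; Python raises json.JSONDecodeError instead).
try:
    parse_driver_output("")
except json.JSONDecodeError as e:
    print("empty output fails to parse:", e)

# Roughly what a working driver's `init` call would print (field names
# here are illustrative, from the FlexVolume status convention):
ok = parse_driver_output('{"status": "Success", "capabilities": {"attach": false}}')
print(ok["status"])  # Success
```

The errors stop being fatal because the kubelet merely skips the plugin directory; the `flexvol-driver` init container started above is what installs the missing binary.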
Sep 4 15:50:30.457225 containerd[1644]: time="2025-09-04T15:50:30.457093237Z" level=info msg="received exit event container_id:\"c6a5160cb99cdb2281c6603ecf34f8d5a454ad8d6f1eabd8b31f3d344f817bde\" id:\"c6a5160cb99cdb2281c6603ecf34f8d5a454ad8d6f1eabd8b31f3d344f817bde\" pid:3664 exited_at:{seconds:1757001030 nanos:449869796}" Sep 4 15:50:30.464902 containerd[1644]: time="2025-09-04T15:50:30.464823706Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c6a5160cb99cdb2281c6603ecf34f8d5a454ad8d6f1eabd8b31f3d344f817bde\" id:\"c6a5160cb99cdb2281c6603ecf34f8d5a454ad8d6f1eabd8b31f3d344f817bde\" pid:3664 exited_at:{seconds:1757001030 nanos:449869796}" Sep 4 15:50:30.481898 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c6a5160cb99cdb2281c6603ecf34f8d5a454ad8d6f1eabd8b31f3d344f817bde-rootfs.mount: Deactivated successfully. Sep 4 15:50:31.026452 kubelet[2944]: E0904 15:50:31.026425 2944 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x2d7v" podUID="983a0613-198a-42b5-9c35-64b2a2869e6c" Sep 4 15:50:31.129566 containerd[1644]: time="2025-09-04T15:50:31.129535355Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 4 15:50:33.025169 kubelet[2944]: E0904 15:50:33.025111 2944 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x2d7v" podUID="983a0613-198a-42b5-9c35-64b2a2869e6c" Sep 4 15:50:35.025943 kubelet[2944]: E0904 15:50:35.025159 2944 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x2d7v" podUID="983a0613-198a-42b5-9c35-64b2a2869e6c" Sep 4 15:50:37.025870 kubelet[2944]: E0904 15:50:37.025789 2944 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x2d7v" podUID="983a0613-198a-42b5-9c35-64b2a2869e6c" Sep 4 15:50:39.025937 kubelet[2944]: E0904 15:50:39.025621 2944 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x2d7v" podUID="983a0613-198a-42b5-9c35-64b2a2869e6c" Sep 4 15:50:39.144415 containerd[1644]: time="2025-09-04T15:50:39.143906212Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:50:39.171196 containerd[1644]: time="2025-09-04T15:50:39.171156195Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 4 15:50:39.192129 containerd[1644]: time="2025-09-04T15:50:39.192028916Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:50:39.193334 containerd[1644]: time="2025-09-04T15:50:39.193312422Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:50:39.194046 containerd[1644]: time="2025-09-04T15:50:39.193996154Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" 
with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 8.064433793s" Sep 4 15:50:39.194046 containerd[1644]: time="2025-09-04T15:50:39.194011713Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 4 15:50:39.201907 containerd[1644]: time="2025-09-04T15:50:39.201710819Z" level=info msg="CreateContainer within sandbox \"e0730e2bff164b9d3ddd755678553c43e02bbe391c52907e738e1d048b32a383\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 4 15:50:39.210175 containerd[1644]: time="2025-09-04T15:50:39.210108528Z" level=info msg="Container becb70bb2894aaa66ac4fe9ed40a7afa50b014a5d84465c90aa3df1c23edef6f: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:50:39.217460 containerd[1644]: time="2025-09-04T15:50:39.217395406Z" level=info msg="CreateContainer within sandbox \"e0730e2bff164b9d3ddd755678553c43e02bbe391c52907e738e1d048b32a383\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"becb70bb2894aaa66ac4fe9ed40a7afa50b014a5d84465c90aa3df1c23edef6f\"" Sep 4 15:50:39.222225 containerd[1644]: time="2025-09-04T15:50:39.222177569Z" level=info msg="StartContainer for \"becb70bb2894aaa66ac4fe9ed40a7afa50b014a5d84465c90aa3df1c23edef6f\"" Sep 4 15:50:39.224036 containerd[1644]: time="2025-09-04T15:50:39.223982876Z" level=info msg="connecting to shim becb70bb2894aaa66ac4fe9ed40a7afa50b014a5d84465c90aa3df1c23edef6f" address="unix:///run/containerd/s/b13b44512d017b7cb6da37d8a85f112dd9100662f4913ab5caff43dcb30a8824" protocol=ttrpc version=3 Sep 4 15:50:39.245316 systemd[1]: Started cri-containerd-becb70bb2894aaa66ac4fe9ed40a7afa50b014a5d84465c90aa3df1c23edef6f.scope - libcontainer container 
becb70bb2894aaa66ac4fe9ed40a7afa50b014a5d84465c90aa3df1c23edef6f. Sep 4 15:50:39.299191 containerd[1644]: time="2025-09-04T15:50:39.298718560Z" level=info msg="StartContainer for \"becb70bb2894aaa66ac4fe9ed40a7afa50b014a5d84465c90aa3df1c23edef6f\" returns successfully" Sep 4 15:50:40.994829 systemd[1]: cri-containerd-becb70bb2894aaa66ac4fe9ed40a7afa50b014a5d84465c90aa3df1c23edef6f.scope: Deactivated successfully. Sep 4 15:50:40.996509 systemd[1]: cri-containerd-becb70bb2894aaa66ac4fe9ed40a7afa50b014a5d84465c90aa3df1c23edef6f.scope: Consumed 302ms CPU time, 162.8M memory peak, 2.3M read from disk, 171.3M written to disk. Sep 4 15:50:41.026549 kubelet[2944]: E0904 15:50:41.025636 2944 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x2d7v" podUID="983a0613-198a-42b5-9c35-64b2a2869e6c" Sep 4 15:50:41.031159 containerd[1644]: time="2025-09-04T15:50:41.030257908Z" level=info msg="TaskExit event in podsandbox handler container_id:\"becb70bb2894aaa66ac4fe9ed40a7afa50b014a5d84465c90aa3df1c23edef6f\" id:\"becb70bb2894aaa66ac4fe9ed40a7afa50b014a5d84465c90aa3df1c23edef6f\" pid:3724 exited_at:{seconds:1757001041 nanos:14995702}" Sep 4 15:50:41.031159 containerd[1644]: time="2025-09-04T15:50:41.030303820Z" level=info msg="received exit event container_id:\"becb70bb2894aaa66ac4fe9ed40a7afa50b014a5d84465c90aa3df1c23edef6f\" id:\"becb70bb2894aaa66ac4fe9ed40a7afa50b014a5d84465c90aa3df1c23edef6f\" pid:3724 exited_at:{seconds:1757001041 nanos:14995702}" Sep 4 15:50:41.075561 kubelet[2944]: I0904 15:50:41.075527 2944 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Sep 4 15:50:41.098300 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-becb70bb2894aaa66ac4fe9ed40a7afa50b014a5d84465c90aa3df1c23edef6f-rootfs.mount: Deactivated 
successfully. Sep 4 15:50:41.122123 systemd[1]: Created slice kubepods-besteffort-pod008bcc6d_c796_4431_bff8_3bdfe6b21bb6.slice - libcontainer container kubepods-besteffort-pod008bcc6d_c796_4431_bff8_3bdfe6b21bb6.slice. Sep 4 15:50:41.130045 systemd[1]: Created slice kubepods-besteffort-podb3ebd254_9b08_41b4_bd75_06a3a045be62.slice - libcontainer container kubepods-besteffort-podb3ebd254_9b08_41b4_bd75_06a3a045be62.slice. Sep 4 15:50:41.139684 kubelet[2944]: I0904 15:50:41.139662 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5zmn\" (UniqueName: \"kubernetes.io/projected/058617f0-a421-4b0b-af75-94fdfe834c56-kube-api-access-t5zmn\") pod \"calico-apiserver-d4584f895-xqpkf\" (UID: \"058617f0-a421-4b0b-af75-94fdfe834c56\") " pod="calico-apiserver/calico-apiserver-d4584f895-xqpkf" Sep 4 15:50:41.139684 kubelet[2944]: I0904 15:50:41.139685 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s458v\" (UniqueName: \"kubernetes.io/projected/008bcc6d-c796-4431-bff8-3bdfe6b21bb6-kube-api-access-s458v\") pod \"calico-kube-controllers-7454479f5c-2sn5z\" (UID: \"008bcc6d-c796-4431-bff8-3bdfe6b21bb6\") " pod="calico-system/calico-kube-controllers-7454479f5c-2sn5z" Sep 4 15:50:41.140665 kubelet[2944]: I0904 15:50:41.139699 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/501a1bfe-524e-4e97-9ddd-148de8480fe6-config-volume\") pod \"coredns-7c65d6cfc9-l6fg8\" (UID: \"501a1bfe-524e-4e97-9ddd-148de8480fe6\") " pod="kube-system/coredns-7c65d6cfc9-l6fg8" Sep 4 15:50:41.140665 kubelet[2944]: I0904 15:50:41.139709 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3ebd254-9b08-41b4-bd75-06a3a045be62-goldmane-ca-bundle\") pod 
\"goldmane-7988f88666-574qn\" (UID: \"b3ebd254-9b08-41b4-bd75-06a3a045be62\") " pod="calico-system/goldmane-7988f88666-574qn" Sep 4 15:50:41.140665 kubelet[2944]: I0904 15:50:41.139718 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/b3ebd254-9b08-41b4-bd75-06a3a045be62-goldmane-key-pair\") pod \"goldmane-7988f88666-574qn\" (UID: \"b3ebd254-9b08-41b4-bd75-06a3a045be62\") " pod="calico-system/goldmane-7988f88666-574qn" Sep 4 15:50:41.140665 kubelet[2944]: I0904 15:50:41.139733 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9ncv\" (UniqueName: \"kubernetes.io/projected/501a1bfe-524e-4e97-9ddd-148de8480fe6-kube-api-access-b9ncv\") pod \"coredns-7c65d6cfc9-l6fg8\" (UID: \"501a1bfe-524e-4e97-9ddd-148de8480fe6\") " pod="kube-system/coredns-7c65d6cfc9-l6fg8" Sep 4 15:50:41.140665 kubelet[2944]: I0904 15:50:41.139743 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz4k7\" (UniqueName: \"kubernetes.io/projected/b3ebd254-9b08-41b4-bd75-06a3a045be62-kube-api-access-gz4k7\") pod \"goldmane-7988f88666-574qn\" (UID: \"b3ebd254-9b08-41b4-bd75-06a3a045be62\") " pod="calico-system/goldmane-7988f88666-574qn" Sep 4 15:50:41.140794 kubelet[2944]: I0904 15:50:41.139754 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/008bcc6d-c796-4431-bff8-3bdfe6b21bb6-tigera-ca-bundle\") pod \"calico-kube-controllers-7454479f5c-2sn5z\" (UID: \"008bcc6d-c796-4431-bff8-3bdfe6b21bb6\") " pod="calico-system/calico-kube-controllers-7454479f5c-2sn5z" Sep 4 15:50:41.140794 kubelet[2944]: I0904 15:50:41.139764 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: 
\"kubernetes.io/secret/058617f0-a421-4b0b-af75-94fdfe834c56-calico-apiserver-certs\") pod \"calico-apiserver-d4584f895-xqpkf\" (UID: \"058617f0-a421-4b0b-af75-94fdfe834c56\") " pod="calico-apiserver/calico-apiserver-d4584f895-xqpkf" Sep 4 15:50:41.140794 kubelet[2944]: I0904 15:50:41.139775 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3ebd254-9b08-41b4-bd75-06a3a045be62-config\") pod \"goldmane-7988f88666-574qn\" (UID: \"b3ebd254-9b08-41b4-bd75-06a3a045be62\") " pod="calico-system/goldmane-7988f88666-574qn" Sep 4 15:50:41.142323 systemd[1]: Created slice kubepods-burstable-pod501a1bfe_524e_4e97_9ddd_148de8480fe6.slice - libcontainer container kubepods-burstable-pod501a1bfe_524e_4e97_9ddd_148de8480fe6.slice. Sep 4 15:50:41.151452 systemd[1]: Created slice kubepods-besteffort-pod058617f0_a421_4b0b_af75_94fdfe834c56.slice - libcontainer container kubepods-besteffort-pod058617f0_a421_4b0b_af75_94fdfe834c56.slice. Sep 4 15:50:41.159032 systemd[1]: Created slice kubepods-burstable-pod6e0b6dd2_4330_4e52_b6e5_a0f1fbd16ac7.slice - libcontainer container kubepods-burstable-pod6e0b6dd2_4330_4e52_b6e5_a0f1fbd16ac7.slice. Sep 4 15:50:41.167770 systemd[1]: Created slice kubepods-besteffort-podec3c0434_a74a_4cd8_aad6_25051cea8b8a.slice - libcontainer container kubepods-besteffort-podec3c0434_a74a_4cd8_aad6_25051cea8b8a.slice. Sep 4 15:50:41.173355 systemd[1]: Created slice kubepods-besteffort-podf699e642_1b05_433e_bd2f_a664a37b4e72.slice - libcontainer container kubepods-besteffort-podf699e642_1b05_433e_bd2f_a664a37b4e72.slice. 
Sep 4 15:50:41.240066 kubelet[2944]: I0904 15:50:41.240018 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6e0b6dd2-4330-4e52-b6e5-a0f1fbd16ac7-config-volume\") pod \"coredns-7c65d6cfc9-s5h4d\" (UID: \"6e0b6dd2-4330-4e52-b6e5-a0f1fbd16ac7\") " pod="kube-system/coredns-7c65d6cfc9-s5h4d" Sep 4 15:50:41.240066 kubelet[2944]: I0904 15:50:41.240049 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzx7g\" (UniqueName: \"kubernetes.io/projected/f699e642-1b05-433e-bd2f-a664a37b4e72-kube-api-access-tzx7g\") pod \"whisker-5cfff58578-rctfv\" (UID: \"f699e642-1b05-433e-bd2f-a664a37b4e72\") " pod="calico-system/whisker-5cfff58578-rctfv" Sep 4 15:50:41.240386 kubelet[2944]: I0904 15:50:41.240249 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7l8b\" (UniqueName: \"kubernetes.io/projected/ec3c0434-a74a-4cd8-aad6-25051cea8b8a-kube-api-access-v7l8b\") pod \"calico-apiserver-d4584f895-4pckt\" (UID: \"ec3c0434-a74a-4cd8-aad6-25051cea8b8a\") " pod="calico-apiserver/calico-apiserver-d4584f895-4pckt" Sep 4 15:50:41.240386 kubelet[2944]: I0904 15:50:41.240278 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngbfd\" (UniqueName: \"kubernetes.io/projected/6e0b6dd2-4330-4e52-b6e5-a0f1fbd16ac7-kube-api-access-ngbfd\") pod \"coredns-7c65d6cfc9-s5h4d\" (UID: \"6e0b6dd2-4330-4e52-b6e5-a0f1fbd16ac7\") " pod="kube-system/coredns-7c65d6cfc9-s5h4d" Sep 4 15:50:41.240386 kubelet[2944]: I0904 15:50:41.240303 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/ec3c0434-a74a-4cd8-aad6-25051cea8b8a-calico-apiserver-certs\") pod \"calico-apiserver-d4584f895-4pckt\" (UID: 
\"ec3c0434-a74a-4cd8-aad6-25051cea8b8a\") " pod="calico-apiserver/calico-apiserver-d4584f895-4pckt" Sep 4 15:50:41.240386 kubelet[2944]: I0904 15:50:41.240319 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f699e642-1b05-433e-bd2f-a664a37b4e72-whisker-ca-bundle\") pod \"whisker-5cfff58578-rctfv\" (UID: \"f699e642-1b05-433e-bd2f-a664a37b4e72\") " pod="calico-system/whisker-5cfff58578-rctfv" Sep 4 15:50:41.240386 kubelet[2944]: I0904 15:50:41.240359 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/f699e642-1b05-433e-bd2f-a664a37b4e72-whisker-backend-key-pair\") pod \"whisker-5cfff58578-rctfv\" (UID: \"f699e642-1b05-433e-bd2f-a664a37b4e72\") " pod="calico-system/whisker-5cfff58578-rctfv" Sep 4 15:50:41.324753 containerd[1644]: time="2025-09-04T15:50:41.324677703Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 4 15:50:41.428033 containerd[1644]: time="2025-09-04T15:50:41.428004674Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7454479f5c-2sn5z,Uid:008bcc6d-c796-4431-bff8-3bdfe6b21bb6,Namespace:calico-system,Attempt:0,}" Sep 4 15:50:41.437230 containerd[1644]: time="2025-09-04T15:50:41.437005268Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-574qn,Uid:b3ebd254-9b08-41b4-bd75-06a3a045be62,Namespace:calico-system,Attempt:0,}" Sep 4 15:50:41.453816 containerd[1644]: time="2025-09-04T15:50:41.453794421Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-l6fg8,Uid:501a1bfe-524e-4e97-9ddd-148de8480fe6,Namespace:kube-system,Attempt:0,}" Sep 4 15:50:41.457608 containerd[1644]: time="2025-09-04T15:50:41.457510277Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-d4584f895-xqpkf,Uid:058617f0-a421-4b0b-af75-94fdfe834c56,Namespace:calico-apiserver,Attempt:0,}" Sep 4 15:50:41.471513 containerd[1644]: time="2025-09-04T15:50:41.471426828Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-d4584f895-4pckt,Uid:ec3c0434-a74a-4cd8-aad6-25051cea8b8a,Namespace:calico-apiserver,Attempt:0,}" Sep 4 15:50:41.473049 containerd[1644]: time="2025-09-04T15:50:41.472985222Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-s5h4d,Uid:6e0b6dd2-4330-4e52-b6e5-a0f1fbd16ac7,Namespace:kube-system,Attempt:0,}" Sep 4 15:50:41.478438 containerd[1644]: time="2025-09-04T15:50:41.478412689Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5cfff58578-rctfv,Uid:f699e642-1b05-433e-bd2f-a664a37b4e72,Namespace:calico-system,Attempt:0,}" Sep 4 15:50:41.736515 containerd[1644]: time="2025-09-04T15:50:41.736434204Z" level=error msg="Failed to destroy network for sandbox \"20442f2b58ecf9f831d0b31fa3900979b8ec4ebea14722c0f034c12685ad5e90\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:50:41.741271 containerd[1644]: time="2025-09-04T15:50:41.741252698Z" level=error msg="Failed to destroy network for sandbox \"fd4f1d8b517acaaae3e92a86b48ecd5534da4e56dd08a5c8af26d12c7723248e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:50:41.742451 containerd[1644]: time="2025-09-04T15:50:41.742428895Z" level=error msg="Failed to destroy network for sandbox \"f5e42462c09d025d4a35d908dfa46debcde62a39d043346fc02e52162092abec\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" Sep 4 15:50:41.745046 containerd[1644]: time="2025-09-04T15:50:41.744984860Z" level=error msg="Failed to destroy network for sandbox \"d82a6defd305cf8b64c5d53fcf028643190b8bc5968ae6a0eca5a91eb4d78137\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:50:41.746662 containerd[1644]: time="2025-09-04T15:50:41.746474080Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7454479f5c-2sn5z,Uid:008bcc6d-c796-4431-bff8-3bdfe6b21bb6,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"20442f2b58ecf9f831d0b31fa3900979b8ec4ebea14722c0f034c12685ad5e90\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:50:41.751175 containerd[1644]: time="2025-09-04T15:50:41.751158800Z" level=error msg="Failed to destroy network for sandbox \"f9a280bd70569e7d57957f36208712893ab89f31407ab32a9911b584050ba95e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:50:41.756186 kubelet[2944]: E0904 15:50:41.755945 2944 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"20442f2b58ecf9f831d0b31fa3900979b8ec4ebea14722c0f034c12685ad5e90\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:50:41.757134 kubelet[2944]: E0904 15:50:41.757008 2944 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: 
code = Unknown desc = failed to setup network for sandbox \"20442f2b58ecf9f831d0b31fa3900979b8ec4ebea14722c0f034c12685ad5e90\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7454479f5c-2sn5z" Sep 4 15:50:41.757134 kubelet[2944]: E0904 15:50:41.757029 2944 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"20442f2b58ecf9f831d0b31fa3900979b8ec4ebea14722c0f034c12685ad5e90\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7454479f5c-2sn5z" Sep 4 15:50:41.757134 kubelet[2944]: E0904 15:50:41.757059 2944 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7454479f5c-2sn5z_calico-system(008bcc6d-c796-4431-bff8-3bdfe6b21bb6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7454479f5c-2sn5z_calico-system(008bcc6d-c796-4431-bff8-3bdfe6b21bb6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"20442f2b58ecf9f831d0b31fa3900979b8ec4ebea14722c0f034c12685ad5e90\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7454479f5c-2sn5z" podUID="008bcc6d-c796-4431-bff8-3bdfe6b21bb6" Sep 4 15:50:41.761573 containerd[1644]: time="2025-09-04T15:50:41.761486300Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5cfff58578-rctfv,Uid:f699e642-1b05-433e-bd2f-a664a37b4e72,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: 
code = Unknown desc = failed to setup network for sandbox \"fd4f1d8b517acaaae3e92a86b48ecd5534da4e56dd08a5c8af26d12c7723248e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:50:41.769905 containerd[1644]: time="2025-09-04T15:50:41.769686422Z" level=error msg="Failed to destroy network for sandbox \"2112a5fa08a9a347206ba345108390d1f2ca29cb6f4df0ccb0e37530751f3646\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:50:41.770379 containerd[1644]: time="2025-09-04T15:50:41.770270051Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-574qn,Uid:b3ebd254-9b08-41b4-bd75-06a3a045be62,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f5e42462c09d025d4a35d908dfa46debcde62a39d043346fc02e52162092abec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:50:41.770455 kubelet[2944]: E0904 15:50:41.770274 2944 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fd4f1d8b517acaaae3e92a86b48ecd5534da4e56dd08a5c8af26d12c7723248e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:50:41.770455 kubelet[2944]: E0904 15:50:41.770307 2944 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fd4f1d8b517acaaae3e92a86b48ecd5534da4e56dd08a5c8af26d12c7723248e\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5cfff58578-rctfv" Sep 4 15:50:41.770455 kubelet[2944]: E0904 15:50:41.770323 2944 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fd4f1d8b517acaaae3e92a86b48ecd5534da4e56dd08a5c8af26d12c7723248e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5cfff58578-rctfv" Sep 4 15:50:41.770749 kubelet[2944]: E0904 15:50:41.770347 2944 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-5cfff58578-rctfv_calico-system(f699e642-1b05-433e-bd2f-a664a37b4e72)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-5cfff58578-rctfv_calico-system(f699e642-1b05-433e-bd2f-a664a37b4e72)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fd4f1d8b517acaaae3e92a86b48ecd5534da4e56dd08a5c8af26d12c7723248e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5cfff58578-rctfv" podUID="f699e642-1b05-433e-bd2f-a664a37b4e72" Sep 4 15:50:41.771374 containerd[1644]: time="2025-09-04T15:50:41.770639341Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-s5h4d,Uid:6e0b6dd2-4330-4e52-b6e5-a0f1fbd16ac7,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d82a6defd305cf8b64c5d53fcf028643190b8bc5968ae6a0eca5a91eb4d78137\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:50:41.771374 containerd[1644]: time="2025-09-04T15:50:41.770947470Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-d4584f895-4pckt,Uid:ec3c0434-a74a-4cd8-aad6-25051cea8b8a,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f9a280bd70569e7d57957f36208712893ab89f31407ab32a9911b584050ba95e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:50:41.771374 containerd[1644]: time="2025-09-04T15:50:41.771180937Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-d4584f895-xqpkf,Uid:058617f0-a421-4b0b-af75-94fdfe834c56,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2112a5fa08a9a347206ba345108390d1f2ca29cb6f4df0ccb0e37530751f3646\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:50:41.771374 containerd[1644]: time="2025-09-04T15:50:41.771234264Z" level=error msg="Failed to destroy network for sandbox \"8fa53edb4db786803e4cd35c96f789920a8932bc724a70256421581d1a435a13\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:50:41.771527 kubelet[2944]: E0904 15:50:41.771210 2944 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f9a280bd70569e7d57957f36208712893ab89f31407ab32a9911b584050ba95e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such 
file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:50:41.771527 kubelet[2944]: E0904 15:50:41.771228 2944 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f9a280bd70569e7d57957f36208712893ab89f31407ab32a9911b584050ba95e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-d4584f895-4pckt" Sep 4 15:50:41.771527 kubelet[2944]: E0904 15:50:41.771251 2944 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f5e42462c09d025d4a35d908dfa46debcde62a39d043346fc02e52162092abec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:50:41.771527 kubelet[2944]: E0904 15:50:41.771269 2944 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f5e42462c09d025d4a35d908dfa46debcde62a39d043346fc02e52162092abec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-574qn" Sep 4 15:50:41.772138 kubelet[2944]: E0904 15:50:41.771278 2944 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f5e42462c09d025d4a35d908dfa46debcde62a39d043346fc02e52162092abec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/goldmane-7988f88666-574qn" Sep 4 15:50:41.772138 kubelet[2944]: E0904 15:50:41.771309 2944 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7988f88666-574qn_calico-system(b3ebd254-9b08-41b4-bd75-06a3a045be62)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7988f88666-574qn_calico-system(b3ebd254-9b08-41b4-bd75-06a3a045be62)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f5e42462c09d025d4a35d908dfa46debcde62a39d043346fc02e52162092abec\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7988f88666-574qn" podUID="b3ebd254-9b08-41b4-bd75-06a3a045be62" Sep 4 15:50:41.772138 kubelet[2944]: E0904 15:50:41.771336 2944 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d82a6defd305cf8b64c5d53fcf028643190b8bc5968ae6a0eca5a91eb4d78137\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:50:41.772838 containerd[1644]: time="2025-09-04T15:50:41.771812394Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-l6fg8,Uid:501a1bfe-524e-4e97-9ddd-148de8480fe6,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8fa53edb4db786803e4cd35c96f789920a8932bc724a70256421581d1a435a13\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:50:41.772884 kubelet[2944]: E0904 15:50:41.771349 2944 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" 
err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d82a6defd305cf8b64c5d53fcf028643190b8bc5968ae6a0eca5a91eb4d78137\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-s5h4d" Sep 4 15:50:41.772884 kubelet[2944]: E0904 15:50:41.771358 2944 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d82a6defd305cf8b64c5d53fcf028643190b8bc5968ae6a0eca5a91eb4d78137\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-s5h4d" Sep 4 15:50:41.772884 kubelet[2944]: E0904 15:50:41.771371 2944 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-s5h4d_kube-system(6e0b6dd2-4330-4e52-b6e5-a0f1fbd16ac7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-s5h4d_kube-system(6e0b6dd2-4330-4e52-b6e5-a0f1fbd16ac7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d82a6defd305cf8b64c5d53fcf028643190b8bc5968ae6a0eca5a91eb4d78137\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-s5h4d" podUID="6e0b6dd2-4330-4e52-b6e5-a0f1fbd16ac7" Sep 4 15:50:41.772967 kubelet[2944]: E0904 15:50:41.771389 2944 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f9a280bd70569e7d57957f36208712893ab89f31407ab32a9911b584050ba95e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such 
file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-d4584f895-4pckt" Sep 4 15:50:41.772967 kubelet[2944]: E0904 15:50:41.771409 2944 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-d4584f895-4pckt_calico-apiserver(ec3c0434-a74a-4cd8-aad6-25051cea8b8a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-d4584f895-4pckt_calico-apiserver(ec3c0434-a74a-4cd8-aad6-25051cea8b8a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f9a280bd70569e7d57957f36208712893ab89f31407ab32a9911b584050ba95e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-d4584f895-4pckt" podUID="ec3c0434-a74a-4cd8-aad6-25051cea8b8a" Sep 4 15:50:41.772967 kubelet[2944]: E0904 15:50:41.771451 2944 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2112a5fa08a9a347206ba345108390d1f2ca29cb6f4df0ccb0e37530751f3646\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:50:41.773406 kubelet[2944]: E0904 15:50:41.771462 2944 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2112a5fa08a9a347206ba345108390d1f2ca29cb6f4df0ccb0e37530751f3646\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-d4584f895-xqpkf" Sep 4 15:50:41.773406 kubelet[2944]: E0904 15:50:41.771471 2944 
kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2112a5fa08a9a347206ba345108390d1f2ca29cb6f4df0ccb0e37530751f3646\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-d4584f895-xqpkf" Sep 4 15:50:41.773406 kubelet[2944]: E0904 15:50:41.771485 2944 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-d4584f895-xqpkf_calico-apiserver(058617f0-a421-4b0b-af75-94fdfe834c56)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-d4584f895-xqpkf_calico-apiserver(058617f0-a421-4b0b-af75-94fdfe834c56)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2112a5fa08a9a347206ba345108390d1f2ca29cb6f4df0ccb0e37530751f3646\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-d4584f895-xqpkf" podUID="058617f0-a421-4b0b-af75-94fdfe834c56" Sep 4 15:50:41.773487 kubelet[2944]: E0904 15:50:41.771898 2944 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8fa53edb4db786803e4cd35c96f789920a8932bc724a70256421581d1a435a13\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:50:41.773487 kubelet[2944]: E0904 15:50:41.771917 2944 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8fa53edb4db786803e4cd35c96f789920a8932bc724a70256421581d1a435a13\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-l6fg8" Sep 4 15:50:41.773487 kubelet[2944]: E0904 15:50:41.771978 2944 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8fa53edb4db786803e4cd35c96f789920a8932bc724a70256421581d1a435a13\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-l6fg8" Sep 4 15:50:41.773590 kubelet[2944]: E0904 15:50:41.772005 2944 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-l6fg8_kube-system(501a1bfe-524e-4e97-9ddd-148de8480fe6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-l6fg8_kube-system(501a1bfe-524e-4e97-9ddd-148de8480fe6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8fa53edb4db786803e4cd35c96f789920a8932bc724a70256421581d1a435a13\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-l6fg8" podUID="501a1bfe-524e-4e97-9ddd-148de8480fe6" Sep 4 15:50:43.029597 systemd[1]: Created slice kubepods-besteffort-pod983a0613_198a_42b5_9c35_64b2a2869e6c.slice - libcontainer container kubepods-besteffort-pod983a0613_198a_42b5_9c35_64b2a2869e6c.slice. 
Sep 4 15:50:43.030924 containerd[1644]: time="2025-09-04T15:50:43.030905630Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-x2d7v,Uid:983a0613-198a-42b5-9c35-64b2a2869e6c,Namespace:calico-system,Attempt:0,}" Sep 4 15:50:43.083762 containerd[1644]: time="2025-09-04T15:50:43.083731299Z" level=error msg="Failed to destroy network for sandbox \"a315dcca74daaef040edb8f97861245dbd4bf95e52d92dcb70ea3a522513e729\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:50:43.085826 systemd[1]: run-netns-cni\x2d77cf990f\x2d792a\x2de0c4\x2d4d7b\x2d375aadf23ff8.mount: Deactivated successfully. Sep 4 15:50:43.093705 containerd[1644]: time="2025-09-04T15:50:43.093617732Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-x2d7v,Uid:983a0613-198a-42b5-9c35-64b2a2869e6c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a315dcca74daaef040edb8f97861245dbd4bf95e52d92dcb70ea3a522513e729\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:50:43.094157 kubelet[2944]: E0904 15:50:43.093895 2944 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a315dcca74daaef040edb8f97861245dbd4bf95e52d92dcb70ea3a522513e729\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:50:43.094157 kubelet[2944]: E0904 15:50:43.093933 2944 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"a315dcca74daaef040edb8f97861245dbd4bf95e52d92dcb70ea3a522513e729\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-x2d7v" Sep 4 15:50:43.094157 kubelet[2944]: E0904 15:50:43.093948 2944 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a315dcca74daaef040edb8f97861245dbd4bf95e52d92dcb70ea3a522513e729\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-x2d7v" Sep 4 15:50:43.094393 kubelet[2944]: E0904 15:50:43.093977 2944 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-x2d7v_calico-system(983a0613-198a-42b5-9c35-64b2a2869e6c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-x2d7v_calico-system(983a0613-198a-42b5-9c35-64b2a2869e6c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a315dcca74daaef040edb8f97861245dbd4bf95e52d92dcb70ea3a522513e729\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-x2d7v" podUID="983a0613-198a-42b5-9c35-64b2a2869e6c" Sep 4 15:50:52.693814 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount832484894.mount: Deactivated successfully. 
Sep 4 15:50:53.019806 containerd[1644]: time="2025-09-04T15:50:53.019723236Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 4 15:50:53.036129 containerd[1644]: time="2025-09-04T15:50:53.036077225Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:50:53.055523 containerd[1644]: time="2025-09-04T15:50:53.055499389Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:50:53.063633 containerd[1644]: time="2025-09-04T15:50:53.063603274Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:50:53.065043 containerd[1644]: time="2025-09-04T15:50:53.065005115Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 11.738818891s" Sep 4 15:50:53.065043 containerd[1644]: time="2025-09-04T15:50:53.065025257Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 4 15:50:53.086589 containerd[1644]: time="2025-09-04T15:50:53.086278400Z" level=info msg="CreateContainer within sandbox \"e0730e2bff164b9d3ddd755678553c43e02bbe391c52907e738e1d048b32a383\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 4 15:50:53.136607 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount884937343.mount: Deactivated 
successfully. Sep 4 15:50:53.136874 containerd[1644]: time="2025-09-04T15:50:53.136850911Z" level=info msg="Container 0b5f35d9717b9ceb20c0cdac6114482b414ac97af9065c684cf33368584bf322: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:50:53.196780 containerd[1644]: time="2025-09-04T15:50:53.196747792Z" level=info msg="CreateContainer within sandbox \"e0730e2bff164b9d3ddd755678553c43e02bbe391c52907e738e1d048b32a383\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"0b5f35d9717b9ceb20c0cdac6114482b414ac97af9065c684cf33368584bf322\"" Sep 4 15:50:53.197340 containerd[1644]: time="2025-09-04T15:50:53.197324331Z" level=info msg="StartContainer for \"0b5f35d9717b9ceb20c0cdac6114482b414ac97af9065c684cf33368584bf322\"" Sep 4 15:50:53.202420 containerd[1644]: time="2025-09-04T15:50:53.202399788Z" level=info msg="connecting to shim 0b5f35d9717b9ceb20c0cdac6114482b414ac97af9065c684cf33368584bf322" address="unix:///run/containerd/s/b13b44512d017b7cb6da37d8a85f112dd9100662f4913ab5caff43dcb30a8824" protocol=ttrpc version=3 Sep 4 15:50:53.287363 systemd[1]: Started cri-containerd-0b5f35d9717b9ceb20c0cdac6114482b414ac97af9065c684cf33368584bf322.scope - libcontainer container 0b5f35d9717b9ceb20c0cdac6114482b414ac97af9065c684cf33368584bf322. Sep 4 15:50:53.318954 containerd[1644]: time="2025-09-04T15:50:53.318932970Z" level=info msg="StartContainer for \"0b5f35d9717b9ceb20c0cdac6114482b414ac97af9065c684cf33368584bf322\" returns successfully" Sep 4 15:50:53.452065 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 4 15:50:53.457144 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Sep 4 15:50:53.524957 containerd[1644]: time="2025-09-04T15:50:53.524779947Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0b5f35d9717b9ceb20c0cdac6114482b414ac97af9065c684cf33368584bf322\" id:\"5ee26bd2a15e8d1cafb89b2fe9b1796d02981a9a51b80bd0f6541227350aa361\" pid:4051 exit_status:1 exited_at:{seconds:1757001053 nanos:524216928}" Sep 4 15:50:53.615024 containerd[1644]: time="2025-09-04T15:50:53.614757972Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0b5f35d9717b9ceb20c0cdac6114482b414ac97af9065c684cf33368584bf322\" id:\"23ba89bb8e407ce7dad2b8d1e95e75b18784d38bad8a21353b5b4fd7ce9f6050\" pid:4077 exit_status:1 exited_at:{seconds:1757001053 nanos:614540838}" Sep 4 15:50:53.705511 kubelet[2944]: I0904 15:50:53.705331 2944 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-dvtfk" podStartSLOduration=1.800342079 podStartE2EDuration="38.705312835s" podCreationTimestamp="2025-09-04 15:50:15 +0000 UTC" firstStartedPulling="2025-09-04 15:50:16.162037093 +0000 UTC m=+18.226416091" lastFinishedPulling="2025-09-04 15:50:53.067007853 +0000 UTC m=+55.131386847" observedRunningTime="2025-09-04 15:50:53.368217685 +0000 UTC m=+55.432596687" watchObservedRunningTime="2025-09-04 15:50:53.705312835 +0000 UTC m=+55.769691840" Sep 4 15:50:53.815091 kubelet[2944]: I0904 15:50:53.814957 2944 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f699e642-1b05-433e-bd2f-a664a37b4e72-whisker-ca-bundle\") pod \"f699e642-1b05-433e-bd2f-a664a37b4e72\" (UID: \"f699e642-1b05-433e-bd2f-a664a37b4e72\") " Sep 4 15:50:53.815424 kubelet[2944]: I0904 15:50:53.815315 2944 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/f699e642-1b05-433e-bd2f-a664a37b4e72-whisker-backend-key-pair\") pod \"f699e642-1b05-433e-bd2f-a664a37b4e72\" (UID: 
\"f699e642-1b05-433e-bd2f-a664a37b4e72\") " Sep 4 15:50:53.815424 kubelet[2944]: I0904 15:50:53.815338 2944 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzx7g\" (UniqueName: \"kubernetes.io/projected/f699e642-1b05-433e-bd2f-a664a37b4e72-kube-api-access-tzx7g\") pod \"f699e642-1b05-433e-bd2f-a664a37b4e72\" (UID: \"f699e642-1b05-433e-bd2f-a664a37b4e72\") " Sep 4 15:50:53.825674 kubelet[2944]: I0904 15:50:53.825206 2944 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f699e642-1b05-433e-bd2f-a664a37b4e72-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "f699e642-1b05-433e-bd2f-a664a37b4e72" (UID: "f699e642-1b05-433e-bd2f-a664a37b4e72"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 4 15:50:53.828856 systemd[1]: var-lib-kubelet-pods-f699e642\x2d1b05\x2d433e\x2dbd2f\x2da664a37b4e72-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dtzx7g.mount: Deactivated successfully. Sep 4 15:50:53.832564 kubelet[2944]: I0904 15:50:53.832019 2944 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f699e642-1b05-433e-bd2f-a664a37b4e72-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "f699e642-1b05-433e-bd2f-a664a37b4e72" (UID: "f699e642-1b05-433e-bd2f-a664a37b4e72"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 4 15:50:53.832844 systemd[1]: var-lib-kubelet-pods-f699e642\x2d1b05\x2d433e\x2dbd2f\x2da664a37b4e72-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Sep 4 15:50:53.833991 kubelet[2944]: I0904 15:50:53.833652 2944 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f699e642-1b05-433e-bd2f-a664a37b4e72-kube-api-access-tzx7g" (OuterVolumeSpecName: "kube-api-access-tzx7g") pod "f699e642-1b05-433e-bd2f-a664a37b4e72" (UID: "f699e642-1b05-433e-bd2f-a664a37b4e72"). InnerVolumeSpecName "kube-api-access-tzx7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 4 15:50:53.916408 kubelet[2944]: I0904 15:50:53.916378 2944 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/f699e642-1b05-433e-bd2f-a664a37b4e72-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Sep 4 15:50:53.916498 kubelet[2944]: I0904 15:50:53.916405 2944 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzx7g\" (UniqueName: \"kubernetes.io/projected/f699e642-1b05-433e-bd2f-a664a37b4e72-kube-api-access-tzx7g\") on node \"localhost\" DevicePath \"\"" Sep 4 15:50:53.916498 kubelet[2944]: I0904 15:50:53.916435 2944 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f699e642-1b05-433e-bd2f-a664a37b4e72-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Sep 4 15:50:54.028223 containerd[1644]: time="2025-09-04T15:50:54.028187894Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-s5h4d,Uid:6e0b6dd2-4330-4e52-b6e5-a0f1fbd16ac7,Namespace:kube-system,Attempt:0,}" Sep 4 15:50:54.035672 systemd[1]: Removed slice kubepods-besteffort-podf699e642_1b05_433e_bd2f_a664a37b4e72.slice - libcontainer container kubepods-besteffort-podf699e642_1b05_433e_bd2f_a664a37b4e72.slice. Sep 4 15:50:54.433931 systemd[1]: Created slice kubepods-besteffort-podefb27c17_92cb_4aca_a588_a601b24c7512.slice - libcontainer container kubepods-besteffort-podefb27c17_92cb_4aca_a588_a601b24c7512.slice. 
Sep 4 15:50:54.470440 containerd[1644]: time="2025-09-04T15:50:54.470411167Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0b5f35d9717b9ceb20c0cdac6114482b414ac97af9065c684cf33368584bf322\" id:\"8c0e0bd9e9afb1e9475b9fa66cd035f49c4353fbc0f296160eeb29607d5474a5\" pid:4138 exit_status:1 exited_at:{seconds:1757001054 nanos:470158596}" Sep 4 15:50:54.520075 kubelet[2944]: I0904 15:50:54.520031 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efb27c17-92cb-4aca-a588-a601b24c7512-whisker-ca-bundle\") pod \"whisker-5ff586f7f7-s9x8z\" (UID: \"efb27c17-92cb-4aca-a588-a601b24c7512\") " pod="calico-system/whisker-5ff586f7f7-s9x8z" Sep 4 15:50:54.520075 kubelet[2944]: I0904 15:50:54.520059 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chfl4\" (UniqueName: \"kubernetes.io/projected/efb27c17-92cb-4aca-a588-a601b24c7512-kube-api-access-chfl4\") pod \"whisker-5ff586f7f7-s9x8z\" (UID: \"efb27c17-92cb-4aca-a588-a601b24c7512\") " pod="calico-system/whisker-5ff586f7f7-s9x8z" Sep 4 15:50:54.520075 kubelet[2944]: I0904 15:50:54.520075 2944 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/efb27c17-92cb-4aca-a588-a601b24c7512-whisker-backend-key-pair\") pod \"whisker-5ff586f7f7-s9x8z\" (UID: \"efb27c17-92cb-4aca-a588-a601b24c7512\") " pod="calico-system/whisker-5ff586f7f7-s9x8z" Sep 4 15:50:54.529652 systemd-networkd[1532]: cali4671790e931: Link UP Sep 4 15:50:54.529788 systemd-networkd[1532]: cali4671790e931: Gained carrier Sep 4 15:50:54.538576 containerd[1644]: 2025-09-04 15:50:54.063 [INFO][4110] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 4 15:50:54.538576 containerd[1644]: 2025-09-04 15:50:54.098 [INFO][4110] cni-plugin/plugin.go 340: Calico CNI 
found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7c65d6cfc9--s5h4d-eth0 coredns-7c65d6cfc9- kube-system 6e0b6dd2-4330-4e52-b6e5-a0f1fbd16ac7 837 0 2025-09-04 15:50:03 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7c65d6cfc9-s5h4d eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali4671790e931 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="69d10f09c3fb28e81e80ea44c354cdace19b7293bb566fd55c413e9984a652b4" Namespace="kube-system" Pod="coredns-7c65d6cfc9-s5h4d" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--s5h4d-" Sep 4 15:50:54.538576 containerd[1644]: 2025-09-04 15:50:54.099 [INFO][4110] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="69d10f09c3fb28e81e80ea44c354cdace19b7293bb566fd55c413e9984a652b4" Namespace="kube-system" Pod="coredns-7c65d6cfc9-s5h4d" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--s5h4d-eth0" Sep 4 15:50:54.538576 containerd[1644]: 2025-09-04 15:50:54.483 [INFO][4121] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="69d10f09c3fb28e81e80ea44c354cdace19b7293bb566fd55c413e9984a652b4" HandleID="k8s-pod-network.69d10f09c3fb28e81e80ea44c354cdace19b7293bb566fd55c413e9984a652b4" Workload="localhost-k8s-coredns--7c65d6cfc9--s5h4d-eth0" Sep 4 15:50:54.540865 containerd[1644]: 2025-09-04 15:50:54.485 [INFO][4121] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="69d10f09c3fb28e81e80ea44c354cdace19b7293bb566fd55c413e9984a652b4" HandleID="k8s-pod-network.69d10f09c3fb28e81e80ea44c354cdace19b7293bb566fd55c413e9984a652b4" Workload="localhost-k8s-coredns--7c65d6cfc9--s5h4d-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00040c1d0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", 
"pod":"coredns-7c65d6cfc9-s5h4d", "timestamp":"2025-09-04 15:50:54.483143592 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 15:50:54.540865 containerd[1644]: 2025-09-04 15:50:54.485 [INFO][4121] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 15:50:54.540865 containerd[1644]: 2025-09-04 15:50:54.485 [INFO][4121] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 4 15:50:54.540865 containerd[1644]: 2025-09-04 15:50:54.486 [INFO][4121] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 15:50:54.540865 containerd[1644]: 2025-09-04 15:50:54.498 [INFO][4121] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.69d10f09c3fb28e81e80ea44c354cdace19b7293bb566fd55c413e9984a652b4" host="localhost" Sep 4 15:50:54.540865 containerd[1644]: 2025-09-04 15:50:54.507 [INFO][4121] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 15:50:54.540865 containerd[1644]: 2025-09-04 15:50:54.509 [INFO][4121] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 15:50:54.540865 containerd[1644]: 2025-09-04 15:50:54.511 [INFO][4121] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 15:50:54.540865 containerd[1644]: 2025-09-04 15:50:54.512 [INFO][4121] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 15:50:54.540865 containerd[1644]: 2025-09-04 15:50:54.512 [INFO][4121] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.69d10f09c3fb28e81e80ea44c354cdace19b7293bb566fd55c413e9984a652b4" host="localhost" Sep 4 15:50:54.541060 containerd[1644]: 2025-09-04 15:50:54.512 [INFO][4121] ipam/ipam.go 1764: Creating new 
handle: k8s-pod-network.69d10f09c3fb28e81e80ea44c354cdace19b7293bb566fd55c413e9984a652b4 Sep 4 15:50:54.541060 containerd[1644]: 2025-09-04 15:50:54.514 [INFO][4121] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.69d10f09c3fb28e81e80ea44c354cdace19b7293bb566fd55c413e9984a652b4" host="localhost" Sep 4 15:50:54.541060 containerd[1644]: 2025-09-04 15:50:54.517 [INFO][4121] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.69d10f09c3fb28e81e80ea44c354cdace19b7293bb566fd55c413e9984a652b4" host="localhost" Sep 4 15:50:54.541060 containerd[1644]: 2025-09-04 15:50:54.518 [INFO][4121] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.69d10f09c3fb28e81e80ea44c354cdace19b7293bb566fd55c413e9984a652b4" host="localhost" Sep 4 15:50:54.541060 containerd[1644]: 2025-09-04 15:50:54.518 [INFO][4121] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 4 15:50:54.541060 containerd[1644]: 2025-09-04 15:50:54.518 [INFO][4121] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="69d10f09c3fb28e81e80ea44c354cdace19b7293bb566fd55c413e9984a652b4" HandleID="k8s-pod-network.69d10f09c3fb28e81e80ea44c354cdace19b7293bb566fd55c413e9984a652b4" Workload="localhost-k8s-coredns--7c65d6cfc9--s5h4d-eth0" Sep 4 15:50:54.541630 containerd[1644]: 2025-09-04 15:50:54.520 [INFO][4110] cni-plugin/k8s.go 418: Populated endpoint ContainerID="69d10f09c3fb28e81e80ea44c354cdace19b7293bb566fd55c413e9984a652b4" Namespace="kube-system" Pod="coredns-7c65d6cfc9-s5h4d" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--s5h4d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--s5h4d-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"6e0b6dd2-4330-4e52-b6e5-a0f1fbd16ac7", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 50, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7c65d6cfc9-s5h4d", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4671790e931", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:50:54.541705 containerd[1644]: 2025-09-04 15:50:54.520 [INFO][4110] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="69d10f09c3fb28e81e80ea44c354cdace19b7293bb566fd55c413e9984a652b4" Namespace="kube-system" Pod="coredns-7c65d6cfc9-s5h4d" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--s5h4d-eth0" Sep 4 15:50:54.541705 containerd[1644]: 2025-09-04 15:50:54.520 [INFO][4110] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4671790e931 ContainerID="69d10f09c3fb28e81e80ea44c354cdace19b7293bb566fd55c413e9984a652b4" Namespace="kube-system" Pod="coredns-7c65d6cfc9-s5h4d" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--s5h4d-eth0" Sep 4 15:50:54.541705 containerd[1644]: 2025-09-04 15:50:54.530 [INFO][4110] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="69d10f09c3fb28e81e80ea44c354cdace19b7293bb566fd55c413e9984a652b4" Namespace="kube-system" Pod="coredns-7c65d6cfc9-s5h4d" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--s5h4d-eth0" Sep 4 15:50:54.542788 containerd[1644]: 2025-09-04 15:50:54.530 [INFO][4110] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="69d10f09c3fb28e81e80ea44c354cdace19b7293bb566fd55c413e9984a652b4" Namespace="kube-system" Pod="coredns-7c65d6cfc9-s5h4d" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--s5h4d-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--s5h4d-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"6e0b6dd2-4330-4e52-b6e5-a0f1fbd16ac7", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 50, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"69d10f09c3fb28e81e80ea44c354cdace19b7293bb566fd55c413e9984a652b4", Pod:"coredns-7c65d6cfc9-s5h4d", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4671790e931", MAC:"0e:80:b7:d4:7e:e5", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:50:54.542788 containerd[1644]: 2025-09-04 15:50:54.537 [INFO][4110] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="69d10f09c3fb28e81e80ea44c354cdace19b7293bb566fd55c413e9984a652b4" Namespace="kube-system" Pod="coredns-7c65d6cfc9-s5h4d" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--s5h4d-eth0" Sep 4 15:50:54.626111 containerd[1644]: time="2025-09-04T15:50:54.625785369Z" level=info msg="connecting to shim 69d10f09c3fb28e81e80ea44c354cdace19b7293bb566fd55c413e9984a652b4" address="unix:///run/containerd/s/935ed1cf1b80354d1b6ea05ed763ae60f3ee11c6a6c6d6085948d613c0512bb6" namespace=k8s.io protocol=ttrpc version=3 Sep 4 15:50:54.646355 systemd[1]: Started cri-containerd-69d10f09c3fb28e81e80ea44c354cdace19b7293bb566fd55c413e9984a652b4.scope - libcontainer container 69d10f09c3fb28e81e80ea44c354cdace19b7293bb566fd55c413e9984a652b4. Sep 4 15:50:54.654543 systemd-resolved[1533]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 15:50:54.680956 containerd[1644]: time="2025-09-04T15:50:54.680921920Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-s5h4d,Uid:6e0b6dd2-4330-4e52-b6e5-a0f1fbd16ac7,Namespace:kube-system,Attempt:0,} returns sandbox id \"69d10f09c3fb28e81e80ea44c354cdace19b7293bb566fd55c413e9984a652b4\"" Sep 4 15:50:54.682742 containerd[1644]: time="2025-09-04T15:50:54.682693964Z" level=info msg="CreateContainer within sandbox \"69d10f09c3fb28e81e80ea44c354cdace19b7293bb566fd55c413e9984a652b4\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 4 15:50:54.701194 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3537015625.mount: Deactivated successfully. 
Sep 4 15:50:54.701817 containerd[1644]: time="2025-09-04T15:50:54.701757315Z" level=info msg="Container ac11660d3fbca971a77340816d62a94496ed97dd60cc0343688320e196f64255: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:50:54.705693 containerd[1644]: time="2025-09-04T15:50:54.705675145Z" level=info msg="CreateContainer within sandbox \"69d10f09c3fb28e81e80ea44c354cdace19b7293bb566fd55c413e9984a652b4\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"ac11660d3fbca971a77340816d62a94496ed97dd60cc0343688320e196f64255\"" Sep 4 15:50:54.706676 containerd[1644]: time="2025-09-04T15:50:54.706161167Z" level=info msg="StartContainer for \"ac11660d3fbca971a77340816d62a94496ed97dd60cc0343688320e196f64255\"" Sep 4 15:50:54.706676 containerd[1644]: time="2025-09-04T15:50:54.706566787Z" level=info msg="connecting to shim ac11660d3fbca971a77340816d62a94496ed97dd60cc0343688320e196f64255" address="unix:///run/containerd/s/935ed1cf1b80354d1b6ea05ed763ae60f3ee11c6a6c6d6085948d613c0512bb6" protocol=ttrpc version=3 Sep 4 15:50:54.720194 systemd[1]: Started cri-containerd-ac11660d3fbca971a77340816d62a94496ed97dd60cc0343688320e196f64255.scope - libcontainer container ac11660d3fbca971a77340816d62a94496ed97dd60cc0343688320e196f64255. 
Sep 4 15:50:54.738067 containerd[1644]: time="2025-09-04T15:50:54.738033676Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5ff586f7f7-s9x8z,Uid:efb27c17-92cb-4aca-a588-a601b24c7512,Namespace:calico-system,Attempt:0,}" Sep 4 15:50:54.741365 containerd[1644]: time="2025-09-04T15:50:54.741348352Z" level=info msg="StartContainer for \"ac11660d3fbca971a77340816d62a94496ed97dd60cc0343688320e196f64255\" returns successfully" Sep 4 15:50:54.804353 systemd-networkd[1532]: cali214b238ae73: Link UP Sep 4 15:50:54.804471 systemd-networkd[1532]: cali214b238ae73: Gained carrier Sep 4 15:50:54.813654 containerd[1644]: 2025-09-04 15:50:54.756 [INFO][4232] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 4 15:50:54.813654 containerd[1644]: 2025-09-04 15:50:54.762 [INFO][4232] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--5ff586f7f7--s9x8z-eth0 whisker-5ff586f7f7- calico-system efb27c17-92cb-4aca-a588-a601b24c7512 913 0 2025-09-04 15:50:54 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:5ff586f7f7 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-5ff586f7f7-s9x8z eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali214b238ae73 [] [] }} ContainerID="2d7fad9ddad72a61de9ffcf40da04b33ae35ff3c02c9456f3fd10755e713851b" Namespace="calico-system" Pod="whisker-5ff586f7f7-s9x8z" WorkloadEndpoint="localhost-k8s-whisker--5ff586f7f7--s9x8z-" Sep 4 15:50:54.813654 containerd[1644]: 2025-09-04 15:50:54.762 [INFO][4232] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2d7fad9ddad72a61de9ffcf40da04b33ae35ff3c02c9456f3fd10755e713851b" Namespace="calico-system" Pod="whisker-5ff586f7f7-s9x8z" WorkloadEndpoint="localhost-k8s-whisker--5ff586f7f7--s9x8z-eth0" Sep 4 15:50:54.813654 containerd[1644]: 2025-09-04 
15:50:54.781 [INFO][4243] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2d7fad9ddad72a61de9ffcf40da04b33ae35ff3c02c9456f3fd10755e713851b" HandleID="k8s-pod-network.2d7fad9ddad72a61de9ffcf40da04b33ae35ff3c02c9456f3fd10755e713851b" Workload="localhost-k8s-whisker--5ff586f7f7--s9x8z-eth0" Sep 4 15:50:54.813654 containerd[1644]: 2025-09-04 15:50:54.781 [INFO][4243] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2d7fad9ddad72a61de9ffcf40da04b33ae35ff3c02c9456f3fd10755e713851b" HandleID="k8s-pod-network.2d7fad9ddad72a61de9ffcf40da04b33ae35ff3c02c9456f3fd10755e713851b" Workload="localhost-k8s-whisker--5ff586f7f7--s9x8z-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5600), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-5ff586f7f7-s9x8z", "timestamp":"2025-09-04 15:50:54.78128439 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 15:50:54.813654 containerd[1644]: 2025-09-04 15:50:54.781 [INFO][4243] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 15:50:54.813654 containerd[1644]: 2025-09-04 15:50:54.781 [INFO][4243] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 4 15:50:54.813654 containerd[1644]: 2025-09-04 15:50:54.781 [INFO][4243] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 15:50:54.813654 containerd[1644]: 2025-09-04 15:50:54.785 [INFO][4243] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2d7fad9ddad72a61de9ffcf40da04b33ae35ff3c02c9456f3fd10755e713851b" host="localhost" Sep 4 15:50:54.813654 containerd[1644]: 2025-09-04 15:50:54.788 [INFO][4243] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 15:50:54.813654 containerd[1644]: 2025-09-04 15:50:54.791 [INFO][4243] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 15:50:54.813654 containerd[1644]: 2025-09-04 15:50:54.792 [INFO][4243] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 15:50:54.813654 containerd[1644]: 2025-09-04 15:50:54.793 [INFO][4243] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 15:50:54.813654 containerd[1644]: 2025-09-04 15:50:54.793 [INFO][4243] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.2d7fad9ddad72a61de9ffcf40da04b33ae35ff3c02c9456f3fd10755e713851b" host="localhost" Sep 4 15:50:54.813654 containerd[1644]: 2025-09-04 15:50:54.794 [INFO][4243] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.2d7fad9ddad72a61de9ffcf40da04b33ae35ff3c02c9456f3fd10755e713851b Sep 4 15:50:54.813654 containerd[1644]: 2025-09-04 15:50:54.796 [INFO][4243] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.2d7fad9ddad72a61de9ffcf40da04b33ae35ff3c02c9456f3fd10755e713851b" host="localhost" Sep 4 15:50:54.813654 containerd[1644]: 2025-09-04 15:50:54.798 [INFO][4243] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 
handle="k8s-pod-network.2d7fad9ddad72a61de9ffcf40da04b33ae35ff3c02c9456f3fd10755e713851b" host="localhost" Sep 4 15:50:54.813654 containerd[1644]: 2025-09-04 15:50:54.798 [INFO][4243] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.2d7fad9ddad72a61de9ffcf40da04b33ae35ff3c02c9456f3fd10755e713851b" host="localhost" Sep 4 15:50:54.813654 containerd[1644]: 2025-09-04 15:50:54.798 [INFO][4243] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 4 15:50:54.813654 containerd[1644]: 2025-09-04 15:50:54.798 [INFO][4243] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="2d7fad9ddad72a61de9ffcf40da04b33ae35ff3c02c9456f3fd10755e713851b" HandleID="k8s-pod-network.2d7fad9ddad72a61de9ffcf40da04b33ae35ff3c02c9456f3fd10755e713851b" Workload="localhost-k8s-whisker--5ff586f7f7--s9x8z-eth0" Sep 4 15:50:54.815765 containerd[1644]: 2025-09-04 15:50:54.801 [INFO][4232] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2d7fad9ddad72a61de9ffcf40da04b33ae35ff3c02c9456f3fd10755e713851b" Namespace="calico-system" Pod="whisker-5ff586f7f7-s9x8z" WorkloadEndpoint="localhost-k8s-whisker--5ff586f7f7--s9x8z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--5ff586f7f7--s9x8z-eth0", GenerateName:"whisker-5ff586f7f7-", Namespace:"calico-system", SelfLink:"", UID:"efb27c17-92cb-4aca-a588-a601b24c7512", ResourceVersion:"913", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 50, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5ff586f7f7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-5ff586f7f7-s9x8z", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali214b238ae73", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:50:54.815765 containerd[1644]: 2025-09-04 15:50:54.801 [INFO][4232] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="2d7fad9ddad72a61de9ffcf40da04b33ae35ff3c02c9456f3fd10755e713851b" Namespace="calico-system" Pod="whisker-5ff586f7f7-s9x8z" WorkloadEndpoint="localhost-k8s-whisker--5ff586f7f7--s9x8z-eth0" Sep 4 15:50:54.815765 containerd[1644]: 2025-09-04 15:50:54.801 [INFO][4232] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali214b238ae73 ContainerID="2d7fad9ddad72a61de9ffcf40da04b33ae35ff3c02c9456f3fd10755e713851b" Namespace="calico-system" Pod="whisker-5ff586f7f7-s9x8z" WorkloadEndpoint="localhost-k8s-whisker--5ff586f7f7--s9x8z-eth0" Sep 4 15:50:54.815765 containerd[1644]: 2025-09-04 15:50:54.804 [INFO][4232] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2d7fad9ddad72a61de9ffcf40da04b33ae35ff3c02c9456f3fd10755e713851b" Namespace="calico-system" Pod="whisker-5ff586f7f7-s9x8z" WorkloadEndpoint="localhost-k8s-whisker--5ff586f7f7--s9x8z-eth0" Sep 4 15:50:54.815765 containerd[1644]: 2025-09-04 15:50:54.805 [INFO][4232] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2d7fad9ddad72a61de9ffcf40da04b33ae35ff3c02c9456f3fd10755e713851b" Namespace="calico-system" Pod="whisker-5ff586f7f7-s9x8z" 
WorkloadEndpoint="localhost-k8s-whisker--5ff586f7f7--s9x8z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--5ff586f7f7--s9x8z-eth0", GenerateName:"whisker-5ff586f7f7-", Namespace:"calico-system", SelfLink:"", UID:"efb27c17-92cb-4aca-a588-a601b24c7512", ResourceVersion:"913", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 50, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5ff586f7f7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"2d7fad9ddad72a61de9ffcf40da04b33ae35ff3c02c9456f3fd10755e713851b", Pod:"whisker-5ff586f7f7-s9x8z", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali214b238ae73", MAC:"5e:02:90:6f:e3:ee", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:50:54.815765 containerd[1644]: 2025-09-04 15:50:54.812 [INFO][4232] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2d7fad9ddad72a61de9ffcf40da04b33ae35ff3c02c9456f3fd10755e713851b" Namespace="calico-system" Pod="whisker-5ff586f7f7-s9x8z" WorkloadEndpoint="localhost-k8s-whisker--5ff586f7f7--s9x8z-eth0" Sep 4 15:50:54.827566 containerd[1644]: time="2025-09-04T15:50:54.827535260Z" level=info msg="connecting to shim 
2d7fad9ddad72a61de9ffcf40da04b33ae35ff3c02c9456f3fd10755e713851b" address="unix:///run/containerd/s/9e9a45ba2e00da8a60d953b45b8be0ce494795ca784a11ff7b2396e004f6e357" namespace=k8s.io protocol=ttrpc version=3 Sep 4 15:50:54.843197 systemd[1]: Started cri-containerd-2d7fad9ddad72a61de9ffcf40da04b33ae35ff3c02c9456f3fd10755e713851b.scope - libcontainer container 2d7fad9ddad72a61de9ffcf40da04b33ae35ff3c02c9456f3fd10755e713851b. Sep 4 15:50:54.851445 systemd-resolved[1533]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 15:50:54.890791 containerd[1644]: time="2025-09-04T15:50:54.890766039Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5ff586f7f7-s9x8z,Uid:efb27c17-92cb-4aca-a588-a601b24c7512,Namespace:calico-system,Attempt:0,} returns sandbox id \"2d7fad9ddad72a61de9ffcf40da04b33ae35ff3c02c9456f3fd10755e713851b\"" Sep 4 15:50:54.891929 containerd[1644]: time="2025-09-04T15:50:54.891895366Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 4 15:50:55.026563 containerd[1644]: time="2025-09-04T15:50:55.026183252Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7454479f5c-2sn5z,Uid:008bcc6d-c796-4431-bff8-3bdfe6b21bb6,Namespace:calico-system,Attempt:0,}" Sep 4 15:50:55.027639 containerd[1644]: time="2025-09-04T15:50:55.027553697Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-l6fg8,Uid:501a1bfe-524e-4e97-9ddd-148de8480fe6,Namespace:kube-system,Attempt:0,}" Sep 4 15:50:55.027675 containerd[1644]: time="2025-09-04T15:50:55.027653745Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-d4584f895-xqpkf,Uid:058617f0-a421-4b0b-af75-94fdfe834c56,Namespace:calico-apiserver,Attempt:0,}" Sep 4 15:50:55.179950 systemd-networkd[1532]: cali9f8cc8fdcf5: Link UP Sep 4 15:50:55.180590 systemd-networkd[1532]: cali9f8cc8fdcf5: Gained carrier Sep 4 15:50:55.191053 containerd[1644]: 2025-09-04 
15:50:55.116 [INFO][4395] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 4 15:50:55.191053 containerd[1644]: 2025-09-04 15:50:55.126 [INFO][4395] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--7454479f5c--2sn5z-eth0 calico-kube-controllers-7454479f5c- calico-system 008bcc6d-c796-4431-bff8-3bdfe6b21bb6 831 0 2025-09-04 15:50:15 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7454479f5c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-7454479f5c-2sn5z eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali9f8cc8fdcf5 [] [] }} ContainerID="ab097e7844e4125fc20840ede5639b4c2abd82e6996180eeca5af451d0ccd16b" Namespace="calico-system" Pod="calico-kube-controllers-7454479f5c-2sn5z" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7454479f5c--2sn5z-" Sep 4 15:50:55.191053 containerd[1644]: 2025-09-04 15:50:55.126 [INFO][4395] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ab097e7844e4125fc20840ede5639b4c2abd82e6996180eeca5af451d0ccd16b" Namespace="calico-system" Pod="calico-kube-controllers-7454479f5c-2sn5z" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7454479f5c--2sn5z-eth0" Sep 4 15:50:55.191053 containerd[1644]: 2025-09-04 15:50:55.151 [INFO][4411] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ab097e7844e4125fc20840ede5639b4c2abd82e6996180eeca5af451d0ccd16b" HandleID="k8s-pod-network.ab097e7844e4125fc20840ede5639b4c2abd82e6996180eeca5af451d0ccd16b" Workload="localhost-k8s-calico--kube--controllers--7454479f5c--2sn5z-eth0" Sep 4 15:50:55.191053 containerd[1644]: 2025-09-04 15:50:55.152 [INFO][4411] ipam/ipam_plugin.go 265: 
Auto assigning IP ContainerID="ab097e7844e4125fc20840ede5639b4c2abd82e6996180eeca5af451d0ccd16b" HandleID="k8s-pod-network.ab097e7844e4125fc20840ede5639b4c2abd82e6996180eeca5af451d0ccd16b" Workload="localhost-k8s-calico--kube--controllers--7454479f5c--2sn5z-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ac900), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-7454479f5c-2sn5z", "timestamp":"2025-09-04 15:50:55.151974049 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 15:50:55.191053 containerd[1644]: 2025-09-04 15:50:55.152 [INFO][4411] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 15:50:55.191053 containerd[1644]: 2025-09-04 15:50:55.152 [INFO][4411] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 4 15:50:55.191053 containerd[1644]: 2025-09-04 15:50:55.152 [INFO][4411] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 15:50:55.191053 containerd[1644]: 2025-09-04 15:50:55.156 [INFO][4411] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ab097e7844e4125fc20840ede5639b4c2abd82e6996180eeca5af451d0ccd16b" host="localhost" Sep 4 15:50:55.191053 containerd[1644]: 2025-09-04 15:50:55.159 [INFO][4411] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 15:50:55.191053 containerd[1644]: 2025-09-04 15:50:55.162 [INFO][4411] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 15:50:55.191053 containerd[1644]: 2025-09-04 15:50:55.163 [INFO][4411] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 15:50:55.191053 containerd[1644]: 2025-09-04 15:50:55.165 [INFO][4411] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 15:50:55.191053 containerd[1644]: 2025-09-04 15:50:55.165 [INFO][4411] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.ab097e7844e4125fc20840ede5639b4c2abd82e6996180eeca5af451d0ccd16b" host="localhost" Sep 4 15:50:55.191053 containerd[1644]: 2025-09-04 15:50:55.166 [INFO][4411] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.ab097e7844e4125fc20840ede5639b4c2abd82e6996180eeca5af451d0ccd16b Sep 4 15:50:55.191053 containerd[1644]: 2025-09-04 15:50:55.170 [INFO][4411] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.ab097e7844e4125fc20840ede5639b4c2abd82e6996180eeca5af451d0ccd16b" host="localhost" Sep 4 15:50:55.191053 containerd[1644]: 2025-09-04 15:50:55.174 [INFO][4411] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 
handle="k8s-pod-network.ab097e7844e4125fc20840ede5639b4c2abd82e6996180eeca5af451d0ccd16b" host="localhost" Sep 4 15:50:55.191053 containerd[1644]: 2025-09-04 15:50:55.174 [INFO][4411] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.ab097e7844e4125fc20840ede5639b4c2abd82e6996180eeca5af451d0ccd16b" host="localhost" Sep 4 15:50:55.191053 containerd[1644]: 2025-09-04 15:50:55.174 [INFO][4411] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 4 15:50:55.191053 containerd[1644]: 2025-09-04 15:50:55.174 [INFO][4411] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="ab097e7844e4125fc20840ede5639b4c2abd82e6996180eeca5af451d0ccd16b" HandleID="k8s-pod-network.ab097e7844e4125fc20840ede5639b4c2abd82e6996180eeca5af451d0ccd16b" Workload="localhost-k8s-calico--kube--controllers--7454479f5c--2sn5z-eth0" Sep 4 15:50:55.195142 containerd[1644]: 2025-09-04 15:50:55.176 [INFO][4395] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ab097e7844e4125fc20840ede5639b4c2abd82e6996180eeca5af451d0ccd16b" Namespace="calico-system" Pod="calico-kube-controllers-7454479f5c-2sn5z" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7454479f5c--2sn5z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--7454479f5c--2sn5z-eth0", GenerateName:"calico-kube-controllers-7454479f5c-", Namespace:"calico-system", SelfLink:"", UID:"008bcc6d-c796-4431-bff8-3bdfe6b21bb6", ResourceVersion:"831", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 50, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7454479f5c", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-7454479f5c-2sn5z", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali9f8cc8fdcf5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:50:55.195142 containerd[1644]: 2025-09-04 15:50:55.176 [INFO][4395] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="ab097e7844e4125fc20840ede5639b4c2abd82e6996180eeca5af451d0ccd16b" Namespace="calico-system" Pod="calico-kube-controllers-7454479f5c-2sn5z" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7454479f5c--2sn5z-eth0" Sep 4 15:50:55.195142 containerd[1644]: 2025-09-04 15:50:55.176 [INFO][4395] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9f8cc8fdcf5 ContainerID="ab097e7844e4125fc20840ede5639b4c2abd82e6996180eeca5af451d0ccd16b" Namespace="calico-system" Pod="calico-kube-controllers-7454479f5c-2sn5z" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7454479f5c--2sn5z-eth0" Sep 4 15:50:55.195142 containerd[1644]: 2025-09-04 15:50:55.180 [INFO][4395] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ab097e7844e4125fc20840ede5639b4c2abd82e6996180eeca5af451d0ccd16b" Namespace="calico-system" Pod="calico-kube-controllers-7454479f5c-2sn5z" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7454479f5c--2sn5z-eth0" Sep 4 15:50:55.195142 containerd[1644]: 2025-09-04 
15:50:55.181 [INFO][4395] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ab097e7844e4125fc20840ede5639b4c2abd82e6996180eeca5af451d0ccd16b" Namespace="calico-system" Pod="calico-kube-controllers-7454479f5c-2sn5z" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7454479f5c--2sn5z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--7454479f5c--2sn5z-eth0", GenerateName:"calico-kube-controllers-7454479f5c-", Namespace:"calico-system", SelfLink:"", UID:"008bcc6d-c796-4431-bff8-3bdfe6b21bb6", ResourceVersion:"831", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 50, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7454479f5c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"ab097e7844e4125fc20840ede5639b4c2abd82e6996180eeca5af451d0ccd16b", Pod:"calico-kube-controllers-7454479f5c-2sn5z", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali9f8cc8fdcf5", MAC:"36:be:bf:85:84:b9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:50:55.195142 containerd[1644]: 2025-09-04 
15:50:55.187 [INFO][4395] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ab097e7844e4125fc20840ede5639b4c2abd82e6996180eeca5af451d0ccd16b" Namespace="calico-system" Pod="calico-kube-controllers-7454479f5c-2sn5z" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7454479f5c--2sn5z-eth0" Sep 4 15:50:55.286630 containerd[1644]: time="2025-09-04T15:50:55.286422072Z" level=info msg="connecting to shim ab097e7844e4125fc20840ede5639b4c2abd82e6996180eeca5af451d0ccd16b" address="unix:///run/containerd/s/fc7ea7dd1a638dfb704a0555b241a7487966746c3d18932c1c01578e8317b8e6" namespace=k8s.io protocol=ttrpc version=3 Sep 4 15:50:55.339258 systemd[1]: Started cri-containerd-ab097e7844e4125fc20840ede5639b4c2abd82e6996180eeca5af451d0ccd16b.scope - libcontainer container ab097e7844e4125fc20840ede5639b4c2abd82e6996180eeca5af451d0ccd16b. Sep 4 15:50:55.365981 systemd-resolved[1533]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 15:50:55.410423 systemd-networkd[1532]: cali599bcf0997b: Link UP Sep 4 15:50:55.414158 systemd-networkd[1532]: cali599bcf0997b: Gained carrier Sep 4 15:50:55.433999 containerd[1644]: 2025-09-04 15:50:55.271 [INFO][4425] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 4 15:50:55.433999 containerd[1644]: 2025-09-04 15:50:55.293 [INFO][4425] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7c65d6cfc9--l6fg8-eth0 coredns-7c65d6cfc9- kube-system 501a1bfe-524e-4e97-9ddd-148de8480fe6 833 0 2025-09-04 15:50:03 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7c65d6cfc9-l6fg8 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali599bcf0997b [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} 
ContainerID="7c7b854b42ed276b7136c1bc0868640059dbfed6334d04f5fba808f43b362e2b" Namespace="kube-system" Pod="coredns-7c65d6cfc9-l6fg8" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--l6fg8-" Sep 4 15:50:55.433999 containerd[1644]: 2025-09-04 15:50:55.293 [INFO][4425] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7c7b854b42ed276b7136c1bc0868640059dbfed6334d04f5fba808f43b362e2b" Namespace="kube-system" Pod="coredns-7c65d6cfc9-l6fg8" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--l6fg8-eth0" Sep 4 15:50:55.433999 containerd[1644]: 2025-09-04 15:50:55.339 [INFO][4483] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7c7b854b42ed276b7136c1bc0868640059dbfed6334d04f5fba808f43b362e2b" HandleID="k8s-pod-network.7c7b854b42ed276b7136c1bc0868640059dbfed6334d04f5fba808f43b362e2b" Workload="localhost-k8s-coredns--7c65d6cfc9--l6fg8-eth0" Sep 4 15:50:55.433999 containerd[1644]: 2025-09-04 15:50:55.339 [INFO][4483] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7c7b854b42ed276b7136c1bc0868640059dbfed6334d04f5fba808f43b362e2b" HandleID="k8s-pod-network.7c7b854b42ed276b7136c1bc0868640059dbfed6334d04f5fba808f43b362e2b" Workload="localhost-k8s-coredns--7c65d6cfc9--l6fg8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d4ff0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7c65d6cfc9-l6fg8", "timestamp":"2025-09-04 15:50:55.339067222 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 15:50:55.433999 containerd[1644]: 2025-09-04 15:50:55.339 [INFO][4483] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 15:50:55.433999 containerd[1644]: 2025-09-04 15:50:55.340 [INFO][4483] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 4 15:50:55.433999 containerd[1644]: 2025-09-04 15:50:55.340 [INFO][4483] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 15:50:55.433999 containerd[1644]: 2025-09-04 15:50:55.347 [INFO][4483] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7c7b854b42ed276b7136c1bc0868640059dbfed6334d04f5fba808f43b362e2b" host="localhost" Sep 4 15:50:55.433999 containerd[1644]: 2025-09-04 15:50:55.354 [INFO][4483] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 15:50:55.433999 containerd[1644]: 2025-09-04 15:50:55.362 [INFO][4483] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 15:50:55.433999 containerd[1644]: 2025-09-04 15:50:55.364 [INFO][4483] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 15:50:55.433999 containerd[1644]: 2025-09-04 15:50:55.366 [INFO][4483] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 15:50:55.433999 containerd[1644]: 2025-09-04 15:50:55.366 [INFO][4483] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.7c7b854b42ed276b7136c1bc0868640059dbfed6334d04f5fba808f43b362e2b" host="localhost" Sep 4 15:50:55.433999 containerd[1644]: 2025-09-04 15:50:55.367 [INFO][4483] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.7c7b854b42ed276b7136c1bc0868640059dbfed6334d04f5fba808f43b362e2b Sep 4 15:50:55.433999 containerd[1644]: 2025-09-04 15:50:55.372 [INFO][4483] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.7c7b854b42ed276b7136c1bc0868640059dbfed6334d04f5fba808f43b362e2b" host="localhost" Sep 4 15:50:55.433999 containerd[1644]: 2025-09-04 15:50:55.387 [INFO][4483] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 
handle="k8s-pod-network.7c7b854b42ed276b7136c1bc0868640059dbfed6334d04f5fba808f43b362e2b" host="localhost" Sep 4 15:50:55.433999 containerd[1644]: 2025-09-04 15:50:55.387 [INFO][4483] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.7c7b854b42ed276b7136c1bc0868640059dbfed6334d04f5fba808f43b362e2b" host="localhost" Sep 4 15:50:55.433999 containerd[1644]: 2025-09-04 15:50:55.387 [INFO][4483] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 4 15:50:55.433999 containerd[1644]: 2025-09-04 15:50:55.388 [INFO][4483] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="7c7b854b42ed276b7136c1bc0868640059dbfed6334d04f5fba808f43b362e2b" HandleID="k8s-pod-network.7c7b854b42ed276b7136c1bc0868640059dbfed6334d04f5fba808f43b362e2b" Workload="localhost-k8s-coredns--7c65d6cfc9--l6fg8-eth0" Sep 4 15:50:55.438075 containerd[1644]: 2025-09-04 15:50:55.398 [INFO][4425] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7c7b854b42ed276b7136c1bc0868640059dbfed6334d04f5fba808f43b362e2b" Namespace="kube-system" Pod="coredns-7c65d6cfc9-l6fg8" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--l6fg8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--l6fg8-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"501a1bfe-524e-4e97-9ddd-148de8480fe6", ResourceVersion:"833", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 50, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7c65d6cfc9-l6fg8", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali599bcf0997b", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:50:55.438075 containerd[1644]: 2025-09-04 15:50:55.399 [INFO][4425] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="7c7b854b42ed276b7136c1bc0868640059dbfed6334d04f5fba808f43b362e2b" Namespace="kube-system" Pod="coredns-7c65d6cfc9-l6fg8" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--l6fg8-eth0" Sep 4 15:50:55.438075 containerd[1644]: 2025-09-04 15:50:55.399 [INFO][4425] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali599bcf0997b ContainerID="7c7b854b42ed276b7136c1bc0868640059dbfed6334d04f5fba808f43b362e2b" Namespace="kube-system" Pod="coredns-7c65d6cfc9-l6fg8" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--l6fg8-eth0" Sep 4 15:50:55.438075 containerd[1644]: 2025-09-04 15:50:55.413 [INFO][4425] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7c7b854b42ed276b7136c1bc0868640059dbfed6334d04f5fba808f43b362e2b" Namespace="kube-system" Pod="coredns-7c65d6cfc9-l6fg8" 
WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--l6fg8-eth0" Sep 4 15:50:55.438075 containerd[1644]: 2025-09-04 15:50:55.415 [INFO][4425] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7c7b854b42ed276b7136c1bc0868640059dbfed6334d04f5fba808f43b362e2b" Namespace="kube-system" Pod="coredns-7c65d6cfc9-l6fg8" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--l6fg8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--l6fg8-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"501a1bfe-524e-4e97-9ddd-148de8480fe6", ResourceVersion:"833", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 50, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"7c7b854b42ed276b7136c1bc0868640059dbfed6334d04f5fba808f43b362e2b", Pod:"coredns-7c65d6cfc9-l6fg8", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali599bcf0997b", MAC:"f6:67:08:46:5d:23", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:50:55.438075 containerd[1644]: 2025-09-04 15:50:55.427 [INFO][4425] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7c7b854b42ed276b7136c1bc0868640059dbfed6334d04f5fba808f43b362e2b" Namespace="kube-system" Pod="coredns-7c65d6cfc9-l6fg8" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--l6fg8-eth0" Sep 4 15:50:55.451529 containerd[1644]: time="2025-09-04T15:50:55.451021440Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7454479f5c-2sn5z,Uid:008bcc6d-c796-4431-bff8-3bdfe6b21bb6,Namespace:calico-system,Attempt:0,} returns sandbox id \"ab097e7844e4125fc20840ede5639b4c2abd82e6996180eeca5af451d0ccd16b\"" Sep 4 15:50:55.462919 containerd[1644]: time="2025-09-04T15:50:55.462885100Z" level=info msg="connecting to shim 7c7b854b42ed276b7136c1bc0868640059dbfed6334d04f5fba808f43b362e2b" address="unix:///run/containerd/s/4e141855db5eef8f01d58a5836f3376aae95fd3fee25ca4f4189ff32f2d56005" namespace=k8s.io protocol=ttrpc version=3 Sep 4 15:50:55.515541 systemd[1]: Started cri-containerd-7c7b854b42ed276b7136c1bc0868640059dbfed6334d04f5fba808f43b362e2b.scope - libcontainer container 7c7b854b42ed276b7136c1bc0868640059dbfed6334d04f5fba808f43b362e2b. 
Sep 4 15:50:55.515917 systemd-networkd[1532]: cali27068496d31: Link UP Sep 4 15:50:55.516741 systemd-networkd[1532]: cali27068496d31: Gained carrier Sep 4 15:50:55.529692 containerd[1644]: 2025-09-04 15:50:55.293 [INFO][4433] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 4 15:50:55.529692 containerd[1644]: 2025-09-04 15:50:55.308 [INFO][4433] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--d4584f895--xqpkf-eth0 calico-apiserver-d4584f895- calico-apiserver 058617f0-a421-4b0b-af75-94fdfe834c56 836 0 2025-09-04 15:50:13 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:d4584f895 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-d4584f895-xqpkf eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali27068496d31 [] [] }} ContainerID="73aed508c51d2df9f53f56e2a3e2e65561ceef814de0776edfa0fa134cf813f7" Namespace="calico-apiserver" Pod="calico-apiserver-d4584f895-xqpkf" WorkloadEndpoint="localhost-k8s-calico--apiserver--d4584f895--xqpkf-" Sep 4 15:50:55.529692 containerd[1644]: 2025-09-04 15:50:55.308 [INFO][4433] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="73aed508c51d2df9f53f56e2a3e2e65561ceef814de0776edfa0fa134cf813f7" Namespace="calico-apiserver" Pod="calico-apiserver-d4584f895-xqpkf" WorkloadEndpoint="localhost-k8s-calico--apiserver--d4584f895--xqpkf-eth0" Sep 4 15:50:55.529692 containerd[1644]: 2025-09-04 15:50:55.365 [INFO][4499] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="73aed508c51d2df9f53f56e2a3e2e65561ceef814de0776edfa0fa134cf813f7" HandleID="k8s-pod-network.73aed508c51d2df9f53f56e2a3e2e65561ceef814de0776edfa0fa134cf813f7" 
Workload="localhost-k8s-calico--apiserver--d4584f895--xqpkf-eth0" Sep 4 15:50:55.529692 containerd[1644]: 2025-09-04 15:50:55.365 [INFO][4499] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="73aed508c51d2df9f53f56e2a3e2e65561ceef814de0776edfa0fa134cf813f7" HandleID="k8s-pod-network.73aed508c51d2df9f53f56e2a3e2e65561ceef814de0776edfa0fa134cf813f7" Workload="localhost-k8s-calico--apiserver--d4584f895--xqpkf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024fa50), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-d4584f895-xqpkf", "timestamp":"2025-09-04 15:50:55.365354586 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 15:50:55.529692 containerd[1644]: 2025-09-04 15:50:55.365 [INFO][4499] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 15:50:55.529692 containerd[1644]: 2025-09-04 15:50:55.389 [INFO][4499] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 4 15:50:55.529692 containerd[1644]: 2025-09-04 15:50:55.389 [INFO][4499] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 15:50:55.529692 containerd[1644]: 2025-09-04 15:50:55.448 [INFO][4499] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.73aed508c51d2df9f53f56e2a3e2e65561ceef814de0776edfa0fa134cf813f7" host="localhost" Sep 4 15:50:55.529692 containerd[1644]: 2025-09-04 15:50:55.457 [INFO][4499] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 15:50:55.529692 containerd[1644]: 2025-09-04 15:50:55.464 [INFO][4499] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 15:50:55.529692 containerd[1644]: 2025-09-04 15:50:55.466 [INFO][4499] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 15:50:55.529692 containerd[1644]: 2025-09-04 15:50:55.471 [INFO][4499] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 15:50:55.529692 containerd[1644]: 2025-09-04 15:50:55.471 [INFO][4499] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.73aed508c51d2df9f53f56e2a3e2e65561ceef814de0776edfa0fa134cf813f7" host="localhost" Sep 4 15:50:55.529692 containerd[1644]: 2025-09-04 15:50:55.485 [INFO][4499] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.73aed508c51d2df9f53f56e2a3e2e65561ceef814de0776edfa0fa134cf813f7 Sep 4 15:50:55.529692 containerd[1644]: 2025-09-04 15:50:55.495 [INFO][4499] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.73aed508c51d2df9f53f56e2a3e2e65561ceef814de0776edfa0fa134cf813f7" host="localhost" Sep 4 15:50:55.529692 containerd[1644]: 2025-09-04 15:50:55.504 [INFO][4499] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 
handle="k8s-pod-network.73aed508c51d2df9f53f56e2a3e2e65561ceef814de0776edfa0fa134cf813f7" host="localhost" Sep 4 15:50:55.529692 containerd[1644]: 2025-09-04 15:50:55.504 [INFO][4499] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.73aed508c51d2df9f53f56e2a3e2e65561ceef814de0776edfa0fa134cf813f7" host="localhost" Sep 4 15:50:55.529692 containerd[1644]: 2025-09-04 15:50:55.504 [INFO][4499] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 4 15:50:55.529692 containerd[1644]: 2025-09-04 15:50:55.504 [INFO][4499] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="73aed508c51d2df9f53f56e2a3e2e65561ceef814de0776edfa0fa134cf813f7" HandleID="k8s-pod-network.73aed508c51d2df9f53f56e2a3e2e65561ceef814de0776edfa0fa134cf813f7" Workload="localhost-k8s-calico--apiserver--d4584f895--xqpkf-eth0" Sep 4 15:50:55.530157 containerd[1644]: 2025-09-04 15:50:55.507 [INFO][4433] cni-plugin/k8s.go 418: Populated endpoint ContainerID="73aed508c51d2df9f53f56e2a3e2e65561ceef814de0776edfa0fa134cf813f7" Namespace="calico-apiserver" Pod="calico-apiserver-d4584f895-xqpkf" WorkloadEndpoint="localhost-k8s-calico--apiserver--d4584f895--xqpkf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--d4584f895--xqpkf-eth0", GenerateName:"calico-apiserver-d4584f895-", Namespace:"calico-apiserver", SelfLink:"", UID:"058617f0-a421-4b0b-af75-94fdfe834c56", ResourceVersion:"836", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 50, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"d4584f895", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-d4584f895-xqpkf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali27068496d31", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:50:55.530157 containerd[1644]: 2025-09-04 15:50:55.507 [INFO][4433] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="73aed508c51d2df9f53f56e2a3e2e65561ceef814de0776edfa0fa134cf813f7" Namespace="calico-apiserver" Pod="calico-apiserver-d4584f895-xqpkf" WorkloadEndpoint="localhost-k8s-calico--apiserver--d4584f895--xqpkf-eth0" Sep 4 15:50:55.530157 containerd[1644]: 2025-09-04 15:50:55.507 [INFO][4433] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali27068496d31 ContainerID="73aed508c51d2df9f53f56e2a3e2e65561ceef814de0776edfa0fa134cf813f7" Namespace="calico-apiserver" Pod="calico-apiserver-d4584f895-xqpkf" WorkloadEndpoint="localhost-k8s-calico--apiserver--d4584f895--xqpkf-eth0" Sep 4 15:50:55.530157 containerd[1644]: 2025-09-04 15:50:55.517 [INFO][4433] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="73aed508c51d2df9f53f56e2a3e2e65561ceef814de0776edfa0fa134cf813f7" Namespace="calico-apiserver" Pod="calico-apiserver-d4584f895-xqpkf" WorkloadEndpoint="localhost-k8s-calico--apiserver--d4584f895--xqpkf-eth0" Sep 4 15:50:55.530157 containerd[1644]: 2025-09-04 15:50:55.517 [INFO][4433] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to 
endpoint ContainerID="73aed508c51d2df9f53f56e2a3e2e65561ceef814de0776edfa0fa134cf813f7" Namespace="calico-apiserver" Pod="calico-apiserver-d4584f895-xqpkf" WorkloadEndpoint="localhost-k8s-calico--apiserver--d4584f895--xqpkf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--d4584f895--xqpkf-eth0", GenerateName:"calico-apiserver-d4584f895-", Namespace:"calico-apiserver", SelfLink:"", UID:"058617f0-a421-4b0b-af75-94fdfe834c56", ResourceVersion:"836", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 50, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"d4584f895", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"73aed508c51d2df9f53f56e2a3e2e65561ceef814de0776edfa0fa134cf813f7", Pod:"calico-apiserver-d4584f895-xqpkf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali27068496d31", MAC:"02:4e:9d:9f:5a:e1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:50:55.530157 containerd[1644]: 2025-09-04 15:50:55.526 [INFO][4433] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="73aed508c51d2df9f53f56e2a3e2e65561ceef814de0776edfa0fa134cf813f7" Namespace="calico-apiserver" Pod="calico-apiserver-d4584f895-xqpkf" WorkloadEndpoint="localhost-k8s-calico--apiserver--d4584f895--xqpkf-eth0" Sep 4 15:50:55.535275 systemd-resolved[1533]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 15:50:55.553233 containerd[1644]: time="2025-09-04T15:50:55.552409283Z" level=info msg="connecting to shim 73aed508c51d2df9f53f56e2a3e2e65561ceef814de0776edfa0fa134cf813f7" address="unix:///run/containerd/s/8075984f84908e73433aeb27b8a94b9a4252ff78e8fad21bcf58250f15c2cfcc" namespace=k8s.io protocol=ttrpc version=3 Sep 4 15:50:55.585585 systemd[1]: Started cri-containerd-73aed508c51d2df9f53f56e2a3e2e65561ceef814de0776edfa0fa134cf813f7.scope - libcontainer container 73aed508c51d2df9f53f56e2a3e2e65561ceef814de0776edfa0fa134cf813f7. Sep 4 15:50:55.590379 kubelet[2944]: I0904 15:50:55.590096 2944 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-s5h4d" podStartSLOduration=52.590080995 podStartE2EDuration="52.590080995s" podCreationTimestamp="2025-09-04 15:50:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 15:50:55.589239653 +0000 UTC m=+57.653618650" watchObservedRunningTime="2025-09-04 15:50:55.590080995 +0000 UTC m=+57.654459994" Sep 4 15:50:55.635561 containerd[1644]: time="2025-09-04T15:50:55.634259936Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-l6fg8,Uid:501a1bfe-524e-4e97-9ddd-148de8480fe6,Namespace:kube-system,Attempt:0,} returns sandbox id \"7c7b854b42ed276b7136c1bc0868640059dbfed6334d04f5fba808f43b362e2b\"" Sep 4 15:50:55.644987 containerd[1644]: time="2025-09-04T15:50:55.644885178Z" level=info msg="CreateContainer within sandbox \"7c7b854b42ed276b7136c1bc0868640059dbfed6334d04f5fba808f43b362e2b\" for container 
&ContainerMetadata{Name:coredns,Attempt:0,}" Sep 4 15:50:55.653090 systemd-resolved[1533]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 15:50:55.658865 containerd[1644]: time="2025-09-04T15:50:55.658736620Z" level=info msg="Container 0035a294f6f134ec12fcff6f42079a6c3d9c02f171d653708813a52c3a77f0e8: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:50:55.662609 containerd[1644]: time="2025-09-04T15:50:55.662502737Z" level=info msg="CreateContainer within sandbox \"7c7b854b42ed276b7136c1bc0868640059dbfed6334d04f5fba808f43b362e2b\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"0035a294f6f134ec12fcff6f42079a6c3d9c02f171d653708813a52c3a77f0e8\"" Sep 4 15:50:55.663564 containerd[1644]: time="2025-09-04T15:50:55.663292735Z" level=info msg="StartContainer for \"0035a294f6f134ec12fcff6f42079a6c3d9c02f171d653708813a52c3a77f0e8\"" Sep 4 15:50:55.664694 containerd[1644]: time="2025-09-04T15:50:55.664607251Z" level=info msg="connecting to shim 0035a294f6f134ec12fcff6f42079a6c3d9c02f171d653708813a52c3a77f0e8" address="unix:///run/containerd/s/4e141855db5eef8f01d58a5836f3376aae95fd3fee25ca4f4189ff32f2d56005" protocol=ttrpc version=3 Sep 4 15:50:55.686091 systemd[1]: Started cri-containerd-0035a294f6f134ec12fcff6f42079a6c3d9c02f171d653708813a52c3a77f0e8.scope - libcontainer container 0035a294f6f134ec12fcff6f42079a6c3d9c02f171d653708813a52c3a77f0e8. 
Sep 4 15:50:55.756374 containerd[1644]: time="2025-09-04T15:50:55.756310652Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-d4584f895-xqpkf,Uid:058617f0-a421-4b0b-af75-94fdfe834c56,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"73aed508c51d2df9f53f56e2a3e2e65561ceef814de0776edfa0fa134cf813f7\"" Sep 4 15:50:55.756876 containerd[1644]: time="2025-09-04T15:50:55.756757634Z" level=info msg="StartContainer for \"0035a294f6f134ec12fcff6f42079a6c3d9c02f171d653708813a52c3a77f0e8\" returns successfully" Sep 4 15:50:55.776436 containerd[1644]: time="2025-09-04T15:50:55.776084711Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0b5f35d9717b9ceb20c0cdac6114482b414ac97af9065c684cf33368584bf322\" id:\"b5994dcf4bdb5ad69e37867468f86a2d0ba5d2e73b2cd6f01a096acaf2a67683\" pid:4650 exit_status:1 exited_at:{seconds:1757001055 nanos:775848321}" Sep 4 15:50:55.883072 systemd-networkd[1532]: cali214b238ae73: Gained IPv6LL Sep 4 15:50:55.883424 systemd-networkd[1532]: cali4671790e931: Gained IPv6LL Sep 4 15:50:55.938919 systemd-networkd[1532]: vxlan.calico: Link UP Sep 4 15:50:55.938923 systemd-networkd[1532]: vxlan.calico: Gained carrier Sep 4 15:50:56.033579 containerd[1644]: time="2025-09-04T15:50:56.033554676Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-574qn,Uid:b3ebd254-9b08-41b4-bd75-06a3a045be62,Namespace:calico-system,Attempt:0,}" Sep 4 15:50:56.037335 containerd[1644]: time="2025-09-04T15:50:56.037315223Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-d4584f895-4pckt,Uid:ec3c0434-a74a-4cd8-aad6-25051cea8b8a,Namespace:calico-apiserver,Attempt:0,}" Sep 4 15:50:56.197580 kubelet[2944]: I0904 15:50:56.197481 2944 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f699e642-1b05-433e-bd2f-a664a37b4e72" path="/var/lib/kubelet/pods/f699e642-1b05-433e-bd2f-a664a37b4e72/volumes" Sep 4 15:50:56.312228 systemd-networkd[1532]: cali21f372e52fe: 
Link UP Sep 4 15:50:56.313063 systemd-networkd[1532]: cali21f372e52fe: Gained carrier Sep 4 15:50:56.341846 containerd[1644]: 2025-09-04 15:50:56.189 [INFO][4754] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--d4584f895--4pckt-eth0 calico-apiserver-d4584f895- calico-apiserver ec3c0434-a74a-4cd8-aad6-25051cea8b8a 834 0 2025-09-04 15:50:13 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:d4584f895 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-d4584f895-4pckt eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali21f372e52fe [] [] }} ContainerID="d5b9fea45c31ce926aaa062b71e2a3d37dc60b69506ce6adc6fbd106e749942c" Namespace="calico-apiserver" Pod="calico-apiserver-d4584f895-4pckt" WorkloadEndpoint="localhost-k8s-calico--apiserver--d4584f895--4pckt-" Sep 4 15:50:56.341846 containerd[1644]: 2025-09-04 15:50:56.192 [INFO][4754] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d5b9fea45c31ce926aaa062b71e2a3d37dc60b69506ce6adc6fbd106e749942c" Namespace="calico-apiserver" Pod="calico-apiserver-d4584f895-4pckt" WorkloadEndpoint="localhost-k8s-calico--apiserver--d4584f895--4pckt-eth0" Sep 4 15:50:56.341846 containerd[1644]: 2025-09-04 15:50:56.249 [INFO][4782] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d5b9fea45c31ce926aaa062b71e2a3d37dc60b69506ce6adc6fbd106e749942c" HandleID="k8s-pod-network.d5b9fea45c31ce926aaa062b71e2a3d37dc60b69506ce6adc6fbd106e749942c" Workload="localhost-k8s-calico--apiserver--d4584f895--4pckt-eth0" Sep 4 15:50:56.341846 containerd[1644]: 2025-09-04 15:50:56.249 [INFO][4782] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="d5b9fea45c31ce926aaa062b71e2a3d37dc60b69506ce6adc6fbd106e749942c" HandleID="k8s-pod-network.d5b9fea45c31ce926aaa062b71e2a3d37dc60b69506ce6adc6fbd106e749942c" Workload="localhost-k8s-calico--apiserver--d4584f895--4pckt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024efd0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-d4584f895-4pckt", "timestamp":"2025-09-04 15:50:56.249013832 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 15:50:56.341846 containerd[1644]: 2025-09-04 15:50:56.249 [INFO][4782] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 15:50:56.341846 containerd[1644]: 2025-09-04 15:50:56.249 [INFO][4782] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 4 15:50:56.341846 containerd[1644]: 2025-09-04 15:50:56.249 [INFO][4782] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 15:50:56.341846 containerd[1644]: 2025-09-04 15:50:56.260 [INFO][4782] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d5b9fea45c31ce926aaa062b71e2a3d37dc60b69506ce6adc6fbd106e749942c" host="localhost" Sep 4 15:50:56.341846 containerd[1644]: 2025-09-04 15:50:56.268 [INFO][4782] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 15:50:56.341846 containerd[1644]: 2025-09-04 15:50:56.279 [INFO][4782] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 15:50:56.341846 containerd[1644]: 2025-09-04 15:50:56.280 [INFO][4782] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 15:50:56.341846 containerd[1644]: 2025-09-04 15:50:56.281 [INFO][4782] ipam/ipam.go 235: Affinity is confirmed and block has been loaded 
cidr=192.168.88.128/26 host="localhost" Sep 4 15:50:56.341846 containerd[1644]: 2025-09-04 15:50:56.282 [INFO][4782] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.d5b9fea45c31ce926aaa062b71e2a3d37dc60b69506ce6adc6fbd106e749942c" host="localhost" Sep 4 15:50:56.341846 containerd[1644]: 2025-09-04 15:50:56.284 [INFO][4782] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d5b9fea45c31ce926aaa062b71e2a3d37dc60b69506ce6adc6fbd106e749942c Sep 4 15:50:56.341846 containerd[1644]: 2025-09-04 15:50:56.290 [INFO][4782] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.d5b9fea45c31ce926aaa062b71e2a3d37dc60b69506ce6adc6fbd106e749942c" host="localhost" Sep 4 15:50:56.341846 containerd[1644]: 2025-09-04 15:50:56.299 [INFO][4782] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.d5b9fea45c31ce926aaa062b71e2a3d37dc60b69506ce6adc6fbd106e749942c" host="localhost" Sep 4 15:50:56.341846 containerd[1644]: 2025-09-04 15:50:56.299 [INFO][4782] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.d5b9fea45c31ce926aaa062b71e2a3d37dc60b69506ce6adc6fbd106e749942c" host="localhost" Sep 4 15:50:56.341846 containerd[1644]: 2025-09-04 15:50:56.299 [INFO][4782] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 4 15:50:56.341846 containerd[1644]: 2025-09-04 15:50:56.299 [INFO][4782] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="d5b9fea45c31ce926aaa062b71e2a3d37dc60b69506ce6adc6fbd106e749942c" HandleID="k8s-pod-network.d5b9fea45c31ce926aaa062b71e2a3d37dc60b69506ce6adc6fbd106e749942c" Workload="localhost-k8s-calico--apiserver--d4584f895--4pckt-eth0" Sep 4 15:50:56.343523 containerd[1644]: 2025-09-04 15:50:56.307 [INFO][4754] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d5b9fea45c31ce926aaa062b71e2a3d37dc60b69506ce6adc6fbd106e749942c" Namespace="calico-apiserver" Pod="calico-apiserver-d4584f895-4pckt" WorkloadEndpoint="localhost-k8s-calico--apiserver--d4584f895--4pckt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--d4584f895--4pckt-eth0", GenerateName:"calico-apiserver-d4584f895-", Namespace:"calico-apiserver", SelfLink:"", UID:"ec3c0434-a74a-4cd8-aad6-25051cea8b8a", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 50, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"d4584f895", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-d4584f895-4pckt", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali21f372e52fe", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:50:56.343523 containerd[1644]: 2025-09-04 15:50:56.308 [INFO][4754] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="d5b9fea45c31ce926aaa062b71e2a3d37dc60b69506ce6adc6fbd106e749942c" Namespace="calico-apiserver" Pod="calico-apiserver-d4584f895-4pckt" WorkloadEndpoint="localhost-k8s-calico--apiserver--d4584f895--4pckt-eth0" Sep 4 15:50:56.343523 containerd[1644]: 2025-09-04 15:50:56.308 [INFO][4754] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali21f372e52fe ContainerID="d5b9fea45c31ce926aaa062b71e2a3d37dc60b69506ce6adc6fbd106e749942c" Namespace="calico-apiserver" Pod="calico-apiserver-d4584f895-4pckt" WorkloadEndpoint="localhost-k8s-calico--apiserver--d4584f895--4pckt-eth0" Sep 4 15:50:56.343523 containerd[1644]: 2025-09-04 15:50:56.316 [INFO][4754] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d5b9fea45c31ce926aaa062b71e2a3d37dc60b69506ce6adc6fbd106e749942c" Namespace="calico-apiserver" Pod="calico-apiserver-d4584f895-4pckt" WorkloadEndpoint="localhost-k8s-calico--apiserver--d4584f895--4pckt-eth0" Sep 4 15:50:56.343523 containerd[1644]: 2025-09-04 15:50:56.316 [INFO][4754] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d5b9fea45c31ce926aaa062b71e2a3d37dc60b69506ce6adc6fbd106e749942c" Namespace="calico-apiserver" Pod="calico-apiserver-d4584f895-4pckt" WorkloadEndpoint="localhost-k8s-calico--apiserver--d4584f895--4pckt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--d4584f895--4pckt-eth0", GenerateName:"calico-apiserver-d4584f895-", 
Namespace:"calico-apiserver", SelfLink:"", UID:"ec3c0434-a74a-4cd8-aad6-25051cea8b8a", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 50, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"d4584f895", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"d5b9fea45c31ce926aaa062b71e2a3d37dc60b69506ce6adc6fbd106e749942c", Pod:"calico-apiserver-d4584f895-4pckt", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali21f372e52fe", MAC:"1a:09:8a:ea:22:7f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:50:56.343523 containerd[1644]: 2025-09-04 15:50:56.337 [INFO][4754] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d5b9fea45c31ce926aaa062b71e2a3d37dc60b69506ce6adc6fbd106e749942c" Namespace="calico-apiserver" Pod="calico-apiserver-d4584f895-4pckt" WorkloadEndpoint="localhost-k8s-calico--apiserver--d4584f895--4pckt-eth0" Sep 4 15:50:56.412167 systemd-networkd[1532]: calif20f2e12f6d: Link UP Sep 4 15:50:56.413803 systemd-networkd[1532]: calif20f2e12f6d: Gained carrier Sep 4 15:50:56.435683 containerd[1644]: 2025-09-04 15:50:56.189 [INFO][4753] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint 
projectcalico.org/v3} {localhost-k8s-goldmane--7988f88666--574qn-eth0 goldmane-7988f88666- calico-system b3ebd254-9b08-41b4-bd75-06a3a045be62 826 0 2025-09-04 15:50:14 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7988f88666 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-7988f88666-574qn eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calif20f2e12f6d [] [] }} ContainerID="3e1a5ff15e177d4bcaaf48f89f089daca72ed0229ad7e9b8ae35336b0afbbd68" Namespace="calico-system" Pod="goldmane-7988f88666-574qn" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--574qn-" Sep 4 15:50:56.435683 containerd[1644]: 2025-09-04 15:50:56.191 [INFO][4753] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3e1a5ff15e177d4bcaaf48f89f089daca72ed0229ad7e9b8ae35336b0afbbd68" Namespace="calico-system" Pod="goldmane-7988f88666-574qn" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--574qn-eth0" Sep 4 15:50:56.435683 containerd[1644]: 2025-09-04 15:50:56.260 [INFO][4780] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3e1a5ff15e177d4bcaaf48f89f089daca72ed0229ad7e9b8ae35336b0afbbd68" HandleID="k8s-pod-network.3e1a5ff15e177d4bcaaf48f89f089daca72ed0229ad7e9b8ae35336b0afbbd68" Workload="localhost-k8s-goldmane--7988f88666--574qn-eth0" Sep 4 15:50:56.435683 containerd[1644]: 2025-09-04 15:50:56.260 [INFO][4780] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3e1a5ff15e177d4bcaaf48f89f089daca72ed0229ad7e9b8ae35336b0afbbd68" HandleID="k8s-pod-network.3e1a5ff15e177d4bcaaf48f89f089daca72ed0229ad7e9b8ae35336b0afbbd68" Workload="localhost-k8s-goldmane--7988f88666--574qn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ad4a0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-7988f88666-574qn", 
"timestamp":"2025-09-04 15:50:56.260064381 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 15:50:56.435683 containerd[1644]: 2025-09-04 15:50:56.261 [INFO][4780] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 15:50:56.435683 containerd[1644]: 2025-09-04 15:50:56.300 [INFO][4780] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 4 15:50:56.435683 containerd[1644]: 2025-09-04 15:50:56.300 [INFO][4780] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 15:50:56.435683 containerd[1644]: 2025-09-04 15:50:56.358 [INFO][4780] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3e1a5ff15e177d4bcaaf48f89f089daca72ed0229ad7e9b8ae35336b0afbbd68" host="localhost" Sep 4 15:50:56.435683 containerd[1644]: 2025-09-04 15:50:56.368 [INFO][4780] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 15:50:56.435683 containerd[1644]: 2025-09-04 15:50:56.380 [INFO][4780] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 15:50:56.435683 containerd[1644]: 2025-09-04 15:50:56.381 [INFO][4780] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 15:50:56.435683 containerd[1644]: 2025-09-04 15:50:56.383 [INFO][4780] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 15:50:56.435683 containerd[1644]: 2025-09-04 15:50:56.383 [INFO][4780] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.3e1a5ff15e177d4bcaaf48f89f089daca72ed0229ad7e9b8ae35336b0afbbd68" host="localhost" Sep 4 15:50:56.435683 containerd[1644]: 2025-09-04 15:50:56.383 [INFO][4780] ipam/ipam.go 1764: Creating new handle: 
k8s-pod-network.3e1a5ff15e177d4bcaaf48f89f089daca72ed0229ad7e9b8ae35336b0afbbd68 Sep 4 15:50:56.435683 containerd[1644]: 2025-09-04 15:50:56.390 [INFO][4780] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.3e1a5ff15e177d4bcaaf48f89f089daca72ed0229ad7e9b8ae35336b0afbbd68" host="localhost" Sep 4 15:50:56.435683 containerd[1644]: 2025-09-04 15:50:56.399 [INFO][4780] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.3e1a5ff15e177d4bcaaf48f89f089daca72ed0229ad7e9b8ae35336b0afbbd68" host="localhost" Sep 4 15:50:56.435683 containerd[1644]: 2025-09-04 15:50:56.399 [INFO][4780] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.3e1a5ff15e177d4bcaaf48f89f089daca72ed0229ad7e9b8ae35336b0afbbd68" host="localhost" Sep 4 15:50:56.435683 containerd[1644]: 2025-09-04 15:50:56.399 [INFO][4780] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 4 15:50:56.435683 containerd[1644]: 2025-09-04 15:50:56.399 [INFO][4780] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="3e1a5ff15e177d4bcaaf48f89f089daca72ed0229ad7e9b8ae35336b0afbbd68" HandleID="k8s-pod-network.3e1a5ff15e177d4bcaaf48f89f089daca72ed0229ad7e9b8ae35336b0afbbd68" Workload="localhost-k8s-goldmane--7988f88666--574qn-eth0" Sep 4 15:50:56.436963 containerd[1644]: 2025-09-04 15:50:56.403 [INFO][4753] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3e1a5ff15e177d4bcaaf48f89f089daca72ed0229ad7e9b8ae35336b0afbbd68" Namespace="calico-system" Pod="goldmane-7988f88666-574qn" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--574qn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--7988f88666--574qn-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"b3ebd254-9b08-41b4-bd75-06a3a045be62", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 50, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-7988f88666-574qn", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calif20f2e12f6d", 
MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:50:56.436963 containerd[1644]: 2025-09-04 15:50:56.403 [INFO][4753] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="3e1a5ff15e177d4bcaaf48f89f089daca72ed0229ad7e9b8ae35336b0afbbd68" Namespace="calico-system" Pod="goldmane-7988f88666-574qn" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--574qn-eth0" Sep 4 15:50:56.436963 containerd[1644]: 2025-09-04 15:50:56.404 [INFO][4753] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif20f2e12f6d ContainerID="3e1a5ff15e177d4bcaaf48f89f089daca72ed0229ad7e9b8ae35336b0afbbd68" Namespace="calico-system" Pod="goldmane-7988f88666-574qn" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--574qn-eth0" Sep 4 15:50:56.436963 containerd[1644]: 2025-09-04 15:50:56.419 [INFO][4753] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3e1a5ff15e177d4bcaaf48f89f089daca72ed0229ad7e9b8ae35336b0afbbd68" Namespace="calico-system" Pod="goldmane-7988f88666-574qn" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--574qn-eth0" Sep 4 15:50:56.436963 containerd[1644]: 2025-09-04 15:50:56.419 [INFO][4753] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3e1a5ff15e177d4bcaaf48f89f089daca72ed0229ad7e9b8ae35336b0afbbd68" Namespace="calico-system" Pod="goldmane-7988f88666-574qn" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--574qn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--7988f88666--574qn-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"b3ebd254-9b08-41b4-bd75-06a3a045be62", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 50, 14, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"3e1a5ff15e177d4bcaaf48f89f089daca72ed0229ad7e9b8ae35336b0afbbd68", Pod:"goldmane-7988f88666-574qn", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calif20f2e12f6d", MAC:"4e:1b:7f:48:b4:e5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:50:56.436963 containerd[1644]: 2025-09-04 15:50:56.431 [INFO][4753] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3e1a5ff15e177d4bcaaf48f89f089daca72ed0229ad7e9b8ae35336b0afbbd68" Namespace="calico-system" Pod="goldmane-7988f88666-574qn" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--574qn-eth0" Sep 4 15:50:56.486207 systemd-networkd[1532]: cali599bcf0997b: Gained IPv6LL Sep 4 15:50:56.639867 containerd[1644]: time="2025-09-04T15:50:56.639834551Z" level=info msg="connecting to shim d5b9fea45c31ce926aaa062b71e2a3d37dc60b69506ce6adc6fbd106e749942c" address="unix:///run/containerd/s/77d3c289708fb0744a532bc5bb63d8bc4bc6f89e47ef13d06d99c1858db5a128" namespace=k8s.io protocol=ttrpc version=3 Sep 4 15:50:56.681256 systemd-networkd[1532]: cali9f8cc8fdcf5: Gained IPv6LL Sep 4 15:50:56.835819 containerd[1644]: time="2025-09-04T15:50:56.835272674Z" level=info msg="connecting to shim 
3e1a5ff15e177d4bcaaf48f89f089daca72ed0229ad7e9b8ae35336b0afbbd68" address="unix:///run/containerd/s/4665888e3354078392af50e964d7d0b491affbbcce4a30d1ed2ebc28e57ae6d4" namespace=k8s.io protocol=ttrpc version=3 Sep 4 15:50:56.886258 systemd[1]: Started cri-containerd-3e1a5ff15e177d4bcaaf48f89f089daca72ed0229ad7e9b8ae35336b0afbbd68.scope - libcontainer container 3e1a5ff15e177d4bcaaf48f89f089daca72ed0229ad7e9b8ae35336b0afbbd68. Sep 4 15:50:56.887290 systemd[1]: Started cri-containerd-d5b9fea45c31ce926aaa062b71e2a3d37dc60b69506ce6adc6fbd106e749942c.scope - libcontainer container d5b9fea45c31ce926aaa062b71e2a3d37dc60b69506ce6adc6fbd106e749942c. Sep 4 15:50:56.907310 systemd-resolved[1533]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 15:50:56.925982 systemd-resolved[1533]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 15:50:57.062297 systemd-networkd[1532]: cali27068496d31: Gained IPv6LL Sep 4 15:50:57.350620 containerd[1644]: time="2025-09-04T15:50:57.350581406Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-574qn,Uid:b3ebd254-9b08-41b4-bd75-06a3a045be62,Namespace:calico-system,Attempt:0,} returns sandbox id \"3e1a5ff15e177d4bcaaf48f89f089daca72ed0229ad7e9b8ae35336b0afbbd68\"" Sep 4 15:50:57.364601 containerd[1644]: time="2025-09-04T15:50:57.364529893Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-d4584f895-4pckt,Uid:ec3c0434-a74a-4cd8-aad6-25051cea8b8a,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"d5b9fea45c31ce926aaa062b71e2a3d37dc60b69506ce6adc6fbd106e749942c\"" Sep 4 15:50:57.382677 systemd-networkd[1532]: vxlan.calico: Gained IPv6LL Sep 4 15:50:57.598908 kubelet[2944]: I0904 15:50:57.598658 2944 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-l6fg8" podStartSLOduration=54.592445902 podStartE2EDuration="54.592445902s" 
podCreationTimestamp="2025-09-04 15:50:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 15:50:57.59234307 +0000 UTC m=+59.656722073" watchObservedRunningTime="2025-09-04 15:50:57.592445902 +0000 UTC m=+59.656824900" Sep 4 15:50:57.958309 systemd-networkd[1532]: cali21f372e52fe: Gained IPv6LL Sep 4 15:50:58.214221 systemd-networkd[1532]: calif20f2e12f6d: Gained IPv6LL Sep 4 15:50:59.026467 containerd[1644]: time="2025-09-04T15:50:59.026434407Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-x2d7v,Uid:983a0613-198a-42b5-9c35-64b2a2869e6c,Namespace:calico-system,Attempt:0,}" Sep 4 15:50:59.555751 systemd-networkd[1532]: calic1307f26eb3: Link UP Sep 4 15:50:59.557685 systemd-networkd[1532]: calic1307f26eb3: Gained carrier Sep 4 15:50:59.570283 containerd[1644]: 2025-09-04 15:50:59.443 [INFO][4949] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--x2d7v-eth0 csi-node-driver- calico-system 983a0613-198a-42b5-9c35-64b2a2869e6c 685 0 2025-09-04 15:50:15 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:856c6b598f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-x2d7v eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calic1307f26eb3 [] [] }} ContainerID="cc7243737c7f8e8c30825d063e35473f79e42f9dbaafab459135ca782b3cd1ec" Namespace="calico-system" Pod="csi-node-driver-x2d7v" WorkloadEndpoint="localhost-k8s-csi--node--driver--x2d7v-" Sep 4 15:50:59.570283 containerd[1644]: 2025-09-04 15:50:59.449 [INFO][4949] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="cc7243737c7f8e8c30825d063e35473f79e42f9dbaafab459135ca782b3cd1ec" Namespace="calico-system" Pod="csi-node-driver-x2d7v" WorkloadEndpoint="localhost-k8s-csi--node--driver--x2d7v-eth0" Sep 4 15:50:59.570283 containerd[1644]: 2025-09-04 15:50:59.496 [INFO][4960] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cc7243737c7f8e8c30825d063e35473f79e42f9dbaafab459135ca782b3cd1ec" HandleID="k8s-pod-network.cc7243737c7f8e8c30825d063e35473f79e42f9dbaafab459135ca782b3cd1ec" Workload="localhost-k8s-csi--node--driver--x2d7v-eth0" Sep 4 15:50:59.570283 containerd[1644]: 2025-09-04 15:50:59.496 [INFO][4960] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="cc7243737c7f8e8c30825d063e35473f79e42f9dbaafab459135ca782b3cd1ec" HandleID="k8s-pod-network.cc7243737c7f8e8c30825d063e35473f79e42f9dbaafab459135ca782b3cd1ec" Workload="localhost-k8s-csi--node--driver--x2d7v-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f610), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-x2d7v", "timestamp":"2025-09-04 15:50:59.496020617 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 15:50:59.570283 containerd[1644]: 2025-09-04 15:50:59.496 [INFO][4960] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 15:50:59.570283 containerd[1644]: 2025-09-04 15:50:59.496 [INFO][4960] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 4 15:50:59.570283 containerd[1644]: 2025-09-04 15:50:59.496 [INFO][4960] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 15:50:59.570283 containerd[1644]: 2025-09-04 15:50:59.501 [INFO][4960] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.cc7243737c7f8e8c30825d063e35473f79e42f9dbaafab459135ca782b3cd1ec" host="localhost" Sep 4 15:50:59.570283 containerd[1644]: 2025-09-04 15:50:59.503 [INFO][4960] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 15:50:59.570283 containerd[1644]: 2025-09-04 15:50:59.506 [INFO][4960] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 15:50:59.570283 containerd[1644]: 2025-09-04 15:50:59.507 [INFO][4960] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 15:50:59.570283 containerd[1644]: 2025-09-04 15:50:59.508 [INFO][4960] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 15:50:59.570283 containerd[1644]: 2025-09-04 15:50:59.508 [INFO][4960] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.cc7243737c7f8e8c30825d063e35473f79e42f9dbaafab459135ca782b3cd1ec" host="localhost" Sep 4 15:50:59.570283 containerd[1644]: 2025-09-04 15:50:59.509 [INFO][4960] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.cc7243737c7f8e8c30825d063e35473f79e42f9dbaafab459135ca782b3cd1ec Sep 4 15:50:59.570283 containerd[1644]: 2025-09-04 15:50:59.516 [INFO][4960] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.cc7243737c7f8e8c30825d063e35473f79e42f9dbaafab459135ca782b3cd1ec" host="localhost" Sep 4 15:50:59.570283 containerd[1644]: 2025-09-04 15:50:59.547 [INFO][4960] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 
handle="k8s-pod-network.cc7243737c7f8e8c30825d063e35473f79e42f9dbaafab459135ca782b3cd1ec" host="localhost" Sep 4 15:50:59.570283 containerd[1644]: 2025-09-04 15:50:59.547 [INFO][4960] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.cc7243737c7f8e8c30825d063e35473f79e42f9dbaafab459135ca782b3cd1ec" host="localhost" Sep 4 15:50:59.570283 containerd[1644]: 2025-09-04 15:50:59.548 [INFO][4960] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 4 15:50:59.570283 containerd[1644]: 2025-09-04 15:50:59.548 [INFO][4960] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="cc7243737c7f8e8c30825d063e35473f79e42f9dbaafab459135ca782b3cd1ec" HandleID="k8s-pod-network.cc7243737c7f8e8c30825d063e35473f79e42f9dbaafab459135ca782b3cd1ec" Workload="localhost-k8s-csi--node--driver--x2d7v-eth0" Sep 4 15:50:59.571035 containerd[1644]: 2025-09-04 15:50:59.550 [INFO][4949] cni-plugin/k8s.go 418: Populated endpoint ContainerID="cc7243737c7f8e8c30825d063e35473f79e42f9dbaafab459135ca782b3cd1ec" Namespace="calico-system" Pod="csi-node-driver-x2d7v" WorkloadEndpoint="localhost-k8s-csi--node--driver--x2d7v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--x2d7v-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"983a0613-198a-42b5-9c35-64b2a2869e6c", ResourceVersion:"685", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 50, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-x2d7v", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calic1307f26eb3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:50:59.571035 containerd[1644]: 2025-09-04 15:50:59.550 [INFO][4949] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="cc7243737c7f8e8c30825d063e35473f79e42f9dbaafab459135ca782b3cd1ec" Namespace="calico-system" Pod="csi-node-driver-x2d7v" WorkloadEndpoint="localhost-k8s-csi--node--driver--x2d7v-eth0" Sep 4 15:50:59.571035 containerd[1644]: 2025-09-04 15:50:59.550 [INFO][4949] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic1307f26eb3 ContainerID="cc7243737c7f8e8c30825d063e35473f79e42f9dbaafab459135ca782b3cd1ec" Namespace="calico-system" Pod="csi-node-driver-x2d7v" WorkloadEndpoint="localhost-k8s-csi--node--driver--x2d7v-eth0" Sep 4 15:50:59.571035 containerd[1644]: 2025-09-04 15:50:59.558 [INFO][4949] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cc7243737c7f8e8c30825d063e35473f79e42f9dbaafab459135ca782b3cd1ec" Namespace="calico-system" Pod="csi-node-driver-x2d7v" WorkloadEndpoint="localhost-k8s-csi--node--driver--x2d7v-eth0" Sep 4 15:50:59.571035 containerd[1644]: 2025-09-04 15:50:59.558 [INFO][4949] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="cc7243737c7f8e8c30825d063e35473f79e42f9dbaafab459135ca782b3cd1ec" 
Namespace="calico-system" Pod="csi-node-driver-x2d7v" WorkloadEndpoint="localhost-k8s-csi--node--driver--x2d7v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--x2d7v-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"983a0613-198a-42b5-9c35-64b2a2869e6c", ResourceVersion:"685", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 50, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"cc7243737c7f8e8c30825d063e35473f79e42f9dbaafab459135ca782b3cd1ec", Pod:"csi-node-driver-x2d7v", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calic1307f26eb3", MAC:"fa:09:46:b2:d5:cf", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:50:59.571035 containerd[1644]: 2025-09-04 15:50:59.567 [INFO][4949] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="cc7243737c7f8e8c30825d063e35473f79e42f9dbaafab459135ca782b3cd1ec" Namespace="calico-system" Pod="csi-node-driver-x2d7v" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--x2d7v-eth0" Sep 4 15:50:59.610563 containerd[1644]: time="2025-09-04T15:50:59.610507962Z" level=info msg="connecting to shim cc7243737c7f8e8c30825d063e35473f79e42f9dbaafab459135ca782b3cd1ec" address="unix:///run/containerd/s/97497b2d5e22a23c7238768cc7060358cd1520c3b7c9660d0e99b09f5a6bd08b" namespace=k8s.io protocol=ttrpc version=3 Sep 4 15:50:59.630218 systemd[1]: Started cri-containerd-cc7243737c7f8e8c30825d063e35473f79e42f9dbaafab459135ca782b3cd1ec.scope - libcontainer container cc7243737c7f8e8c30825d063e35473f79e42f9dbaafab459135ca782b3cd1ec. Sep 4 15:50:59.638346 systemd-resolved[1533]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 15:50:59.647650 containerd[1644]: time="2025-09-04T15:50:59.647602783Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-x2d7v,Uid:983a0613-198a-42b5-9c35-64b2a2869e6c,Namespace:calico-system,Attempt:0,} returns sandbox id \"cc7243737c7f8e8c30825d063e35473f79e42f9dbaafab459135ca782b3cd1ec\"" Sep 4 15:51:00.287992 containerd[1644]: time="2025-09-04T15:51:00.287646462Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 4 15:51:00.305296 containerd[1644]: time="2025-09-04T15:51:00.304547892Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 5.411374297s" Sep 4 15:51:00.305296 containerd[1644]: time="2025-09-04T15:51:00.304574933Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 4 15:51:00.306871 containerd[1644]: 
time="2025-09-04T15:51:00.306418204Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 4 15:51:00.313250 containerd[1644]: time="2025-09-04T15:51:00.313228558Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:51:00.313759 containerd[1644]: time="2025-09-04T15:51:00.313745519Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:51:00.314128 containerd[1644]: time="2025-09-04T15:51:00.314109730Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:51:00.314984 containerd[1644]: time="2025-09-04T15:51:00.314961435Z" level=info msg="CreateContainer within sandbox \"2d7fad9ddad72a61de9ffcf40da04b33ae35ff3c02c9456f3fd10755e713851b\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 4 15:51:00.322131 containerd[1644]: time="2025-09-04T15:51:00.319745922Z" level=info msg="Container 9b927fb19dc4cf32aba0343b8665a8a4bae1dd8ba68e7817aedc84d8266456ff: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:51:00.334452 containerd[1644]: time="2025-09-04T15:51:00.334415618Z" level=info msg="CreateContainer within sandbox \"2d7fad9ddad72a61de9ffcf40da04b33ae35ff3c02c9456f3fd10755e713851b\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"9b927fb19dc4cf32aba0343b8665a8a4bae1dd8ba68e7817aedc84d8266456ff\"" Sep 4 15:51:00.335290 containerd[1644]: time="2025-09-04T15:51:00.335231489Z" level=info msg="StartContainer for \"9b927fb19dc4cf32aba0343b8665a8a4bae1dd8ba68e7817aedc84d8266456ff\"" Sep 4 15:51:00.336089 containerd[1644]: time="2025-09-04T15:51:00.336077138Z" level=info msg="connecting to shim 
9b927fb19dc4cf32aba0343b8665a8a4bae1dd8ba68e7817aedc84d8266456ff" address="unix:///run/containerd/s/9e9a45ba2e00da8a60d953b45b8be0ce494795ca784a11ff7b2396e004f6e357" protocol=ttrpc version=3 Sep 4 15:51:00.354287 systemd[1]: Started cri-containerd-9b927fb19dc4cf32aba0343b8665a8a4bae1dd8ba68e7817aedc84d8266456ff.scope - libcontainer container 9b927fb19dc4cf32aba0343b8665a8a4bae1dd8ba68e7817aedc84d8266456ff. Sep 4 15:51:00.394187 containerd[1644]: time="2025-09-04T15:51:00.394165075Z" level=info msg="StartContainer for \"9b927fb19dc4cf32aba0343b8665a8a4bae1dd8ba68e7817aedc84d8266456ff\" returns successfully" Sep 4 15:51:00.710298 systemd-networkd[1532]: calic1307f26eb3: Gained IPv6LL Sep 4 15:51:07.676689 containerd[1644]: time="2025-09-04T15:51:07.676577380Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:51:07.693811 containerd[1644]: time="2025-09-04T15:51:07.693776131Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 4 15:51:07.704609 containerd[1644]: time="2025-09-04T15:51:07.704548233Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:51:07.716274 containerd[1644]: time="2025-09-04T15:51:07.716231072Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:51:07.717155 containerd[1644]: time="2025-09-04T15:51:07.716870769Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest 
\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 7.410431057s" Sep 4 15:51:07.717155 containerd[1644]: time="2025-09-04T15:51:07.716894764Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 4 15:51:07.718006 containerd[1644]: time="2025-09-04T15:51:07.717989913Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 4 15:51:07.759316 containerd[1644]: time="2025-09-04T15:51:07.759283481Z" level=info msg="CreateContainer within sandbox \"ab097e7844e4125fc20840ede5639b4c2abd82e6996180eeca5af451d0ccd16b\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 4 15:51:08.076658 containerd[1644]: time="2025-09-04T15:51:08.076128771Z" level=info msg="Container 358255b197b7f03e0db7d74d0612764d8eeca09797367bb0a26ff1cc1bc9d0ec: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:51:08.096996 containerd[1644]: time="2025-09-04T15:51:08.096968160Z" level=info msg="CreateContainer within sandbox \"ab097e7844e4125fc20840ede5639b4c2abd82e6996180eeca5af451d0ccd16b\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"358255b197b7f03e0db7d74d0612764d8eeca09797367bb0a26ff1cc1bc9d0ec\"" Sep 4 15:51:08.097888 containerd[1644]: time="2025-09-04T15:51:08.097871620Z" level=info msg="StartContainer for \"358255b197b7f03e0db7d74d0612764d8eeca09797367bb0a26ff1cc1bc9d0ec\"" Sep 4 15:51:08.107484 containerd[1644]: time="2025-09-04T15:51:08.098908720Z" level=info msg="connecting to shim 358255b197b7f03e0db7d74d0612764d8eeca09797367bb0a26ff1cc1bc9d0ec" address="unix:///run/containerd/s/fc7ea7dd1a638dfb704a0555b241a7487966746c3d18932c1c01578e8317b8e6" protocol=ttrpc version=3 Sep 4 15:51:08.116288 systemd[1]: Started 
cri-containerd-358255b197b7f03e0db7d74d0612764d8eeca09797367bb0a26ff1cc1bc9d0ec.scope - libcontainer container 358255b197b7f03e0db7d74d0612764d8eeca09797367bb0a26ff1cc1bc9d0ec. Sep 4 15:51:08.166336 containerd[1644]: time="2025-09-04T15:51:08.166311564Z" level=info msg="StartContainer for \"358255b197b7f03e0db7d74d0612764d8eeca09797367bb0a26ff1cc1bc9d0ec\" returns successfully" Sep 4 15:51:08.629373 kubelet[2944]: I0904 15:51:08.629314 2944 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-7454479f5c-2sn5z" podStartSLOduration=41.427357656 podStartE2EDuration="53.629300282s" podCreationTimestamp="2025-09-04 15:50:15 +0000 UTC" firstStartedPulling="2025-09-04 15:50:55.515891021 +0000 UTC m=+57.580270014" lastFinishedPulling="2025-09-04 15:51:07.717833641 +0000 UTC m=+69.782212640" observedRunningTime="2025-09-04 15:51:08.625886018 +0000 UTC m=+70.690265027" watchObservedRunningTime="2025-09-04 15:51:08.629300282 +0000 UTC m=+70.693679293" Sep 4 15:51:08.652241 containerd[1644]: time="2025-09-04T15:51:08.652215670Z" level=info msg="TaskExit event in podsandbox handler container_id:\"358255b197b7f03e0db7d74d0612764d8eeca09797367bb0a26ff1cc1bc9d0ec\" id:\"5be27f31fabde19b9e5b1e56221e2887001f2fd02e955a44d52b7142be8c5192\" pid:5133 exited_at:{seconds:1757001068 nanos:651971621}" Sep 4 15:51:11.465969 containerd[1644]: time="2025-09-04T15:51:11.465930930Z" level=info msg="TaskExit event in podsandbox handler container_id:\"358255b197b7f03e0db7d74d0612764d8eeca09797367bb0a26ff1cc1bc9d0ec\" id:\"0285ce740d8c28aa92a7b93147c165007fd74b75c10921dc369cc851e927b743\" pid:5158 exited_at:{seconds:1757001071 nanos:465716978}" Sep 4 15:51:15.936318 systemd[1]: Started sshd@7-139.178.70.104:22-139.178.89.65:56262.service - OpenSSH per-connection server daemon (139.178.89.65:56262). 
Sep 4 15:51:16.051802 sshd[5172]: Accepted publickey for core from 139.178.89.65 port 56262 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs Sep 4 15:51:16.053597 sshd-session[5172]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:51:16.058784 systemd-logind[1618]: New session 10 of user core. Sep 4 15:51:16.064424 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 4 15:51:16.648543 sshd[5175]: Connection closed by 139.178.89.65 port 56262 Sep 4 15:51:16.648928 sshd-session[5172]: pam_unix(sshd:session): session closed for user core Sep 4 15:51:16.655266 systemd-logind[1618]: Session 10 logged out. Waiting for processes to exit. Sep 4 15:51:16.656012 systemd[1]: sshd@7-139.178.70.104:22-139.178.89.65:56262.service: Deactivated successfully. Sep 4 15:51:16.660046 systemd[1]: session-10.scope: Deactivated successfully. Sep 4 15:51:16.662253 systemd-logind[1618]: Removed session 10. Sep 4 15:51:21.659567 systemd[1]: Started sshd@8-139.178.70.104:22-139.178.89.65:38348.service - OpenSSH per-connection server daemon (139.178.89.65:38348). Sep 4 15:51:22.588370 sshd[5200]: Accepted publickey for core from 139.178.89.65 port 38348 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs Sep 4 15:51:22.599170 containerd[1644]: time="2025-09-04T15:51:22.599138322Z" level=info msg="TaskExit event in podsandbox handler container_id:\"358255b197b7f03e0db7d74d0612764d8eeca09797367bb0a26ff1cc1bc9d0ec\" id:\"2af9e8f0bfd72713070bf9013eb99feb6e34f78764211b0f05ebd3753e59ddd7\" pid:5216 exited_at:{seconds:1757001082 nanos:571510914}" Sep 4 15:51:22.599811 sshd-session[5200]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:51:22.616202 systemd-logind[1618]: New session 11 of user core. Sep 4 15:51:22.622208 systemd[1]: Started session-11.scope - Session 11 of User core. 
Sep 4 15:51:23.266240 containerd[1644]: time="2025-09-04T15:51:23.266172980Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:51:23.269105 containerd[1644]: time="2025-09-04T15:51:23.268853235Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 4 15:51:23.326532 containerd[1644]: time="2025-09-04T15:51:23.326490931Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:51:23.391219 containerd[1644]: time="2025-09-04T15:51:23.391171180Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:51:23.398794 containerd[1644]: time="2025-09-04T15:51:23.391491496Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 15.672424874s" Sep 4 15:51:23.398794 containerd[1644]: time="2025-09-04T15:51:23.391512897Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 4 15:51:23.473194 containerd[1644]: time="2025-09-04T15:51:23.472926815Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 4 15:51:23.478222 containerd[1644]: time="2025-09-04T15:51:23.478135715Z" level=info msg="CreateContainer within sandbox 
\"73aed508c51d2df9f53f56e2a3e2e65561ceef814de0776edfa0fa134cf813f7\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 4 15:51:23.535268 containerd[1644]: time="2025-09-04T15:51:23.534900269Z" level=info msg="Container e10992471d2ec973a56b2e10e11c0b61063499892afc7dbb35d77e29bb07c272: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:51:23.593692 containerd[1644]: time="2025-09-04T15:51:23.593576414Z" level=info msg="CreateContainer within sandbox \"73aed508c51d2df9f53f56e2a3e2e65561ceef814de0776edfa0fa134cf813f7\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"e10992471d2ec973a56b2e10e11c0b61063499892afc7dbb35d77e29bb07c272\"" Sep 4 15:51:23.596147 containerd[1644]: time="2025-09-04T15:51:23.594543713Z" level=info msg="StartContainer for \"e10992471d2ec973a56b2e10e11c0b61063499892afc7dbb35d77e29bb07c272\"" Sep 4 15:51:23.598729 containerd[1644]: time="2025-09-04T15:51:23.598440302Z" level=info msg="connecting to shim e10992471d2ec973a56b2e10e11c0b61063499892afc7dbb35d77e29bb07c272" address="unix:///run/containerd/s/8075984f84908e73433aeb27b8a94b9a4252ff78e8fad21bcf58250f15c2cfcc" protocol=ttrpc version=3 Sep 4 15:51:23.635350 systemd[1]: Started cri-containerd-e10992471d2ec973a56b2e10e11c0b61063499892afc7dbb35d77e29bb07c272.scope - libcontainer container e10992471d2ec973a56b2e10e11c0b61063499892afc7dbb35d77e29bb07c272. Sep 4 15:51:23.703977 containerd[1644]: time="2025-09-04T15:51:23.703944499Z" level=info msg="StartContainer for \"e10992471d2ec973a56b2e10e11c0b61063499892afc7dbb35d77e29bb07c272\" returns successfully" Sep 4 15:51:23.938580 sshd[5226]: Connection closed by 139.178.89.65 port 38348 Sep 4 15:51:23.940685 sshd-session[5200]: pam_unix(sshd:session): session closed for user core Sep 4 15:51:23.945535 systemd[1]: sshd@8-139.178.70.104:22-139.178.89.65:38348.service: Deactivated successfully. Sep 4 15:51:23.948523 systemd[1]: session-11.scope: Deactivated successfully. 
Sep 4 15:51:23.949904 systemd-logind[1618]: Session 11 logged out. Waiting for processes to exit. Sep 4 15:51:23.952467 systemd-logind[1618]: Removed session 11. Sep 4 15:51:24.314625 containerd[1644]: time="2025-09-04T15:51:24.314380539Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0b5f35d9717b9ceb20c0cdac6114482b414ac97af9065c684cf33368584bf322\" id:\"4da3147e1db488e8ed5f6cf57243a8c82ea5fe766643d6c259ead3c6a56ea683\" pid:5251 exited_at:{seconds:1757001084 nanos:313776817}" Sep 4 15:51:24.794333 kubelet[2944]: I0904 15:51:24.786310 2944 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-d4584f895-xqpkf" podStartSLOduration=44.085474771 podStartE2EDuration="1m11.780280199s" podCreationTimestamp="2025-09-04 15:50:13 +0000 UTC" firstStartedPulling="2025-09-04 15:50:55.759325299 +0000 UTC m=+57.823704292" lastFinishedPulling="2025-09-04 15:51:23.454130726 +0000 UTC m=+85.518509720" observedRunningTime="2025-09-04 15:51:24.722860388 +0000 UTC m=+86.787239399" watchObservedRunningTime="2025-09-04 15:51:24.780280199 +0000 UTC m=+86.844659196" Sep 4 15:51:28.949156 systemd[1]: Started sshd@9-139.178.70.104:22-139.178.89.65:38364.service - OpenSSH per-connection server daemon (139.178.89.65:38364). Sep 4 15:51:29.103331 sshd[5312]: Accepted publickey for core from 139.178.89.65 port 38364 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs Sep 4 15:51:29.104610 sshd-session[5312]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:51:29.108158 systemd-logind[1618]: New session 12 of user core. Sep 4 15:51:29.115259 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 4 15:51:30.610844 sshd[5317]: Connection closed by 139.178.89.65 port 38364 Sep 4 15:51:30.614155 sshd-session[5312]: pam_unix(sshd:session): session closed for user core Sep 4 15:51:30.629334 systemd-logind[1618]: Session 12 logged out. Waiting for processes to exit. 
Sep 4 15:51:30.629583 systemd[1]: sshd@9-139.178.70.104:22-139.178.89.65:38364.service: Deactivated successfully.
Sep 4 15:51:30.630754 systemd[1]: session-12.scope: Deactivated successfully.
Sep 4 15:51:30.631749 systemd-logind[1618]: Removed session 12.
Sep 4 15:51:33.833709 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1362065358.mount: Deactivated successfully.
Sep 4 15:51:35.485267 containerd[1644]: time="2025-09-04T15:51:35.485179050Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:51:35.500324 containerd[1644]: time="2025-09-04T15:51:35.500281140Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526"
Sep 4 15:51:35.542388 containerd[1644]: time="2025-09-04T15:51:35.542350314Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:51:35.546521 containerd[1644]: time="2025-09-04T15:51:35.546495716Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:51:35.551710 containerd[1644]: time="2025-09-04T15:51:35.551580813Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 12.07594654s"
Sep 4 15:51:35.551710 containerd[1644]: time="2025-09-04T15:51:35.551608765Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\""
Sep 4 15:51:35.623097 systemd[1]: Started sshd@10-139.178.70.104:22-139.178.89.65:38650.service - OpenSSH per-connection server daemon (139.178.89.65:38650).
Sep 4 15:51:35.716071 containerd[1644]: time="2025-09-04T15:51:35.715213742Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\""
Sep 4 15:51:35.748357 containerd[1644]: time="2025-09-04T15:51:35.748292076Z" level=info msg="CreateContainer within sandbox \"3e1a5ff15e177d4bcaaf48f89f089daca72ed0229ad7e9b8ae35336b0afbbd68\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}"
Sep 4 15:51:35.756639 sshd[5346]: Accepted publickey for core from 139.178.89.65 port 38650 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs
Sep 4 15:51:35.758476 sshd-session[5346]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:51:35.769567 systemd-logind[1618]: New session 13 of user core.
Sep 4 15:51:35.773634 systemd[1]: Started session-13.scope - Session 13 of User core.
Sep 4 15:51:35.802086 containerd[1644]: time="2025-09-04T15:51:35.802062299Z" level=info msg="Container 6a67a92bd6827038adced3f710c32cd48b0afdc16af28a971ec6a0278c302e56: CDI devices from CRI Config.CDIDevices: []"
Sep 4 15:51:36.066342 containerd[1644]: time="2025-09-04T15:51:36.066048538Z" level=info msg="CreateContainer within sandbox \"3e1a5ff15e177d4bcaaf48f89f089daca72ed0229ad7e9b8ae35336b0afbbd68\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"6a67a92bd6827038adced3f710c32cd48b0afdc16af28a971ec6a0278c302e56\""
Sep 4 15:51:36.068433 containerd[1644]: time="2025-09-04T15:51:36.067987260Z" level=info msg="StartContainer for \"6a67a92bd6827038adced3f710c32cd48b0afdc16af28a971ec6a0278c302e56\""
Sep 4 15:51:36.076033 containerd[1644]: time="2025-09-04T15:51:36.075753002Z" level=info msg="connecting to shim 6a67a92bd6827038adced3f710c32cd48b0afdc16af28a971ec6a0278c302e56" address="unix:///run/containerd/s/4665888e3354078392af50e964d7d0b491affbbcce4a30d1ed2ebc28e57ae6d4" protocol=ttrpc version=3
Sep 4 15:51:36.216599 systemd[1]: Started cri-containerd-6a67a92bd6827038adced3f710c32cd48b0afdc16af28a971ec6a0278c302e56.scope - libcontainer container 6a67a92bd6827038adced3f710c32cd48b0afdc16af28a971ec6a0278c302e56.
Sep 4 15:51:36.299077 containerd[1644]: time="2025-09-04T15:51:36.299007777Z" level=info msg="StartContainer for \"6a67a92bd6827038adced3f710c32cd48b0afdc16af28a971ec6a0278c302e56\" returns successfully"
Sep 4 15:51:36.451136 containerd[1644]: time="2025-09-04T15:51:36.451051569Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:51:36.455217 containerd[1644]: time="2025-09-04T15:51:36.455201016Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77"
Sep 4 15:51:36.456384 containerd[1644]: time="2025-09-04T15:51:36.456338418Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 741.098837ms"
Sep 4 15:51:36.456384 containerd[1644]: time="2025-09-04T15:51:36.456357488Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\""
Sep 4 15:51:36.461672 containerd[1644]: time="2025-09-04T15:51:36.457106911Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\""
Sep 4 15:51:36.465600 containerd[1644]: time="2025-09-04T15:51:36.465165465Z" level=info msg="CreateContainer within sandbox \"d5b9fea45c31ce926aaa062b71e2a3d37dc60b69506ce6adc6fbd106e749942c\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Sep 4 15:51:36.474011 containerd[1644]: time="2025-09-04T15:51:36.473987828Z" level=info msg="Container 7f705b54b17d06d2750772cd2f7f6a26ba46db97702656c0f0f1e0caa8a9236e: CDI devices from CRI Config.CDIDevices: []"
Sep 4 15:51:36.479093 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount303298184.mount: Deactivated successfully.
Sep 4 15:51:36.482822 containerd[1644]: time="2025-09-04T15:51:36.482800311Z" level=info msg="CreateContainer within sandbox \"d5b9fea45c31ce926aaa062b71e2a3d37dc60b69506ce6adc6fbd106e749942c\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"7f705b54b17d06d2750772cd2f7f6a26ba46db97702656c0f0f1e0caa8a9236e\""
Sep 4 15:51:36.487294 containerd[1644]: time="2025-09-04T15:51:36.487267590Z" level=info msg="StartContainer for \"7f705b54b17d06d2750772cd2f7f6a26ba46db97702656c0f0f1e0caa8a9236e\""
Sep 4 15:51:36.488306 containerd[1644]: time="2025-09-04T15:51:36.488287263Z" level=info msg="connecting to shim 7f705b54b17d06d2750772cd2f7f6a26ba46db97702656c0f0f1e0caa8a9236e" address="unix:///run/containerd/s/77d3c289708fb0744a532bc5bb63d8bc4bc6f89e47ef13d06d99c1858db5a128" protocol=ttrpc version=3
Sep 4 15:51:36.511204 systemd[1]: Started cri-containerd-7f705b54b17d06d2750772cd2f7f6a26ba46db97702656c0f0f1e0caa8a9236e.scope - libcontainer container 7f705b54b17d06d2750772cd2f7f6a26ba46db97702656c0f0f1e0caa8a9236e.
Sep 4 15:51:36.581246 containerd[1644]: time="2025-09-04T15:51:36.579258950Z" level=info msg="StartContainer for \"7f705b54b17d06d2750772cd2f7f6a26ba46db97702656c0f0f1e0caa8a9236e\" returns successfully"
Sep 4 15:51:36.596634 sshd[5349]: Connection closed by 139.178.89.65 port 38650
Sep 4 15:51:36.597453 sshd-session[5346]: pam_unix(sshd:session): session closed for user core
Sep 4 15:51:36.605502 systemd[1]: sshd@10-139.178.70.104:22-139.178.89.65:38650.service: Deactivated successfully.
Sep 4 15:51:36.607460 systemd[1]: session-13.scope: Deactivated successfully.
Sep 4 15:51:36.610854 systemd-logind[1618]: Session 13 logged out. Waiting for processes to exit.
Sep 4 15:51:36.614896 systemd[1]: Started sshd@11-139.178.70.104:22-139.178.89.65:38654.service - OpenSSH per-connection server daemon (139.178.89.65:38654).
Sep 4 15:51:36.615794 systemd-logind[1618]: Removed session 13.
Sep 4 15:51:36.691295 sshd[5425]: Accepted publickey for core from 139.178.89.65 port 38654 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs
Sep 4 15:51:36.692415 sshd-session[5425]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:51:36.695768 systemd-logind[1618]: New session 14 of user core.
Sep 4 15:51:36.702197 systemd[1]: Started session-14.scope - Session 14 of User core.
Sep 4 15:51:37.003604 sshd[5432]: Connection closed by 139.178.89.65 port 38654
Sep 4 15:51:37.006521 sshd-session[5425]: pam_unix(sshd:session): session closed for user core
Sep 4 15:51:37.016277 systemd[1]: Started sshd@12-139.178.70.104:22-139.178.89.65:38662.service - OpenSSH per-connection server daemon (139.178.89.65:38662).
Sep 4 15:51:37.020868 systemd[1]: sshd@11-139.178.70.104:22-139.178.89.65:38654.service: Deactivated successfully.
Sep 4 15:51:37.024352 systemd[1]: session-14.scope: Deactivated successfully.
Sep 4 15:51:37.028077 systemd-logind[1618]: Session 14 logged out. Waiting for processes to exit.
Sep 4 15:51:37.031958 systemd-logind[1618]: Removed session 14.
Sep 4 15:51:37.114932 sshd[5444]: Accepted publickey for core from 139.178.89.65 port 38662 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs
Sep 4 15:51:37.116560 sshd-session[5444]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:51:37.123796 systemd-logind[1618]: New session 15 of user core.
Sep 4 15:51:37.129430 systemd[1]: Started session-15.scope - Session 15 of User core.
Sep 4 15:51:38.302088 sshd[5450]: Connection closed by 139.178.89.65 port 38662
Sep 4 15:51:38.306871 sshd-session[5444]: pam_unix(sshd:session): session closed for user core
Sep 4 15:51:38.312832 systemd[1]: sshd@12-139.178.70.104:22-139.178.89.65:38662.service: Deactivated successfully.
Sep 4 15:51:38.313161 systemd-logind[1618]: Session 15 logged out. Waiting for processes to exit.
Sep 4 15:51:38.314814 systemd[1]: session-15.scope: Deactivated successfully.
Sep 4 15:51:38.319695 systemd-logind[1618]: Removed session 15.
Sep 4 15:51:38.866435 kubelet[2944]: I0904 15:51:38.857373 2944 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-7988f88666-574qn" podStartSLOduration=46.670147232 podStartE2EDuration="1m24.824596155s" podCreationTimestamp="2025-09-04 15:50:14 +0000 UTC" firstStartedPulling="2025-09-04 15:50:57.537292741 +0000 UTC m=+59.601671733" lastFinishedPulling="2025-09-04 15:51:35.691741658 +0000 UTC m=+97.756120656" observedRunningTime="2025-09-04 15:51:38.578094075 +0000 UTC m=+100.642473069" watchObservedRunningTime="2025-09-04 15:51:38.824596155 +0000 UTC m=+100.888975152"
Sep 4 15:51:38.952054 containerd[1644]: time="2025-09-04T15:51:38.951956764Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a67a92bd6827038adced3f710c32cd48b0afdc16af28a971ec6a0278c302e56\" id:\"d1c0a0f8ed19aa05ddc81d1fdcf86c27ff5468159785b925d15ddba7999ec107\" pid:5472 exit_status:1 exited_at:{seconds:1757001098 nanos:926811482}"
Sep 4 15:51:38.960387 kubelet[2944]: I0904 15:51:38.959973 2944 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-d4584f895-4pckt" podStartSLOduration=47.051509382 podStartE2EDuration="1m25.959960783s" podCreationTimestamp="2025-09-04 15:50:13 +0000 UTC" firstStartedPulling="2025-09-04 15:50:57.548539349 +0000 UTC m=+59.612918341" lastFinishedPulling="2025-09-04 15:51:36.456990749 +0000 UTC m=+98.521369742" observedRunningTime="2025-09-04 15:51:38.95978013 +0000 UTC m=+101.024159133" watchObservedRunningTime="2025-09-04 15:51:38.959960783 +0000 UTC m=+101.024339780"
Sep 4 15:51:39.086136 containerd[1644]: time="2025-09-04T15:51:39.086101941Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a67a92bd6827038adced3f710c32cd48b0afdc16af28a971ec6a0278c302e56\" id:\"67487f516575a92b9f4d8693165f64e8d26eff5999f462a4cac805cf82601a26\" pid:5504 exit_status:1 exited_at:{seconds:1757001099 nanos:85923378}"
Sep 4 15:51:41.499382 containerd[1644]: time="2025-09-04T15:51:41.499340418Z" level=info msg="TaskExit event in podsandbox handler container_id:\"358255b197b7f03e0db7d74d0612764d8eeca09797367bb0a26ff1cc1bc9d0ec\" id:\"1adb8a594d5f6b7ad127e17885cea8e330e651cfe5d483b84b33c06101c03255\" pid:5550 exited_at:{seconds:1757001101 nanos:499022303}"
Sep 4 15:51:41.523513 containerd[1644]: time="2025-09-04T15:51:41.523481414Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a67a92bd6827038adced3f710c32cd48b0afdc16af28a971ec6a0278c302e56\" id:\"68adb1ce7b5aaea719786065e7b171ee8b9a241b64440c28dfb6296d4816afbc\" pid:5568 exit_status:1 exited_at:{seconds:1757001101 nanos:522594094}"
Sep 4 15:51:43.314450 systemd[1]: Started sshd@13-139.178.70.104:22-139.178.89.65:40084.service - OpenSSH per-connection server daemon (139.178.89.65:40084).
Sep 4 15:51:43.454147 sshd[5582]: Accepted publickey for core from 139.178.89.65 port 40084 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs
Sep 4 15:51:43.456300 sshd-session[5582]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:51:43.459723 systemd-logind[1618]: New session 16 of user core.
Sep 4 15:51:43.466370 systemd[1]: Started session-16.scope - Session 16 of User core.
Sep 4 15:51:44.458521 sshd[5585]: Connection closed by 139.178.89.65 port 40084
Sep 4 15:51:44.458456 sshd-session[5582]: pam_unix(sshd:session): session closed for user core
Sep 4 15:51:44.461761 systemd[1]: sshd@13-139.178.70.104:22-139.178.89.65:40084.service: Deactivated successfully.
Sep 4 15:51:44.463227 systemd[1]: session-16.scope: Deactivated successfully.
Sep 4 15:51:44.463940 systemd-logind[1618]: Session 16 logged out. Waiting for processes to exit.
Sep 4 15:51:44.465762 systemd-logind[1618]: Removed session 16.
Sep 4 15:51:45.843758 containerd[1644]: time="2025-09-04T15:51:45.843717254Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:51:45.847244 containerd[1644]: time="2025-09-04T15:51:45.847229346Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527"
Sep 4 15:51:45.853616 containerd[1644]: time="2025-09-04T15:51:45.852579673Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:51:45.854493 containerd[1644]: time="2025-09-04T15:51:45.853968659Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:51:45.854631 containerd[1644]: time="2025-09-04T15:51:45.854611091Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 9.397441861s"
Sep 4 15:51:45.854684 containerd[1644]: time="2025-09-04T15:51:45.854676083Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\""
Sep 4 15:51:45.892276 containerd[1644]: time="2025-09-04T15:51:45.892253151Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\""
Sep 4 15:51:45.957185 containerd[1644]: time="2025-09-04T15:51:45.951895115Z" level=info msg="CreateContainer within sandbox \"cc7243737c7f8e8c30825d063e35473f79e42f9dbaafab459135ca782b3cd1ec\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
Sep 4 15:51:45.982338 containerd[1644]: time="2025-09-04T15:51:45.981751969Z" level=info msg="Container b670e3e1631d2cb823fb57edbaae1c9a1472a8b9ce960d596d162816779d90f4: CDI devices from CRI Config.CDIDevices: []"
Sep 4 15:51:45.987472 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4155187455.mount: Deactivated successfully.
Sep 4 15:51:46.008725 containerd[1644]: time="2025-09-04T15:51:46.008698150Z" level=info msg="CreateContainer within sandbox \"cc7243737c7f8e8c30825d063e35473f79e42f9dbaafab459135ca782b3cd1ec\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"b670e3e1631d2cb823fb57edbaae1c9a1472a8b9ce960d596d162816779d90f4\""
Sep 4 15:51:46.011142 containerd[1644]: time="2025-09-04T15:51:46.010601968Z" level=info msg="StartContainer for \"b670e3e1631d2cb823fb57edbaae1c9a1472a8b9ce960d596d162816779d90f4\""
Sep 4 15:51:46.012707 containerd[1644]: time="2025-09-04T15:51:46.012669029Z" level=info msg="connecting to shim b670e3e1631d2cb823fb57edbaae1c9a1472a8b9ce960d596d162816779d90f4" address="unix:///run/containerd/s/97497b2d5e22a23c7238768cc7060358cd1520c3b7c9660d0e99b09f5a6bd08b" protocol=ttrpc version=3
Sep 4 15:51:46.029736 systemd[1]: Started cri-containerd-b670e3e1631d2cb823fb57edbaae1c9a1472a8b9ce960d596d162816779d90f4.scope - libcontainer container b670e3e1631d2cb823fb57edbaae1c9a1472a8b9ce960d596d162816779d90f4.
Sep 4 15:51:46.077547 containerd[1644]: time="2025-09-04T15:51:46.077519840Z" level=info msg="StartContainer for \"b670e3e1631d2cb823fb57edbaae1c9a1472a8b9ce960d596d162816779d90f4\" returns successfully"
Sep 4 15:51:49.421170 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3400263149.mount: Deactivated successfully.
Sep 4 15:51:49.453880 containerd[1644]: time="2025-09-04T15:51:49.453723311Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:51:49.454638 containerd[1644]: time="2025-09-04T15:51:49.454339938Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545"
Sep 4 15:51:49.454638 containerd[1644]: time="2025-09-04T15:51:49.454560311Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:51:49.455655 containerd[1644]: time="2025-09-04T15:51:49.455634101Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:51:49.456036 containerd[1644]: time="2025-09-04T15:51:49.456019893Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 3.563622879s"
Sep 4 15:51:49.456068 containerd[1644]: time="2025-09-04T15:51:49.456038148Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\""
Sep 4 15:51:49.457248 containerd[1644]: time="2025-09-04T15:51:49.456978931Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\""
Sep 4 15:51:49.457635 containerd[1644]: time="2025-09-04T15:51:49.457622978Z" level=info msg="CreateContainer within sandbox \"2d7fad9ddad72a61de9ffcf40da04b33ae35ff3c02c9456f3fd10755e713851b\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}"
Sep 4 15:51:49.471265 containerd[1644]: time="2025-09-04T15:51:49.471245160Z" level=info msg="Container 356d3a07cebfe977c2bcd9e6bc1791047a36622813e3fc1f1a7c91dbf52f89e1: CDI devices from CRI Config.CDIDevices: []"
Sep 4 15:51:49.473886 systemd[1]: Started sshd@14-139.178.70.104:22-139.178.89.65:40092.service - OpenSSH per-connection server daemon (139.178.89.65:40092).
Sep 4 15:51:49.487481 containerd[1644]: time="2025-09-04T15:51:49.487463845Z" level=info msg="CreateContainer within sandbox \"2d7fad9ddad72a61de9ffcf40da04b33ae35ff3c02c9456f3fd10755e713851b\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"356d3a07cebfe977c2bcd9e6bc1791047a36622813e3fc1f1a7c91dbf52f89e1\""
Sep 4 15:51:49.491457 containerd[1644]: time="2025-09-04T15:51:49.491446347Z" level=info msg="StartContainer for \"356d3a07cebfe977c2bcd9e6bc1791047a36622813e3fc1f1a7c91dbf52f89e1\""
Sep 4 15:51:49.492406 containerd[1644]: time="2025-09-04T15:51:49.492392891Z" level=info msg="connecting to shim 356d3a07cebfe977c2bcd9e6bc1791047a36622813e3fc1f1a7c91dbf52f89e1" address="unix:///run/containerd/s/9e9a45ba2e00da8a60d953b45b8be0ce494795ca784a11ff7b2396e004f6e357" protocol=ttrpc version=3
Sep 4 15:51:49.565806 systemd[1]: Started cri-containerd-356d3a07cebfe977c2bcd9e6bc1791047a36622813e3fc1f1a7c91dbf52f89e1.scope - libcontainer container 356d3a07cebfe977c2bcd9e6bc1791047a36622813e3fc1f1a7c91dbf52f89e1.
Sep 4 15:51:49.618467 sshd[5644]: Accepted publickey for core from 139.178.89.65 port 40092 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs
Sep 4 15:51:49.621037 sshd-session[5644]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:51:49.628464 systemd-logind[1618]: New session 17 of user core.
Sep 4 15:51:49.632246 systemd[1]: Started session-17.scope - Session 17 of User core.
Sep 4 15:51:49.719545 containerd[1644]: time="2025-09-04T15:51:49.719463882Z" level=info msg="StartContainer for \"356d3a07cebfe977c2bcd9e6bc1791047a36622813e3fc1f1a7c91dbf52f89e1\" returns successfully"
Sep 4 15:51:50.687282 sshd[5672]: Connection closed by 139.178.89.65 port 40092
Sep 4 15:51:50.719865 sshd-session[5644]: pam_unix(sshd:session): session closed for user core
Sep 4 15:51:50.727414 systemd[1]: sshd@14-139.178.70.104:22-139.178.89.65:40092.service: Deactivated successfully.
Sep 4 15:51:50.729763 systemd[1]: session-17.scope: Deactivated successfully.
Sep 4 15:51:50.731952 systemd-logind[1618]: Session 17 logged out. Waiting for processes to exit.
Sep 4 15:51:50.733000 systemd-logind[1618]: Removed session 17.
Sep 4 15:51:51.655278 containerd[1644]: time="2025-09-04T15:51:51.655242297Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:51:51.660054 containerd[1644]: time="2025-09-04T15:51:51.660031075Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542"
Sep 4 15:51:51.674832 containerd[1644]: time="2025-09-04T15:51:51.674694124Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:51:51.682068 containerd[1644]: time="2025-09-04T15:51:51.682014860Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:51:51.682737 containerd[1644]: time="2025-09-04T15:51:51.682559746Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 2.225561312s"
Sep 4 15:51:51.682737 containerd[1644]: time="2025-09-04T15:51:51.682615469Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\""
Sep 4 15:51:51.999849 containerd[1644]: time="2025-09-04T15:51:51.999778082Z" level=info msg="CreateContainer within sandbox \"cc7243737c7f8e8c30825d063e35473f79e42f9dbaafab459135ca782b3cd1ec\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Sep 4 15:51:52.323160 containerd[1644]: time="2025-09-04T15:51:52.322908560Z" level=info msg="Container 7d71a17a8915a44df8d65a99ff499449ce0ca5ee823bab1bf300cc5776a89d70: CDI devices from CRI Config.CDIDevices: []"
Sep 4 15:51:52.341307 containerd[1644]: time="2025-09-04T15:51:52.341212147Z" level=info msg="CreateContainer within sandbox \"cc7243737c7f8e8c30825d063e35473f79e42f9dbaafab459135ca782b3cd1ec\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"7d71a17a8915a44df8d65a99ff499449ce0ca5ee823bab1bf300cc5776a89d70\""
Sep 4 15:51:52.343750 containerd[1644]: time="2025-09-04T15:51:52.342652130Z" level=info msg="StartContainer for \"7d71a17a8915a44df8d65a99ff499449ce0ca5ee823bab1bf300cc5776a89d70\""
Sep 4 15:51:52.343750 containerd[1644]: time="2025-09-04T15:51:52.343580457Z" level=info msg="connecting to shim 7d71a17a8915a44df8d65a99ff499449ce0ca5ee823bab1bf300cc5776a89d70" address="unix:///run/containerd/s/97497b2d5e22a23c7238768cc7060358cd1520c3b7c9660d0e99b09f5a6bd08b" protocol=ttrpc version=3
Sep 4 15:51:52.374371 systemd[1]: Started cri-containerd-7d71a17a8915a44df8d65a99ff499449ce0ca5ee823bab1bf300cc5776a89d70.scope - libcontainer container 7d71a17a8915a44df8d65a99ff499449ce0ca5ee823bab1bf300cc5776a89d70.
Sep 4 15:51:52.437944 containerd[1644]: time="2025-09-04T15:51:52.437924413Z" level=info msg="StartContainer for \"7d71a17a8915a44df8d65a99ff499449ce0ca5ee823bab1bf300cc5776a89d70\" returns successfully"
Sep 4 15:51:53.050990 kubelet[2944]: I0904 15:51:53.042206 2944 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-5ff586f7f7-s9x8z" podStartSLOduration=4.428143787 podStartE2EDuration="58.993112761s" podCreationTimestamp="2025-09-04 15:50:54 +0000 UTC" firstStartedPulling="2025-09-04 15:50:54.891758864 +0000 UTC m=+56.956137856" lastFinishedPulling="2025-09-04 15:51:49.456727837 +0000 UTC m=+111.521106830" observedRunningTime="2025-09-04 15:51:50.794684579 +0000 UTC m=+112.859063581" watchObservedRunningTime="2025-09-04 15:51:52.993112761 +0000 UTC m=+115.057491758"
Sep 4 15:51:53.655103 kubelet[2944]: I0904 15:51:53.651410 2944 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Sep 4 15:51:53.673868 kubelet[2944]: I0904 15:51:53.673672 2944 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Sep 4 15:51:54.257994 containerd[1644]: time="2025-09-04T15:51:54.257963197Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0b5f35d9717b9ceb20c0cdac6114482b414ac97af9065c684cf33368584bf322\" id:\"82bb4431a93b87f5a8c86390081a4c39b35b66cc63364d2da2c87b30e9c63f14\" pid:5739 exited_at:{seconds:1757001114 nanos:202716466}"
Sep 4 15:51:55.727167 systemd[1]: Started sshd@15-139.178.70.104:22-139.178.89.65:55270.service - OpenSSH per-connection server daemon (139.178.89.65:55270).
Sep 4 15:51:55.852832 sshd[5753]: Accepted publickey for core from 139.178.89.65 port 55270 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs
Sep 4 15:51:55.855825 sshd-session[5753]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:51:55.862155 systemd-logind[1618]: New session 18 of user core.
Sep 4 15:51:55.867235 systemd[1]: Started session-18.scope - Session 18 of User core.
Sep 4 15:51:56.988882 containerd[1644]: time="2025-09-04T15:51:56.988859820Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a67a92bd6827038adced3f710c32cd48b0afdc16af28a971ec6a0278c302e56\" id:\"0ccc0d638034adc3f9b3efd8edc91f8abbeb4ed58e3bd3da062063377618df4c\" pid:5775 exited_at:{seconds:1757001116 nanos:988669088}"
Sep 4 15:51:57.112685 sshd[5756]: Connection closed by 139.178.89.65 port 55270
Sep 4 15:51:57.111451 sshd-session[5753]: pam_unix(sshd:session): session closed for user core
Sep 4 15:51:57.113664 systemd-logind[1618]: Session 18 logged out. Waiting for processes to exit.
Sep 4 15:51:57.114251 systemd[1]: sshd@15-139.178.70.104:22-139.178.89.65:55270.service: Deactivated successfully.
Sep 4 15:51:57.116059 systemd[1]: session-18.scope: Deactivated successfully.
Sep 4 15:51:57.117909 systemd-logind[1618]: Removed session 18.
Sep 4 15:52:02.121293 systemd[1]: Started sshd@16-139.178.70.104:22-139.178.89.65:43538.service - OpenSSH per-connection server daemon (139.178.89.65:43538).
Sep 4 15:52:02.702817 sshd[5794]: Accepted publickey for core from 139.178.89.65 port 43538 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs
Sep 4 15:52:02.730586 sshd-session[5794]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:52:02.760193 systemd-logind[1618]: New session 19 of user core.
Sep 4 15:52:02.768216 systemd[1]: Started session-19.scope - Session 19 of User core.
Sep 4 15:52:04.793174 sshd[5797]: Connection closed by 139.178.89.65 port 43538
Sep 4 15:52:04.797902 sshd-session[5794]: pam_unix(sshd:session): session closed for user core
Sep 4 15:52:04.805795 systemd[1]: Started sshd@17-139.178.70.104:22-139.178.89.65:43550.service - OpenSSH per-connection server daemon (139.178.89.65:43550).
Sep 4 15:52:04.810535 systemd[1]: sshd@16-139.178.70.104:22-139.178.89.65:43538.service: Deactivated successfully.
Sep 4 15:52:04.817899 systemd[1]: session-19.scope: Deactivated successfully.
Sep 4 15:52:04.826782 systemd-logind[1618]: Session 19 logged out. Waiting for processes to exit.
Sep 4 15:52:04.829477 systemd-logind[1618]: Removed session 19.
Sep 4 15:52:04.876877 sshd[5808]: Accepted publickey for core from 139.178.89.65 port 43550 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs
Sep 4 15:52:04.877849 sshd-session[5808]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:52:04.881074 systemd-logind[1618]: New session 20 of user core.
Sep 4 15:52:04.890474 systemd[1]: Started session-20.scope - Session 20 of User core.
Sep 4 15:52:05.661709 sshd[5814]: Connection closed by 139.178.89.65 port 43550
Sep 4 15:52:05.663287 sshd-session[5808]: pam_unix(sshd:session): session closed for user core
Sep 4 15:52:05.674400 systemd[1]: sshd@17-139.178.70.104:22-139.178.89.65:43550.service: Deactivated successfully.
Sep 4 15:52:05.676493 systemd[1]: session-20.scope: Deactivated successfully.
Sep 4 15:52:05.677514 systemd-logind[1618]: Session 20 logged out. Waiting for processes to exit.
Sep 4 15:52:05.681472 systemd[1]: Started sshd@18-139.178.70.104:22-139.178.89.65:43558.service - OpenSSH per-connection server daemon (139.178.89.65:43558).
Sep 4 15:52:05.687523 systemd-logind[1618]: Removed session 20.
Sep 4 15:52:05.809920 sshd[5824]: Accepted publickey for core from 139.178.89.65 port 43558 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs
Sep 4 15:52:05.810872 sshd-session[5824]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:52:05.813851 systemd-logind[1618]: New session 21 of user core.
Sep 4 15:52:05.819204 systemd[1]: Started session-21.scope - Session 21 of User core.
Sep 4 15:52:11.892358 sshd[5827]: Connection closed by 139.178.89.65 port 43558
Sep 4 15:52:11.956202 systemd[1]: sshd@18-139.178.70.104:22-139.178.89.65:43558.service: Deactivated successfully.
Sep 4 15:52:11.890693 sshd-session[5824]: pam_unix(sshd:session): session closed for user core
Sep 4 15:52:11.957683 systemd[1]: session-21.scope: Deactivated successfully.
Sep 4 15:52:11.957965 systemd[1]: session-21.scope: Consumed 429ms CPU time, 78.1M memory peak.
Sep 4 15:52:11.959778 systemd-logind[1618]: Session 21 logged out. Waiting for processes to exit.
Sep 4 15:52:11.974500 systemd[1]: Started sshd@19-139.178.70.104:22-139.178.89.65:40210.service - OpenSSH per-connection server daemon (139.178.89.65:40210).
Sep 4 15:52:11.976343 systemd-logind[1618]: Removed session 21.
Sep 4 15:52:12.184588 sshd[5868]: Accepted publickey for core from 139.178.89.65 port 40210 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs
Sep 4 15:52:12.189838 sshd-session[5868]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:52:12.209222 systemd-logind[1618]: New session 22 of user core.
Sep 4 15:52:12.216473 systemd[1]: Started session-22.scope - Session 22 of User core.
Sep 4 15:52:15.008040 containerd[1644]: time="2025-09-04T15:52:15.007989079Z" level=info msg="TaskExit event in podsandbox handler container_id:\"358255b197b7f03e0db7d74d0612764d8eeca09797367bb0a26ff1cc1bc9d0ec\" id:\"a34de0c9fe030078901acffeb5462bb6d6d440ce8d2bc88a834cd723e58283aa\" pid:5903 exited_at:{seconds:1757001135 nanos:3234501}"
Sep 4 15:52:15.136469 kubelet[2944]: E0904 15:52:15.122003 2944 kubelet.go:2512] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="7.042s"
Sep 4 15:52:15.964707 containerd[1644]: time="2025-09-04T15:52:15.964302340Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a67a92bd6827038adced3f710c32cd48b0afdc16af28a971ec6a0278c302e56\" id:\"60b3b3f7b16e72c24ec7697776fe4e12f640766e3f3502a7b76e62a0649dcc6d\" pid:5904 exited_at:{seconds:1757001135 nanos:962599459}"
Sep 4 15:52:16.304260 kubelet[2944]: I0904 15:52:16.301237 2944 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-x2d7v" podStartSLOduration=69.050010266 podStartE2EDuration="2m1.280023186s" podCreationTimestamp="2025-09-04 15:50:15 +0000 UTC" firstStartedPulling="2025-09-04 15:50:59.653282521 +0000 UTC m=+61.717661514" lastFinishedPulling="2025-09-04 15:51:51.883295441 +0000 UTC m=+113.947674434" observedRunningTime="2025-09-04 15:51:53.051154359 +0000 UTC m=+115.115533352" watchObservedRunningTime="2025-09-04 15:52:16.280023186 +0000 UTC m=+138.344402184"
Sep 4 15:52:16.954374 sshd[5871]: Connection closed by 139.178.89.65 port 40210
Sep 4 15:52:16.965279 sshd-session[5868]: pam_unix(sshd:session): session closed for user core
Sep 4 15:52:17.015733 systemd[1]: sshd@19-139.178.70.104:22-139.178.89.65:40210.service: Deactivated successfully.
Sep 4 15:52:17.018490 systemd[1]: session-22.scope: Deactivated successfully.
Sep 4 15:52:17.019150 systemd[1]: session-22.scope: Consumed 678ms CPU time, 65.4M memory peak.
Sep 4 15:52:17.021105 systemd-logind[1618]: Session 22 logged out. Waiting for processes to exit.
Sep 4 15:52:17.025995 systemd[1]: Started sshd@20-139.178.70.104:22-139.178.89.65:40224.service - OpenSSH per-connection server daemon (139.178.89.65:40224).
Sep 4 15:52:17.027369 systemd-logind[1618]: Removed session 22.
Sep 4 15:52:17.159127 sshd[5950]: Accepted publickey for core from 139.178.89.65 port 40224 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs
Sep 4 15:52:17.163570 sshd-session[5950]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:52:17.172775 systemd-logind[1618]: New session 23 of user core.
Sep 4 15:52:17.176731 systemd[1]: Started session-23.scope - Session 23 of User core.
Sep 4 15:52:17.543235 sshd[5953]: Connection closed by 139.178.89.65 port 40224
Sep 4 15:52:17.543560 sshd-session[5950]: pam_unix(sshd:session): session closed for user core
Sep 4 15:52:17.547427 systemd[1]: sshd@20-139.178.70.104:22-139.178.89.65:40224.service: Deactivated successfully.
Sep 4 15:52:17.549910 systemd[1]: session-23.scope: Deactivated successfully.
Sep 4 15:52:17.551311 systemd-logind[1618]: Session 23 logged out. Waiting for processes to exit.
Sep 4 15:52:17.552176 systemd-logind[1618]: Removed session 23.
Sep 4 15:52:22.567038 systemd[1]: Started sshd@21-139.178.70.104:22-139.178.89.65:44064.service - OpenSSH per-connection server daemon (139.178.89.65:44064).
Sep 4 15:52:22.826131 sshd[5973]: Accepted publickey for core from 139.178.89.65 port 44064 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs
Sep 4 15:52:22.828840 sshd-session[5973]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:52:22.833273 systemd-logind[1618]: New session 24 of user core.
Sep 4 15:52:22.839420 systemd[1]: Started session-24.scope - Session 24 of User core.
Sep 4 15:52:22.870911 containerd[1644]: time="2025-09-04T15:52:22.870866424Z" level=info msg="TaskExit event in podsandbox handler container_id:\"358255b197b7f03e0db7d74d0612764d8eeca09797367bb0a26ff1cc1bc9d0ec\" id:\"a859ce38bfecd6d7b3548a5be77a4e68b129cc78efee4bd5cdba70af3d13413a\" pid:5981 exited_at:{seconds:1757001142 nanos:860162796}"
Sep 4 15:52:24.580676 containerd[1644]: time="2025-09-04T15:52:24.580627797Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0b5f35d9717b9ceb20c0cdac6114482b414ac97af9065c684cf33368584bf322\" id:\"01d61df3bc3860798fad67e3fe7f2205ceee87e999c95b015d5ec2d2f9612db6\" pid:6013 exited_at:{seconds:1757001144 nanos:580161238}"
Sep 4 15:52:24.789521 sshd[5993]: Connection closed by 139.178.89.65 port 44064
Sep 4 15:52:24.789993 sshd-session[5973]: pam_unix(sshd:session): session closed for user core
Sep 4 15:52:24.819621 systemd[1]: sshd@21-139.178.70.104:22-139.178.89.65:44064.service: Deactivated successfully.
Sep 4 15:52:24.821634 systemd[1]: session-24.scope: Deactivated successfully.
Sep 4 15:52:24.823680 systemd-logind[1618]: Session 24 logged out. Waiting for processes to exit.
Sep 4 15:52:24.824916 systemd-logind[1618]: Removed session 24.
Sep 4 15:52:29.862018 systemd[1]: Started sshd@22-139.178.70.104:22-139.178.89.65:44072.service - OpenSSH per-connection server daemon (139.178.89.65:44072).
Sep 4 15:52:30.408461 sshd[6051]: Accepted publickey for core from 139.178.89.65 port 44072 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs
Sep 4 15:52:30.458481 sshd-session[6051]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:52:30.477756 systemd-logind[1618]: New session 25 of user core.
Sep 4 15:52:30.484261 systemd[1]: Started session-25.scope - Session 25 of User core.
Sep 4 15:52:32.881824 sshd[6054]: Connection closed by 139.178.89.65 port 44072
Sep 4 15:52:32.882370 sshd-session[6051]: pam_unix(sshd:session): session closed for user core
Sep 4 15:52:32.886171 systemd[1]: sshd@22-139.178.70.104:22-139.178.89.65:44072.service: Deactivated successfully.
Sep 4 15:52:32.888418 systemd[1]: session-25.scope: Deactivated successfully.
Sep 4 15:52:32.890908 systemd-logind[1618]: Session 25 logged out. Waiting for processes to exit.
Sep 4 15:52:32.891903 systemd-logind[1618]: Removed session 25.
Sep 4 15:52:37.891651 systemd[1]: Started sshd@23-139.178.70.104:22-139.178.89.65:52158.service - OpenSSH per-connection server daemon (139.178.89.65:52158).
Sep 4 15:52:37.991042 sshd[6071]: Accepted publickey for core from 139.178.89.65 port 52158 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs
Sep 4 15:52:37.992005 sshd-session[6071]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:52:37.996200 systemd-logind[1618]: New session 26 of user core.
Sep 4 15:52:38.001654 systemd[1]: Started session-26.scope - Session 26 of User core.
Sep 4 15:52:38.469679 sshd[6074]: Connection closed by 139.178.89.65 port 52158
Sep 4 15:52:38.471259 sshd-session[6071]: pam_unix(sshd:session): session closed for user core
Sep 4 15:52:38.475474 systemd[1]: sshd@23-139.178.70.104:22-139.178.89.65:52158.service: Deactivated successfully.
Sep 4 15:52:38.476969 systemd[1]: session-26.scope: Deactivated successfully.
Sep 4 15:52:38.477519 systemd-logind[1618]: Session 26 logged out. Waiting for processes to exit.
Sep 4 15:52:38.478435 systemd-logind[1618]: Removed session 26.