Sep 4 15:42:28.723038 kernel: Linux version 6.12.44-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Thu Sep 4 13:44:59 -00 2025 Sep 4 15:42:28.723054 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=36c924095f449c8931a6685ec70d72df97f8ad57d1c78208ae0ead8cae8f5127 Sep 4 15:42:28.723061 kernel: Disabled fast string operations Sep 4 15:42:28.723065 kernel: BIOS-provided physical RAM map: Sep 4 15:42:28.723069 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable Sep 4 15:42:28.723073 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved Sep 4 15:42:28.723079 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved Sep 4 15:42:28.723083 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable Sep 4 15:42:28.723088 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data Sep 4 15:42:28.723092 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS Sep 4 15:42:28.723096 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable Sep 4 15:42:28.723100 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved Sep 4 15:42:28.723104 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved Sep 4 15:42:28.723109 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved Sep 4 15:42:28.723115 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved Sep 4 15:42:28.723120 kernel: NX (Execute Disable) protection: active Sep 4 15:42:28.723124 kernel: APIC: Static calls initialized Sep 4 15:42:28.723129 kernel: SMBIOS 2.7 present. Sep 4 15:42:28.723134 kernel: DMI: VMware, Inc. 
VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020 Sep 4 15:42:28.723139 kernel: DMI: Memory slots populated: 1/128 Sep 4 15:42:28.723145 kernel: vmware: hypercall mode: 0x00 Sep 4 15:42:28.723150 kernel: Hypervisor detected: VMware Sep 4 15:42:28.723155 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz Sep 4 15:42:28.723160 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz Sep 4 15:42:28.723164 kernel: vmware: using clock offset of 4108065265 ns Sep 4 15:42:28.723169 kernel: tsc: Detected 3408.000 MHz processor Sep 4 15:42:28.723175 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Sep 4 15:42:28.723180 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Sep 4 15:42:28.723185 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000 Sep 4 15:42:28.723189 kernel: total RAM covered: 3072M Sep 4 15:42:28.723195 kernel: Found optimal setting for mtrr clean up Sep 4 15:42:28.723203 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G Sep 4 15:42:28.723208 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs Sep 4 15:42:28.723213 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Sep 4 15:42:28.723218 kernel: Using GB pages for direct mapping Sep 4 15:42:28.723223 kernel: ACPI: Early table checksum verification disabled Sep 4 15:42:28.723228 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD ) Sep 4 15:42:28.723233 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272) Sep 4 15:42:28.723237 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240) Sep 4 15:42:28.723244 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001) Sep 4 15:42:28.723250 kernel: ACPI: FACS 0x000000007FEFFFC0 000040 Sep 4 15:42:28.723256 kernel: ACPI: FACS 0x000000007FEFFFC0 000040 Sep 4 15:42:28.723261 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001) Sep 4 15:42:28.723266 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? 
APIC 06040000 LTP 00000000) Sep 4 15:42:28.723271 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001) Sep 4 15:42:28.723277 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001) Sep 4 15:42:28.723282 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001) Sep 4 15:42:28.723288 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001) Sep 4 15:42:28.723293 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66] Sep 4 15:42:28.723298 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72] Sep 4 15:42:28.723303 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff] Sep 4 15:42:28.723308 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff] Sep 4 15:42:28.723313 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54] Sep 4 15:42:28.723318 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c] Sep 4 15:42:28.723324 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea] Sep 4 15:42:28.723330 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe] Sep 4 15:42:28.723335 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756] Sep 4 15:42:28.723340 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e] Sep 4 15:42:28.723345 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff] Sep 4 15:42:28.723350 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff] Sep 4 15:42:28.723355 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug Sep 4 15:42:28.723360 kernel: NUMA: Node 0 [mem 0x00001000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00001000-0x7fffffff] Sep 4 15:42:28.723365 kernel: NODE_DATA(0) allocated [mem 0x7fff8dc0-0x7fffffff] Sep 4 15:42:28.723385 kernel: Zone ranges: Sep 4 15:42:28.723391 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Sep 4 15:42:28.723396 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff] Sep 4 15:42:28.723401 kernel: Normal empty Sep 4 15:42:28.723406 kernel: Device empty Sep 4 15:42:28.723411 kernel: Movable zone start for each node Sep 4 15:42:28.723416 kernel: Early memory node ranges Sep 4 15:42:28.723421 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff] Sep 4 15:42:28.723426 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff] Sep 4 15:42:28.723431 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff] Sep 4 15:42:28.723438 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff] Sep 4 15:42:28.723443 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Sep 4 15:42:28.723448 kernel: On node 0, zone DMA: 98 pages in unavailable ranges Sep 4 15:42:28.723453 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges Sep 4 15:42:28.723458 kernel: ACPI: PM-Timer IO Port: 0x1008 Sep 4 15:42:28.723463 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1]) Sep 4 15:42:28.723469 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1]) Sep 4 15:42:28.723474 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1]) Sep 4 15:42:28.723479 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1]) Sep 4 15:42:28.723485 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1]) Sep 4 15:42:28.723490 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1]) Sep 4 15:42:28.723495 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1]) Sep 4 15:42:28.723500 kernel: ACPI: 
LAPIC_NMI (acpi_id[0x07] high edge lint[0x1]) Sep 4 15:42:28.723505 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1]) Sep 4 15:42:28.723510 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1]) Sep 4 15:42:28.723515 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1]) Sep 4 15:42:28.723520 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1]) Sep 4 15:42:28.723525 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1]) Sep 4 15:42:28.723530 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1]) Sep 4 15:42:28.723536 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1]) Sep 4 15:42:28.723541 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1]) Sep 4 15:42:28.723546 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1]) Sep 4 15:42:28.723551 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1]) Sep 4 15:42:28.723556 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1]) Sep 4 15:42:28.723561 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1]) Sep 4 15:42:28.723566 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1]) Sep 4 15:42:28.723571 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1]) Sep 4 15:42:28.723576 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1]) Sep 4 15:42:28.723582 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1]) Sep 4 15:42:28.723587 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1]) Sep 4 15:42:28.723592 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1]) Sep 4 15:42:28.723597 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1]) Sep 4 15:42:28.723602 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1]) Sep 4 15:42:28.723607 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1]) Sep 4 15:42:28.723612 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1]) Sep 4 15:42:28.723617 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1]) Sep 4 15:42:28.723622 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1]) Sep 4 15:42:28.723627 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1]) Sep 4 15:42:28.723633 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1]) Sep 4 15:42:28.723639 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1]) Sep 4 15:42:28.723644 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1]) Sep 4 15:42:28.723649 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1]) Sep 4 15:42:28.723654 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1]) Sep 4 15:42:28.723659 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1]) Sep 4 15:42:28.723665 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1]) Sep 4 15:42:28.723674 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1]) Sep 4 15:42:28.723679 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1]) Sep 4 15:42:28.723684 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1]) Sep 4 15:42:28.723690 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1]) Sep 4 15:42:28.723696 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1]) Sep 4 15:42:28.723701 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1]) Sep 4 15:42:28.723707 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1]) Sep 4 15:42:28.723712 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1]) Sep 4 15:42:28.723718 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1]) Sep 4 15:42:28.723723 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1]) Sep 4 15:42:28.723728 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] 
high edge lint[0x1]) Sep 4 15:42:28.723734 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1]) Sep 4 15:42:28.723740 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1]) Sep 4 15:42:28.723745 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1]) Sep 4 15:42:28.723751 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1]) Sep 4 15:42:28.723756 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1]) Sep 4 15:42:28.723761 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1]) Sep 4 15:42:28.723766 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1]) Sep 4 15:42:28.723772 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1]) Sep 4 15:42:28.723777 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1]) Sep 4 15:42:28.723783 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1]) Sep 4 15:42:28.723789 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1]) Sep 4 15:42:28.723794 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1]) Sep 4 15:42:28.723800 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1]) Sep 4 15:42:28.723805 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1]) Sep 4 15:42:28.723810 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1]) Sep 4 15:42:28.723816 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1]) Sep 4 15:42:28.723821 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1]) Sep 4 15:42:28.723826 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1]) Sep 4 15:42:28.723831 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1]) Sep 4 15:42:28.723837 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1]) Sep 4 15:42:28.723843 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1]) Sep 4 15:42:28.723849 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1]) Sep 4 15:42:28.723854 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1]) Sep 4 15:42:28.723859 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1]) Sep 4 15:42:28.723865 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1]) Sep 4 15:42:28.723870 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1]) Sep 4 15:42:28.723876 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1]) Sep 4 15:42:28.723881 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1]) Sep 4 15:42:28.723886 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1]) Sep 4 15:42:28.723892 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1]) Sep 4 15:42:28.723898 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1]) Sep 4 15:42:28.723903 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1]) Sep 4 15:42:28.723909 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1]) Sep 4 15:42:28.723914 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1]) Sep 4 15:42:28.723919 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1]) Sep 4 15:42:28.723924 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1]) Sep 4 15:42:28.723930 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1]) Sep 4 15:42:28.723935 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1]) Sep 4 15:42:28.723941 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1]) Sep 4 15:42:28.723947 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1]) Sep 4 15:42:28.723952 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1]) Sep 4 15:42:28.723958 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1]) Sep 4 15:42:28.723963 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1]) Sep 4 
15:42:28.724011 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1]) Sep 4 15:42:28.724018 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1]) Sep 4 15:42:28.724024 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1]) Sep 4 15:42:28.724029 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1]) Sep 4 15:42:28.724035 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1]) Sep 4 15:42:28.724040 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1]) Sep 4 15:42:28.724048 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1]) Sep 4 15:42:28.724053 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1]) Sep 4 15:42:28.724058 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1]) Sep 4 15:42:28.724064 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1]) Sep 4 15:42:28.724069 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1]) Sep 4 15:42:28.724075 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1]) Sep 4 15:42:28.724080 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1]) Sep 4 15:42:28.724085 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1]) Sep 4 15:42:28.724091 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1]) Sep 4 15:42:28.724096 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1]) Sep 4 15:42:28.724103 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1]) Sep 4 15:42:28.724108 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1]) Sep 4 15:42:28.724114 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1]) Sep 4 15:42:28.724119 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1]) Sep 4 15:42:28.724124 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1]) Sep 4 15:42:28.724130 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1]) Sep 4 15:42:28.724135 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1]) Sep 4 15:42:28.724141 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1]) Sep 4 15:42:28.724146 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1]) Sep 4 15:42:28.724151 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1]) Sep 4 15:42:28.724158 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1]) Sep 4 15:42:28.724164 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1]) Sep 4 15:42:28.724169 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1]) Sep 4 15:42:28.724174 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1]) Sep 4 15:42:28.724180 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1]) Sep 4 15:42:28.724185 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1]) Sep 4 15:42:28.724190 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1]) Sep 4 15:42:28.724196 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1]) Sep 4 15:42:28.724201 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23 Sep 4 15:42:28.724207 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge) Sep 4 15:42:28.724213 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Sep 4 15:42:28.724219 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000 Sep 4 15:42:28.724224 kernel: TSC deadline timer available Sep 4 15:42:28.724230 kernel: CPU topo: Max. logical packages: 128 Sep 4 15:42:28.724235 kernel: CPU topo: Max. logical dies: 128 Sep 4 15:42:28.724240 kernel: CPU topo: Max. dies per package: 1 Sep 4 15:42:28.724246 kernel: CPU topo: Max. threads per core: 1 Sep 4 15:42:28.724251 kernel: CPU topo: Num. cores per package: 1 Sep 4 15:42:28.724257 kernel: CPU topo: Num. 
threads per package: 1 Sep 4 15:42:28.724263 kernel: CPU topo: Allowing 2 present CPUs plus 126 hotplug CPUs Sep 4 15:42:28.724269 kernel: [mem 0x80000000-0xefffffff] available for PCI devices Sep 4 15:42:28.724274 kernel: Booting paravirtualized kernel on VMware hypervisor Sep 4 15:42:28.724280 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Sep 4 15:42:28.724285 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1 Sep 4 15:42:28.724291 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144 Sep 4 15:42:28.724296 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152 Sep 4 15:42:28.724302 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007 Sep 4 15:42:28.724307 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015 Sep 4 15:42:28.724314 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023 Sep 4 15:42:28.724319 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031 Sep 4 15:42:28.724325 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039 Sep 4 15:42:28.724330 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047 Sep 4 15:42:28.724335 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055 Sep 4 15:42:28.724341 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063 Sep 4 15:42:28.724346 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071 Sep 4 15:42:28.724351 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079 Sep 4 15:42:28.724357 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087 Sep 4 15:42:28.724363 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095 Sep 4 15:42:28.724369 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103 Sep 4 15:42:28.725393 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111 Sep 4 15:42:28.725399 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119 Sep 4 15:42:28.725404 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127 Sep 4 15:42:28.725410 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=36c924095f449c8931a6685ec70d72df97f8ad57d1c78208ae0ead8cae8f5127 Sep 4 15:42:28.725416 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Sep 4 15:42:28.725422 kernel: random: crng init done Sep 4 15:42:28.725429 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes Sep 4 15:42:28.725435 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes Sep 4 15:42:28.725440 kernel: printk: log_buf_len min size: 262144 bytes Sep 4 15:42:28.725446 kernel: printk: log_buf_len: 1048576 bytes Sep 4 15:42:28.725451 kernel: printk: early log buf free: 245592(93%) Sep 4 15:42:28.725457 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Sep 4 15:42:28.725462 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear) Sep 4 15:42:28.725468 kernel: Fallback order for Node 0: 0 Sep 4 15:42:28.725473 kernel: Built 1 zonelists, mobility grouping on. 
Total pages: 524157 Sep 4 15:42:28.725480 kernel: Policy zone: DMA32 Sep 4 15:42:28.725486 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Sep 4 15:42:28.725491 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1 Sep 4 15:42:28.725497 kernel: ftrace: allocating 40102 entries in 157 pages Sep 4 15:42:28.725502 kernel: ftrace: allocated 157 pages with 5 groups Sep 4 15:42:28.725508 kernel: Dynamic Preempt: voluntary Sep 4 15:42:28.725513 kernel: rcu: Preemptible hierarchical RCU implementation. Sep 4 15:42:28.725519 kernel: rcu: RCU event tracing is enabled. Sep 4 15:42:28.725525 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128. Sep 4 15:42:28.725531 kernel: Trampoline variant of Tasks RCU enabled. Sep 4 15:42:28.725537 kernel: Rude variant of Tasks RCU enabled. Sep 4 15:42:28.725542 kernel: Tracing variant of Tasks RCU enabled. Sep 4 15:42:28.725547 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Sep 4 15:42:28.725553 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128 Sep 4 15:42:28.725558 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Sep 4 15:42:28.725564 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Sep 4 15:42:28.725570 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Sep 4 15:42:28.725575 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16 Sep 4 15:42:28.725582 kernel: rcu: srcu_init: Setting srcu_struct sizes to big. Sep 4 15:42:28.725588 kernel: Console: colour VGA+ 80x25 Sep 4 15:42:28.725593 kernel: printk: legacy console [tty0] enabled Sep 4 15:42:28.725598 kernel: printk: legacy console [ttyS0] enabled Sep 4 15:42:28.725604 kernel: ACPI: Core revision 20240827 Sep 4 15:42:28.725610 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns Sep 4 15:42:28.725615 kernel: APIC: Switch to symmetric I/O mode setup Sep 4 15:42:28.725621 kernel: x2apic enabled Sep 4 15:42:28.725626 kernel: APIC: Switched APIC routing to: physical x2apic Sep 4 15:42:28.725632 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Sep 4 15:42:28.725639 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Sep 4 15:42:28.725644 kernel: Calibrating delay loop (skipped) preset value.. 
6816.00 BogoMIPS (lpj=3408000) Sep 4 15:42:28.725650 kernel: Disabled fast string operations Sep 4 15:42:28.725655 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8 Sep 4 15:42:28.725661 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4 Sep 4 15:42:28.725666 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Sep 4 15:42:28.725672 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit Sep 4 15:42:28.725677 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Sep 4 15:42:28.725684 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT Sep 4 15:42:28.725689 kernel: RETBleed: Mitigation: Enhanced IBRS Sep 4 15:42:28.725695 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Sep 4 15:42:28.725701 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Sep 4 15:42:28.725706 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode Sep 4 15:42:28.725712 kernel: SRBDS: Unknown: Dependent on hypervisor status Sep 4 15:42:28.725717 kernel: GDS: Unknown: Dependent on hypervisor status Sep 4 15:42:28.725723 kernel: active return thunk: its_return_thunk Sep 4 15:42:28.725728 kernel: ITS: Mitigation: Aligned branch/return thunks Sep 4 15:42:28.725735 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Sep 4 15:42:28.725740 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Sep 4 15:42:28.725746 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Sep 4 15:42:28.725751 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Sep 4 15:42:28.725757 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format. Sep 4 15:42:28.725762 kernel: Freeing SMP alternatives memory: 32K Sep 4 15:42:28.725768 kernel: pid_max: default: 131072 minimum: 1024 Sep 4 15:42:28.725773 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Sep 4 15:42:28.725779 kernel: landlock: Up and running. Sep 4 15:42:28.725785 kernel: SELinux: Initializing. Sep 4 15:42:28.725791 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Sep 4 15:42:28.725797 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Sep 4 15:42:28.725802 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd) Sep 4 15:42:28.725808 kernel: Performance Events: Skylake events, core PMU driver. Sep 4 15:42:28.725813 kernel: core: CPUID marked event: 'cpu cycles' unavailable Sep 4 15:42:28.725819 kernel: core: CPUID marked event: 'instructions' unavailable Sep 4 15:42:28.725824 kernel: core: CPUID marked event: 'bus cycles' unavailable Sep 4 15:42:28.725830 kernel: core: CPUID marked event: 'cache references' unavailable Sep 4 15:42:28.725836 kernel: core: CPUID marked event: 'cache misses' unavailable Sep 4 15:42:28.725841 kernel: core: CPUID marked event: 'branch instructions' unavailable Sep 4 15:42:28.725847 kernel: core: CPUID marked event: 'branch misses' unavailable Sep 4 15:42:28.725852 kernel: ... version: 1 Sep 4 15:42:28.725858 kernel: ... bit width: 48 Sep 4 15:42:28.725863 kernel: ... generic registers: 4 Sep 4 15:42:28.725869 kernel: ... value mask: 0000ffffffffffff Sep 4 15:42:28.725874 kernel: ... max period: 000000007fffffff Sep 4 15:42:28.725880 kernel: ... fixed-purpose events: 0 Sep 4 15:42:28.725886 kernel: ... 
event mask: 000000000000000f Sep 4 15:42:28.725892 kernel: signal: max sigframe size: 1776 Sep 4 15:42:28.725897 kernel: rcu: Hierarchical SRCU implementation. Sep 4 15:42:28.725903 kernel: rcu: Max phase no-delay instances is 400. Sep 4 15:42:28.725908 kernel: Timer migration: 3 hierarchy levels; 8 children per group; 3 crossnode level Sep 4 15:42:28.725914 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Sep 4 15:42:28.725919 kernel: smp: Bringing up secondary CPUs ... Sep 4 15:42:28.725925 kernel: smpboot: x86: Booting SMP configuration: Sep 4 15:42:28.725930 kernel: .... node #0, CPUs: #1 Sep 4 15:42:28.725937 kernel: Disabled fast string operations Sep 4 15:42:28.725942 kernel: smp: Brought up 1 node, 2 CPUs Sep 4 15:42:28.725948 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS) Sep 4 15:42:28.725953 kernel: Memory: 1924260K/2096628K available (14336K kernel code, 2428K rwdata, 9988K rodata, 54076K init, 2892K bss, 160996K reserved, 0K cma-reserved) Sep 4 15:42:28.725959 kernel: devtmpfs: initialized Sep 4 15:42:28.725964 kernel: x86/mm: Memory block size: 128MB Sep 4 15:42:28.725970 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes) Sep 4 15:42:28.725976 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Sep 4 15:42:28.725981 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear) Sep 4 15:42:28.725988 kernel: pinctrl core: initialized pinctrl subsystem Sep 4 15:42:28.725993 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Sep 4 15:42:28.725999 kernel: audit: initializing netlink subsys (disabled) Sep 4 15:42:28.726004 kernel: audit: type=2000 audit(1757000546.289:1): state=initialized audit_enabled=0 res=1 Sep 4 15:42:28.726010 kernel: thermal_sys: Registered thermal governor 'step_wise' Sep 4 15:42:28.726015 kernel: thermal_sys: Registered thermal governor 'user_space' Sep 4 15:42:28.726021 kernel: cpuidle: using governor menu Sep 4 15:42:28.726026 kernel: Simple Boot Flag at 0x36 set to 0x80 Sep 4 15:42:28.726032 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Sep 4 15:42:28.726038 kernel: dca service started, version 1.12.1 Sep 4 15:42:28.726051 kernel: PCI: ECAM [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) for domain 0000 [bus 00-7f] Sep 4 15:42:28.726058 kernel: PCI: Using configuration type 1 for base access Sep 4 15:42:28.726064 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
Sep 4 15:42:28.726070 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Sep 4 15:42:28.726075 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Sep 4 15:42:28.726081 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Sep 4 15:42:28.726087 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Sep 4 15:42:28.726093 kernel: ACPI: Added _OSI(Module Device) Sep 4 15:42:28.726099 kernel: ACPI: Added _OSI(Processor Device) Sep 4 15:42:28.726105 kernel: ACPI: Added _OSI(Processor Aggregator Device) Sep 4 15:42:28.726111 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Sep 4 15:42:28.726117 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored Sep 4 15:42:28.726122 kernel: ACPI: Interpreter enabled Sep 4 15:42:28.726128 kernel: ACPI: PM: (supports S0 S1 S5) Sep 4 15:42:28.726134 kernel: ACPI: Using IOAPIC for interrupt routing Sep 4 15:42:28.726140 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Sep 4 15:42:28.726146 kernel: PCI: Using E820 reservations for host bridge windows Sep 4 15:42:28.726152 kernel: ACPI: Enabled 4 GPEs in block 00 to 0F Sep 4 15:42:28.726158 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f]) Sep 4 15:42:28.726247 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Sep 4 15:42:28.726301 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR] Sep 4 15:42:28.726351 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] Sep 4 15:42:28.726360 kernel: PCI host bridge to bus 0000:00 Sep 4 15:42:28.726430 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Sep 4 15:42:28.726480 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window] Sep 4 15:42:28.726524 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Sep 4 15:42:28.726568 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Sep 4 15:42:28.726611 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window] Sep 4 15:42:28.726654 kernel: pci_bus 0000:00: root bus resource [bus 00-7f] Sep 4 15:42:28.726715 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 conventional PCI endpoint Sep 4 15:42:28.726775 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 conventional PCI bridge Sep 4 15:42:28.726828 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Sep 4 15:42:28.726886 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 conventional PCI endpoint Sep 4 15:42:28.726940 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a conventional PCI endpoint Sep 4 15:42:28.726992 kernel: pci 0000:00:07.1: BAR 4 [io 0x1060-0x106f] Sep 4 15:42:28.727043 kernel: pci 0000:00:07.1: BAR 0 [io 0x01f0-0x01f7]: legacy IDE quirk Sep 4 15:42:28.727093 kernel: pci 0000:00:07.1: BAR 1 [io 0x03f6]: legacy IDE quirk Sep 4 15:42:28.727143 kernel: pci 0000:00:07.1: BAR 2 [io 0x0170-0x0177]: legacy IDE quirk Sep 4 15:42:28.727193 kernel: pci 0000:00:07.1: BAR 3 [io 0x0376]: legacy IDE quirk Sep 4 15:42:28.727249 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint Sep 4 15:42:28.727300 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed by PIIX4 ACPI Sep 4 15:42:28.727352 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB Sep 4 15:42:28.729438 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000 conventional PCI endpoint Sep 4 
15:42:28.729497 kernel: pci 0000:00:07.7: BAR 0 [io 0x1080-0x10bf] Sep 4 15:42:28.729550 kernel: pci 0000:00:07.7: BAR 1 [mem 0xfebfe000-0xfebfffff 64bit] Sep 4 15:42:28.729605 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 conventional PCI endpoint Sep 4 15:42:28.729656 kernel: pci 0000:00:0f.0: BAR 0 [io 0x1070-0x107f] Sep 4 15:42:28.729710 kernel: pci 0000:00:0f.0: BAR 1 [mem 0xe8000000-0xefffffff pref] Sep 4 15:42:28.729759 kernel: pci 0000:00:0f.0: BAR 2 [mem 0xfe000000-0xfe7fffff] Sep 4 15:42:28.729809 kernel: pci 0000:00:0f.0: ROM [mem 0x00000000-0x00007fff pref] Sep 4 15:42:28.729863 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Sep 4 15:42:28.729936 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 conventional PCI bridge Sep 4 15:42:28.729990 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Sep 4 15:42:28.730040 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Sep 4 15:42:28.730093 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Sep 4 15:42:28.730142 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Sep 4 15:42:28.730198 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:42:28.730250 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Sep 4 15:42:28.730299 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Sep 4 15:42:28.730349 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Sep 4 15:42:28.731420 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold Sep 4 15:42:28.731482 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:42:28.731534 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Sep 4 15:42:28.731585 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Sep 4 15:42:28.731634 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Sep 4 15:42:28.731685 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Sep 4 15:42:28.731734 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold Sep 4 15:42:28.731788 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:42:28.731842 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Sep 4 15:42:28.731892 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Sep 4 15:42:28.731942 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Sep 4 15:42:28.731992 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Sep 4 15:42:28.732042 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold Sep 4 15:42:28.732097 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:42:28.732151 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Sep 4 15:42:28.732202 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Sep 4 15:42:28.732257 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Sep 4 15:42:28.732308 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold Sep 4 15:42:28.732370 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:42:28.732448 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Sep 4 15:42:28.732499 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Sep 4 15:42:28.732553 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Sep 4 15:42:28.732603 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold Sep 4 
15:42:28.732660 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:42:28.732710 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Sep 4 15:42:28.732773 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Sep 4 15:42:28.732824 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Sep 4 15:42:28.732874 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot D3cold Sep 4 15:42:28.732932 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:42:28.732984 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Sep 4 15:42:28.733033 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Sep 4 15:42:28.733082 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Sep 4 15:42:28.733132 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold Sep 4 15:42:28.733185 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:42:28.733235 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Sep 4 15:42:28.733288 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Sep 4 15:42:28.733338 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Sep 4 15:42:28.733398 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold Sep 4 15:42:28.733453 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:42:28.733504 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Sep 4 15:42:28.733554 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Sep 4 15:42:28.733604 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Sep 4 15:42:28.733653 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold Sep 4 15:42:28.733712 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:42:28.733763 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Sep 4 15:42:28.733814 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Sep 4 15:42:28.733864 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Sep 4 15:42:28.733913 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Sep 4 15:42:28.733963 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold Sep 4 15:42:28.734018 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:42:28.734072 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Sep 4 15:42:28.734122 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Sep 4 15:42:28.734172 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Sep 4 15:42:28.734222 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Sep 4 15:42:28.734271 kernel: pci 0000:00:16.2: PME# supported from D0 D3hot D3cold Sep 4 15:42:28.734327 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:42:28.735399 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Sep 4 15:42:28.735459 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Sep 4 15:42:28.735511 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Sep 4 15:42:28.735561 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold Sep 4 15:42:28.735615 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:42:28.735667 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Sep 4 15:42:28.735717 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Sep 4 15:42:28.735768 kernel: pci 0000:00:16.4: 
bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Sep 4 15:42:28.735820 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold Sep 4 15:42:28.735876 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:42:28.735927 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Sep 4 15:42:28.735977 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Sep 4 15:42:28.736026 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Sep 4 15:42:28.736076 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold Sep 4 15:42:28.736129 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:42:28.736183 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Sep 4 15:42:28.736233 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Sep 4 15:42:28.736288 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Sep 4 15:42:28.736351 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold Sep 4 15:42:28.736420 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:42:28.736477 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Sep 4 15:42:28.736528 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Sep 4 15:42:28.736588 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Sep 4 15:42:28.738912 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold Sep 4 15:42:28.738983 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:42:28.739048 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Sep 4 15:42:28.739102 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Sep 4 15:42:28.739152 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Sep 4 15:42:28.739203 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Sep 4 15:42:28.739254 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold Sep 4 15:42:28.739312 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:42:28.739363 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Sep 4 15:42:28.739436 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Sep 4 15:42:28.739491 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Sep 4 15:42:28.739541 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Sep 4 15:42:28.739590 kernel: pci 0000:00:17.1: PME# supported from D0 D3hot D3cold Sep 4 15:42:28.739646 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:42:28.739698 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Sep 4 15:42:28.739748 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Sep 4 15:42:28.739798 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Sep 4 15:42:28.739851 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Sep 4 15:42:28.739901 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold Sep 4 15:42:28.739955 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:42:28.740006 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Sep 4 15:42:28.740055 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Sep 4 15:42:28.740105 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Sep 4 15:42:28.740155 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold Sep 4 15:42:28.740214 kernel: pci 0000:00:17.4: [15ad:07a0] type 
01 class 0x060400 PCIe Root Port Sep 4 15:42:28.740265 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Sep 4 15:42:28.740315 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Sep 4 15:42:28.740365 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Sep 4 15:42:28.740425 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold Sep 4 15:42:28.740481 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:42:28.740532 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Sep 4 15:42:28.740584 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Sep 4 15:42:28.740634 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Sep 4 15:42:28.740684 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold Sep 4 15:42:28.740744 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:42:28.740795 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Sep 4 15:42:28.740845 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Sep 4 15:42:28.740895 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Sep 4 15:42:28.740947 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold Sep 4 15:42:28.741004 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:42:28.741055 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Sep 4 15:42:28.741105 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Sep 4 15:42:28.741155 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Sep 4 15:42:28.741205 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold Sep 4 15:42:28.741258 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:42:28.741309 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Sep 4 15:42:28.741362 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Sep 4 15:42:28.741427 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Sep 4 15:42:28.741478 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Sep 4 15:42:28.741527 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold Sep 4 15:42:28.741582 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:42:28.741633 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Sep 4 15:42:28.741684 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Sep 4 15:42:28.741741 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Sep 4 15:42:28.741791 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Sep 4 15:42:28.741841 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold Sep 4 15:42:28.741896 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:42:28.741947 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Sep 4 15:42:28.741997 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Sep 4 15:42:28.742047 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Sep 4 15:42:28.742099 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold Sep 4 15:42:28.742153 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:42:28.742203 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Sep 4 15:42:28.742253 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Sep 4 15:42:28.742302 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 
64bit pref] Sep 4 15:42:28.742352 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold Sep 4 15:42:28.742420 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:42:28.742475 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Sep 4 15:42:28.742525 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Sep 4 15:42:28.742575 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Sep 4 15:42:28.742625 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold Sep 4 15:42:28.742679 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:42:28.742729 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Sep 4 15:42:28.742779 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Sep 4 15:42:28.742831 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Sep 4 15:42:28.742881 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold Sep 4 15:42:28.742946 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:42:28.742998 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Sep 4 15:42:28.743047 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Sep 4 15:42:28.743097 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Sep 4 15:42:28.743147 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold Sep 4 15:42:28.743206 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 4 15:42:28.743256 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Sep 4 15:42:28.743307 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Sep 4 15:42:28.743356 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Sep 4 15:42:28.743422 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold Sep 4 15:42:28.743494 kernel: pci_bus 0000:01: extended config space not accessible Sep 4 15:42:28.743550 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Sep 4 15:42:28.743602 kernel: pci_bus 0000:02: extended config space not accessible Sep 4 15:42:28.743613 kernel: acpiphp: Slot [32] registered Sep 4 15:42:28.743619 kernel: acpiphp: Slot [33] registered Sep 4 15:42:28.743625 kernel: acpiphp: Slot [34] registered Sep 4 15:42:28.743631 kernel: acpiphp: Slot [35] registered Sep 4 15:42:28.743637 kernel: acpiphp: Slot [36] registered Sep 4 15:42:28.743643 kernel: acpiphp: Slot [37] registered Sep 4 15:42:28.743648 kernel: acpiphp: Slot [38] registered Sep 4 15:42:28.743654 kernel: acpiphp: Slot [39] registered Sep 4 15:42:28.743660 kernel: acpiphp: Slot [40] registered Sep 4 15:42:28.743667 kernel: acpiphp: Slot [41] registered Sep 4 15:42:28.743672 kernel: acpiphp: Slot [42] registered Sep 4 15:42:28.743678 kernel: acpiphp: Slot [43] registered Sep 4 15:42:28.743684 kernel: acpiphp: Slot [44] registered Sep 4 15:42:28.743690 kernel: acpiphp: Slot [45] registered Sep 4 15:42:28.743696 kernel: acpiphp: Slot [46] registered Sep 4 15:42:28.743702 kernel: acpiphp: Slot [47] registered Sep 4 15:42:28.743707 kernel: acpiphp: Slot [48] registered Sep 4 15:42:28.743713 kernel: acpiphp: Slot [49] registered Sep 4 15:42:28.743720 kernel: acpiphp: Slot [50] registered Sep 4 15:42:28.743726 kernel: acpiphp: Slot [51] registered Sep 4 15:42:28.743732 kernel: acpiphp: Slot [52] registered Sep 4 15:42:28.743738 kernel: acpiphp: Slot [53] registered Sep 4 15:42:28.743744 kernel: acpiphp: Slot [54] registered Sep 4 15:42:28.743749 kernel: acpiphp: Slot [55] registered Sep 
4 15:42:28.743755 kernel: acpiphp: Slot [56] registered Sep 4 15:42:28.743761 kernel: acpiphp: Slot [57] registered Sep 4 15:42:28.743767 kernel: acpiphp: Slot [58] registered Sep 4 15:42:28.743774 kernel: acpiphp: Slot [59] registered Sep 4 15:42:28.743780 kernel: acpiphp: Slot [60] registered Sep 4 15:42:28.743785 kernel: acpiphp: Slot [61] registered Sep 4 15:42:28.743791 kernel: acpiphp: Slot [62] registered Sep 4 15:42:28.743797 kernel: acpiphp: Slot [63] registered Sep 4 15:42:28.743846 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Sep 4 15:42:28.743897 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode) Sep 4 15:42:28.743947 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode) Sep 4 15:42:28.743996 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode) Sep 4 15:42:28.744048 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode) Sep 4 15:42:28.744098 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode) Sep 4 15:42:28.744155 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 PCIe Endpoint Sep 4 15:42:28.744207 kernel: pci 0000:03:00.0: BAR 0 [io 0x4000-0x4007] Sep 4 15:42:28.744257 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfd5f8000-0xfd5fffff 64bit] Sep 4 15:42:28.744309 kernel: pci 0000:03:00.0: ROM [mem 0x00000000-0x0000ffff pref] Sep 4 15:42:28.744360 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Sep 4 15:42:28.744425 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force' Sep 4 15:42:28.744476 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Sep 4 15:42:28.744527 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Sep 4 15:42:28.744578 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Sep 4 15:42:28.744629 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Sep 4 15:42:28.744679 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Sep 4 15:42:28.744742 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Sep 4 15:42:28.744798 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Sep 4 15:42:28.744851 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Sep 4 15:42:28.744908 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 PCIe Endpoint Sep 4 15:42:28.744961 kernel: pci 0000:0b:00.0: BAR 0 [mem 0xfd4fc000-0xfd4fcfff] Sep 4 15:42:28.745013 kernel: pci 0000:0b:00.0: BAR 1 [mem 0xfd4fd000-0xfd4fdfff] Sep 4 15:42:28.745065 kernel: pci 0000:0b:00.0: BAR 2 [mem 0xfd4fe000-0xfd4fffff] Sep 4 15:42:28.745116 kernel: pci 0000:0b:00.0: BAR 3 [io 0x5000-0x500f] Sep 4 15:42:28.745169 kernel: pci 0000:0b:00.0: ROM [mem 0x00000000-0x0000ffff pref] Sep 4 15:42:28.745221 kernel: pci 0000:0b:00.0: supports D1 D2 Sep 4 15:42:28.745271 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold Sep 4 15:42:28.745323 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Sep 4 15:42:28.747330 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Sep 4 15:42:28.747408 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Sep 4 15:42:28.747467 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Sep 4 15:42:28.747521 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Sep 4 15:42:28.747577 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Sep 4 15:42:28.747630 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Sep 4 15:42:28.747683 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Sep 4 15:42:28.747740 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Sep 4 15:42:28.747793 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Sep 4 15:42:28.747845 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Sep 4 15:42:28.747895 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Sep 4 15:42:28.747947 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Sep 4 15:42:28.748000 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Sep 4 15:42:28.748062 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Sep 4 15:42:28.748115 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Sep 4 15:42:28.748166 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Sep 4 15:42:28.748217 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Sep 4 15:42:28.748268 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Sep 4 15:42:28.748320 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Sep 4 15:42:28.748382 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Sep 4 15:42:28.748444 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Sep 4 15:42:28.748496 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Sep 4 15:42:28.748547 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Sep 4 15:42:28.748598 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Sep 4 15:42:28.748607 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9 Sep 4 15:42:28.748613 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0 Sep 4 15:42:28.748621 kernel: ACPI: PCI: Interrupt link LNKB disabled Sep 4 15:42:28.748628 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Sep 4 15:42:28.748633 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10 Sep 4 15:42:28.748639 kernel: iommu: Default domain type: Translated Sep 4 15:42:28.748646 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Sep 4 15:42:28.748652 kernel: PCI: Using ACPI for IRQ routing Sep 4 15:42:28.748658 kernel: PCI: pci_cache_line_size set to 64 bytes Sep 4 15:42:28.748664 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff] Sep 4 15:42:28.748670 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff] Sep 4 15:42:28.748722 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device Sep 4 15:42:28.748779 kernel: pci 0000:00:0f.0: vgaarb: bridge control possible Sep 4 15:42:28.748830 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Sep 4 15:42:28.748839 kernel: vgaarb: loaded Sep 4 15:42:28.748845 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 Sep 4 15:42:28.748851 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter Sep 4 15:42:28.748857 kernel: clocksource: Switched to clocksource tsc-early Sep 4 15:42:28.748863 kernel: VFS: Disk quotas dquot_6.6.0 Sep 4 15:42:28.748869 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Sep 4 15:42:28.748877 kernel: pnp: PnP ACPI init Sep 4 15:42:28.748930 kernel: system 00:00: [io 0x1000-0x103f] has been reserved Sep 4 15:42:28.748978 kernel: system 00:00: [io 0x1040-0x104f] has been reserved Sep 4 
15:42:28.749023 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved Sep 4 15:42:28.749073 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved Sep 4 15:42:28.749124 kernel: pnp 00:06: [dma 2] Sep 4 15:42:28.749174 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved Sep 4 15:42:28.749222 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved Sep 4 15:42:28.749269 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved Sep 4 15:42:28.749277 kernel: pnp: PnP ACPI: found 8 devices Sep 4 15:42:28.749283 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Sep 4 15:42:28.749290 kernel: NET: Registered PF_INET protocol family Sep 4 15:42:28.749296 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Sep 4 15:42:28.749302 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Sep 4 15:42:28.749308 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Sep 4 15:42:28.749316 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Sep 4 15:42:28.749322 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Sep 4 15:42:28.749328 kernel: TCP: Hash tables configured (established 16384 bind 16384) Sep 4 15:42:28.749334 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Sep 4 15:42:28.749340 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Sep 4 15:42:28.749345 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Sep 4 15:42:28.749352 kernel: NET: Registered PF_XDP protocol family Sep 4 15:42:28.749436 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 Sep 4 15:42:28.749492 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Sep 4 15:42:28.749545 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Sep 4 15:42:28.749596 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Sep 4 15:42:28.749649 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Sep 4 15:42:28.749701 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Sep 4 15:42:28.749752 kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Sep 4 15:42:28.749804 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Sep 4 15:42:28.749855 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Sep 4 15:42:28.749909 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Sep 4 15:42:28.749959 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Sep 4 15:42:28.750010 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Sep 4 15:42:28.750061 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Sep 4 15:42:28.750112 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Sep 4 15:42:28.750163 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Sep 4 15:42:28.750213 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Sep 4 15:42:28.750264 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 
1000 Sep 4 15:42:28.750317 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Sep 4 15:42:28.750368 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Sep 4 15:42:28.750444 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Sep 4 15:42:28.750496 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Sep 4 15:42:28.750548 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Sep 4 15:42:28.750599 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000 Sep 4 15:42:28.750649 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref]: assigned Sep 4 15:42:28.750703 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref]: assigned Sep 4 15:42:28.750758 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.750808 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.750859 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.750908 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.750959 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.751010 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.751060 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.751112 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.751163 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.751213 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.751263 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.751313 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.751363 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.751423 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.751474 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.751526 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.751576 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.751625 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.751675 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.751726 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.751776 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.751826 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.751876 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.751929 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.751979 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.752030 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.752079 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: can't 
assign; no space Sep 4 15:42:28.752129 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.752179 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.752229 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.752279 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.752331 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.752396 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.752448 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.752499 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.752549 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.752599 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.752649 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.752701 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.752756 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.752807 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.752856 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.752906 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.752955 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.753005 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.753053 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.753103 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.753154 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.753204 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.753254 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.753303 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.753352 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.753421 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.753476 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.753525 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.753575 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.753628 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.753682 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.753745 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.753799 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.753850 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.753901 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.753951 kernel: pci 
0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.754357 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.754431 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.754483 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.754538 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.754588 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.754639 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.754690 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.754746 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.754795 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.754846 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.754896 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.754949 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.754999 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.755050 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.755099 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.755149 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.755200 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.755251 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.755301 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.755355 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space Sep 4 15:42:28.755414 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign Sep 4 15:42:28.755482 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Sep 4 15:42:28.755534 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] Sep 4 15:42:28.755585 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Sep 4 15:42:28.755634 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Sep 4 15:42:28.755683 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Sep 4 15:42:28.755738 kernel: pci 0000:03:00.0: ROM [mem 0xfd500000-0xfd50ffff pref]: assigned Sep 4 15:42:28.755792 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Sep 4 15:42:28.755843 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Sep 4 15:42:28.755893 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Sep 4 15:42:28.755942 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref] Sep 4 15:42:28.755994 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Sep 4 15:42:28.756044 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Sep 4 15:42:28.756094 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Sep 4 15:42:28.756144 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Sep 4 15:42:28.756196 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Sep 4 15:42:28.756249 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Sep 4 15:42:28.756300 kernel: pci 
0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Sep 4 15:42:28.756349 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Sep 4 15:42:28.756465 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Sep 4 15:42:28.756519 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Sep 4 15:42:28.756570 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Sep 4 15:42:28.756631 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Sep 4 15:42:28.756683 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Sep 4 15:42:28.757442 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Sep 4 15:42:28.757503 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Sep 4 15:42:28.757557 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Sep 4 15:42:28.757608 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Sep 4 15:42:28.757660 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Sep 4 15:42:28.757711 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Sep 4 15:42:28.757767 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Sep 4 15:42:28.757819 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Sep 4 15:42:28.757872 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Sep 4 15:42:28.757923 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Sep 4 15:42:28.757979 kernel: pci 0000:0b:00.0: ROM [mem 0xfd400000-0xfd40ffff pref]: assigned Sep 4 15:42:28.758031 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Sep 4 15:42:28.758082 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Sep 4 15:42:28.758133 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Sep 4 15:42:28.758183 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref] Sep 4 15:42:28.758236 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Sep 4 15:42:28.758289 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Sep 4 15:42:28.758340 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Sep 4 15:42:28.759403 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Sep 4 15:42:28.759476 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Sep 4 15:42:28.759532 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Sep 4 15:42:28.759587 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Sep 4 15:42:28.759639 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Sep 4 15:42:28.759691 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Sep 4 15:42:28.759742 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Sep 4 15:42:28.759793 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Sep 4 15:42:28.759848 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Sep 4 15:42:28.759899 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Sep 4 15:42:28.759952 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Sep 4 15:42:28.760004 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Sep 4 15:42:28.760054 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Sep 4 15:42:28.760104 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Sep 4 15:42:28.760158 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Sep 4 15:42:28.760210 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Sep 4 
15:42:28.760260 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Sep 4 15:42:28.760311 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Sep 4 15:42:28.760361 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Sep 4 15:42:28.762466 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Sep 4 15:42:28.762530 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Sep 4 15:42:28.762585 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Sep 4 15:42:28.762641 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Sep 4 15:42:28.762694 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Sep 4 15:42:28.762747 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Sep 4 15:42:28.762799 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Sep 4 15:42:28.762850 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Sep 4 15:42:28.762902 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Sep 4 15:42:28.762956 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Sep 4 15:42:28.763007 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Sep 4 15:42:28.763059 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Sep 4 15:42:28.763110 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Sep 4 15:42:28.763164 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Sep 4 15:42:28.763214 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Sep 4 15:42:28.763266 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Sep 4 15:42:28.763317 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Sep 4 15:42:28.763368 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Sep 4 15:42:28.763445 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Sep 4 15:42:28.763499 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Sep 4 15:42:28.763553 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Sep 4 15:42:28.763605 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Sep 4 15:42:28.763657 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Sep 4 15:42:28.763708 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Sep 4 15:42:28.763759 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Sep 4 15:42:28.763810 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Sep 4 15:42:28.763861 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Sep 4 15:42:28.763912 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Sep 4 15:42:28.763967 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Sep 4 15:42:28.764017 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Sep 4 15:42:28.764067 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Sep 4 15:42:28.764117 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Sep 4 15:42:28.764168 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Sep 4 15:42:28.764218 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Sep 4 15:42:28.764269 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Sep 4 15:42:28.764319 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Sep 4 15:42:28.764369 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Sep 4 15:42:28.764433 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Sep 4 
15:42:28.764484 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Sep 4 15:42:28.764537 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Sep 4 15:42:28.764587 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Sep 4 15:42:28.764638 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Sep 4 15:42:28.764690 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Sep 4 15:42:28.764752 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Sep 4 15:42:28.764806 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Sep 4 15:42:28.764858 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Sep 4 15:42:28.764909 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Sep 4 15:42:28.764959 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Sep 4 15:42:28.765012 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Sep 4 15:42:28.765071 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Sep 4 15:42:28.765121 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Sep 4 15:42:28.765183 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Sep 4 15:42:28.765241 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Sep 4 15:42:28.765299 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Sep 4 15:42:28.765356 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] Sep 4 15:42:28.765425 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] Sep 4 15:42:28.765471 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] Sep 4 15:42:28.765516 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] Sep 4 15:42:28.765560 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] Sep 4 15:42:28.765611 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] Sep 4 15:42:28.765659 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] Sep 4 15:42:28.765705 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] Sep 4 15:42:28.765750 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] Sep 4 15:42:28.765795 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] Sep 4 15:42:28.765840 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] Sep 4 15:42:28.765885 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] Sep 4 15:42:28.765932 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] Sep 4 15:42:28.766002 kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff] Sep 4 15:42:28.766055 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] Sep 4 15:42:28.766102 kernel: pci_bus 0000:03: resource 2 [mem 0xc0000000-0xc01fffff 64bit pref] Sep 4 15:42:28.766158 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] Sep 4 15:42:28.766208 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] Sep 4 15:42:28.766271 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] Sep 4 15:42:28.766331 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] Sep 4 15:42:28.766417 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] Sep 4 15:42:28.766477 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] Sep 4 15:42:28.766538 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] Sep 4 15:42:28.766585 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] Sep 4 15:42:28.766635 kernel: 
pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] Sep 4 15:42:28.766683 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] Sep 4 15:42:28.766734 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] Sep 4 15:42:28.766782 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] Sep 4 15:42:28.766841 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] Sep 4 15:42:28.766905 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] Sep 4 15:42:28.766961 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] Sep 4 15:42:28.767015 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] Sep 4 15:42:28.767072 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] Sep 4 15:42:28.767130 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] Sep 4 15:42:28.767187 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] Sep 4 15:42:28.767245 kernel: pci_bus 0000:0c: resource 0 [io 0x9000-0x9fff] Sep 4 15:42:28.767292 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] Sep 4 15:42:28.767344 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 64bit pref] Sep 4 15:42:28.767440 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] Sep 4 15:42:28.767489 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] Sep 4 15:42:28.767535 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] Sep 4 15:42:28.767610 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] Sep 4 15:42:28.767669 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] Sep 4 15:42:28.767744 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] Sep 4 15:42:28.767805 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] Sep 4 15:42:28.767859 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] Sep 4 15:42:28.767905 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] Sep 4 15:42:28.767958 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] Sep 4 15:42:28.768004 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] Sep 4 15:42:28.768054 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] Sep 4 15:42:28.768103 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] Sep 4 15:42:28.768157 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] Sep 4 15:42:28.768203 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] Sep 4 15:42:28.768249 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] Sep 4 15:42:28.768299 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] Sep 4 15:42:28.768346 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] Sep 4 15:42:28.768423 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] Sep 4 15:42:28.768476 kernel: pci_bus 0000:15: resource 0 [io 0xe000-0xefff] Sep 4 15:42:28.768522 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] Sep 4 15:42:28.768567 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] Sep 4 15:42:28.768623 kernel: pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] Sep 4 15:42:28.768686 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] Sep 4 15:42:28.768737 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] Sep 4 15:42:28.768793 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] Sep 4 
15:42:28.768843 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] Sep 4 15:42:28.768889 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] Sep 4 15:42:28.768939 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] Sep 4 15:42:28.768985 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] Sep 4 15:42:28.769040 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] Sep 4 15:42:28.769089 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] Sep 4 15:42:28.769145 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Sep 4 15:42:28.769397 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] Sep 4 15:42:28.769484 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] Sep 4 15:42:28.769560 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] Sep 4 15:42:28.769622 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] Sep 4 15:42:28.769670 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] Sep 4 15:42:28.769724 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] Sep 4 15:42:28.769770 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] Sep 4 15:42:28.769823 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] Sep 4 15:42:28.769872 kernel: pci_bus 0000:1e: resource 2 [mem 0xe6d00000-0xe6dfffff 64bit pref] Sep 4 15:42:28.769951 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] Sep 4 15:42:28.770024 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] Sep 4 15:42:28.770108 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] Sep 4 15:42:28.770186 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] Sep 4 15:42:28.770262 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] Sep 4 15:42:28.770310 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] Sep 4 15:42:28.770363 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] Sep 4 15:42:28.770494 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] Sep 4 15:42:28.770555 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Sep 4 15:42:28.770565 kernel: PCI: CLS 32 bytes, default 64 Sep 4 15:42:28.770571 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Sep 4 15:42:28.770578 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Sep 4 15:42:28.770584 kernel: clocksource: Switched to clocksource tsc Sep 4 15:42:28.770590 kernel: Initialise system trusted keyrings Sep 4 15:42:28.770596 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Sep 4 15:42:28.770602 kernel: Key type asymmetric registered Sep 4 15:42:28.770608 kernel: Asymmetric key parser 'x509' registered Sep 4 15:42:28.770616 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Sep 4 15:42:28.770622 kernel: io scheduler mq-deadline registered Sep 4 15:42:28.770628 kernel: io scheduler kyber registered Sep 4 15:42:28.770634 kernel: io scheduler bfq registered Sep 4 15:42:28.770687 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 Sep 4 15:42:28.770746 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:42:28.770819 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 Sep 4 15:42:28.770874 kernel: pcieport 
0000:00:15.1: pciehp: Slot #161 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:42:28.771113 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 Sep 4 15:42:28.771168 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:42:28.771221 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 Sep 4 15:42:28.771272 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:42:28.771324 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 Sep 4 15:42:28.771397 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:42:28.771595 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 Sep 4 15:42:28.771653 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:42:28.771707 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 Sep 4 15:42:28.771757 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:42:28.771809 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 Sep 4 15:42:28.771860 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:42:28.771913 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 Sep 4 15:42:28.771963 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:42:28.772018 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 Sep 4 15:42:28.772068 kernel: pcieport 0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:42:28.772119 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 Sep 4 15:42:28.772170 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:42:28.772223 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 Sep 4 15:42:28.772282 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:42:28.772336 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 Sep 4 15:42:28.772403 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:42:28.772458 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 Sep 4 15:42:28.772510 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:42:28.772563 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 Sep 4 15:42:28.772613 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:42:28.772665 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 Sep 4 15:42:28.772717 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ 
PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:42:28.772769 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 Sep 4 15:42:28.773033 kernel: pcieport 0000:00:17.0: pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:42:28.773094 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 Sep 4 15:42:28.773147 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:42:28.773200 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42 Sep 4 15:42:28.773255 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:42:28.773308 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43 Sep 4 15:42:28.773360 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:42:28.773592 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44 Sep 4 15:42:28.773645 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:42:28.773698 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45 Sep 4 15:42:28.773749 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:42:28.773801 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46 Sep 4 15:42:28.773852 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:42:28.774059 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47 Sep 4 15:42:28.774115 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:42:28.774172 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48 Sep 4 15:42:28.774223 kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:42:28.774275 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49 Sep 4 15:42:28.774326 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:42:28.774397 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50 Sep 4 15:42:28.774453 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:42:28.774505 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51 Sep 4 15:42:28.774559 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:42:28.774611 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52 Sep 4 15:42:28.774663 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:42:28.774715 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53 Sep 4 15:42:28.774772 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ 
Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:42:28.774823 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54 Sep 4 15:42:28.774874 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:42:28.774926 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55 Sep 4 15:42:28.774980 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 4 15:42:28.774992 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Sep 4 15:42:28.774999 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 4 15:42:28.775005 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Sep 4 15:42:28.775012 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 Sep 4 15:42:28.775019 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Sep 4 15:42:28.775026 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Sep 4 15:42:28.775079 kernel: rtc_cmos 00:01: registered as rtc0 Sep 4 15:42:28.775128 kernel: rtc_cmos 00:01: setting system clock to 2025-09-04T15:42:28 UTC (1757000548) Sep 4 15:42:28.775138 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Sep 4 15:42:28.775181 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram Sep 4 15:42:28.775190 kernel: intel_pstate: CPU model not supported Sep 4 15:42:28.775196 kernel: NET: Registered PF_INET6 protocol family Sep 4 15:42:28.775203 kernel: Segment Routing with IPv6 Sep 4 15:42:28.775209 kernel: In-situ OAM (IOAM) with IPv6 Sep 4 15:42:28.775217 kernel: NET: Registered PF_PACKET protocol family Sep 4 15:42:28.775223 kernel: Key type dns_resolver registered Sep 4 15:42:28.775229 kernel: IPI shorthand broadcast: enabled Sep 4 15:42:28.775236 kernel: sched_clock: Marking stable (2676004294, 173256382)->(2863498962, -14238286) Sep 4 15:42:28.775242 kernel: registered taskstats version 1 Sep 4 15:42:28.775248 kernel: Loading compiled-in X.509 certificates Sep 4 15:42:28.775254 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.44-flatcar: 1106dff6b31a2cb943a47c73d0d8dff07e2a7490' Sep 4 15:42:28.775261 kernel: Demotion targets for Node 0: null Sep 4 15:42:28.775267 kernel: Key type .fscrypt registered Sep 4 15:42:28.775274 kernel: Key type fscrypt-provisioning registered Sep 4 15:42:28.775280 kernel: ima: No TPM chip found, activating TPM-bypass! Sep 4 15:42:28.775287 kernel: ima: Allocated hash algorithm: sha1 Sep 4 15:42:28.775293 kernel: ima: No architecture policies found Sep 4 15:42:28.775299 kernel: clk: Disabling unused clocks Sep 4 15:42:28.775306 kernel: Warning: unable to open an initial console. Sep 4 15:42:28.775312 kernel: Freeing unused kernel image (initmem) memory: 54076K Sep 4 15:42:28.775319 kernel: Write protecting the kernel read-only data: 24576k Sep 4 15:42:28.775325 kernel: Freeing unused kernel image (rodata/data gap) memory: 252K Sep 4 15:42:28.775332 kernel: Run /init as init process Sep 4 15:42:28.775338 kernel: with arguments: Sep 4 15:42:28.775345 kernel: /init Sep 4 15:42:28.775351 kernel: with environment: Sep 4 15:42:28.775357 kernel: HOME=/ Sep 4 15:42:28.775363 kernel: TERM=linux Sep 4 15:42:28.775369 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 4 15:42:28.775386 systemd[1]: Successfully made /usr/ read-only. 
Sep 4 15:42:28.775395 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 4 15:42:28.775492 systemd[1]: Detected virtualization vmware. Sep 4 15:42:28.775499 systemd[1]: Detected architecture x86-64. Sep 4 15:42:28.775505 systemd[1]: Running in initrd. Sep 4 15:42:28.775511 systemd[1]: No hostname configured, using default hostname. Sep 4 15:42:28.775518 systemd[1]: Hostname set to . Sep 4 15:42:28.775524 systemd[1]: Initializing machine ID from random generator. Sep 4 15:42:28.775531 systemd[1]: Queued start job for default target initrd.target. Sep 4 15:42:28.775539 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 4 15:42:28.775546 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 4 15:42:28.775553 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Sep 4 15:42:28.775559 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 4 15:42:28.775566 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 4 15:42:28.775573 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 4 15:42:28.775580 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 4 15:42:28.775588 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 4 15:42:28.775594 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 4 15:42:28.775601 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 4 15:42:28.775607 systemd[1]: Reached target paths.target - Path Units. Sep 4 15:42:28.775613 systemd[1]: Reached target slices.target - Slice Units. Sep 4 15:42:28.775620 systemd[1]: Reached target swap.target - Swaps. Sep 4 15:42:28.775626 systemd[1]: Reached target timers.target - Timer Units. Sep 4 15:42:28.775632 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 4 15:42:28.775640 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 4 15:42:28.775647 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 4 15:42:28.775653 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Sep 4 15:42:28.775660 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 4 15:42:28.775666 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 4 15:42:28.775673 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 4 15:42:28.775679 systemd[1]: Reached target sockets.target - Socket Units. Sep 4 15:42:28.775686 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 4 15:42:28.775692 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 4 15:42:28.775700 systemd[1]: Finished network-cleanup.service - Network Cleanup. 
Sep 4 15:42:28.775707 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 4 15:42:28.775714 systemd[1]: Starting systemd-fsck-usr.service... Sep 4 15:42:28.775720 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 4 15:42:28.775726 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 4 15:42:28.775733 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 4 15:42:28.775739 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 4 15:42:28.775747 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 4 15:42:28.775754 systemd[1]: Finished systemd-fsck-usr.service. Sep 4 15:42:28.775760 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 4 15:42:28.775782 systemd-journald[243]: Collecting audit messages is disabled. Sep 4 15:42:28.775801 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 4 15:42:28.775808 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 4 15:42:28.775815 kernel: Bridge firewalling registered Sep 4 15:42:28.775821 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 4 15:42:28.775828 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 4 15:42:28.775834 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 4 15:42:28.775842 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 4 15:42:28.775848 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 4 15:42:28.775855 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 4 15:42:28.775861 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 4 15:42:28.775869 systemd-journald[243]: Journal started Sep 4 15:42:28.775883 systemd-journald[243]: Runtime Journal (/run/log/journal/47713df5189045c48f78986de524cbc7) is 4.8M, max 38.8M, 34M free. Sep 4 15:42:28.719154 systemd-modules-load[245]: Inserted module 'overlay' Sep 4 15:42:28.739013 systemd-modules-load[245]: Inserted module 'br_netfilter' Sep 4 15:42:28.777235 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 4 15:42:28.777494 systemd[1]: Started systemd-journald.service - Journal Service. Sep 4 15:42:28.779170 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 4 15:42:28.780441 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 4 15:42:28.790479 systemd-tmpfiles[280]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 4 15:42:28.792346 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 4 15:42:28.793552 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
Sep 4 15:42:28.794650 dracut-cmdline[279]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=36c924095f449c8931a6685ec70d72df97f8ad57d1c78208ae0ead8cae8f5127 Sep 4 15:42:28.828321 systemd-resolved[292]: Positive Trust Anchors: Sep 4 15:42:28.828329 systemd-resolved[292]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 4 15:42:28.828352 systemd-resolved[292]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 4 15:42:28.830343 systemd-resolved[292]: Defaulting to hostname 'linux'. Sep 4 15:42:28.830933 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 4 15:42:28.831176 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 4 15:42:28.856391 kernel: SCSI subsystem initialized Sep 4 15:42:28.874392 kernel: Loading iSCSI transport class v2.0-870. Sep 4 15:42:28.883387 kernel: iscsi: registered transport (tcp) Sep 4 15:42:28.914391 kernel: iscsi: registered transport (qla4xxx) Sep 4 15:42:28.914438 kernel: QLogic iSCSI HBA Driver Sep 4 15:42:28.925788 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 4 15:42:28.936131 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 4 15:42:28.936628 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 4 15:42:28.959892 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 4 15:42:28.960968 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 4 15:42:28.999394 kernel: raid6: avx2x4 gen() 46199 MB/s Sep 4 15:42:29.016390 kernel: raid6: avx2x2 gen() 51257 MB/s Sep 4 15:42:29.033604 kernel: raid6: avx2x1 gen() 43590 MB/s Sep 4 15:42:29.033648 kernel: raid6: using algorithm avx2x2 gen() 51257 MB/s Sep 4 15:42:29.051607 kernel: raid6: .... xor() 30469 MB/s, rmw enabled Sep 4 15:42:29.051655 kernel: raid6: using avx2x2 recovery algorithm Sep 4 15:42:29.066387 kernel: xor: automatically using best checksumming function avx Sep 4 15:42:29.171458 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 4 15:42:29.174894 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 4 15:42:29.176169 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 4 15:42:29.195561 systemd-udevd[493]: Using default interface naming scheme 'v255'. Sep 4 15:42:29.198947 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 4 15:42:29.199570 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... 
Sep 4 15:42:29.216706 dracut-pre-trigger[496]: rd.md=0: removing MD RAID activation Sep 4 15:42:29.230761 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 4 15:42:29.231600 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 4 15:42:29.305592 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 4 15:42:29.306874 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 4 15:42:29.369387 kernel: VMware PVSCSI driver - version 1.0.7.0-k Sep 4 15:42:29.371384 kernel: vmw_pvscsi: using 64bit dma Sep 4 15:42:29.377386 kernel: vmw_pvscsi: max_id: 16 Sep 4 15:42:29.377406 kernel: vmw_pvscsi: setting ring_pages to 8 Sep 4 15:42:29.383385 kernel: vmw_pvscsi: enabling reqCallThreshold Sep 4 15:42:29.383412 kernel: vmw_pvscsi: driver-based request coalescing enabled Sep 4 15:42:29.383421 kernel: vmw_pvscsi: using MSI-X Sep 4 15:42:29.387284 kernel: VMware vmxnet3 virtual NIC driver - version 1.9.0.0-k-NAPI Sep 4 15:42:29.387317 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254 Sep 4 15:42:29.395391 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2 Sep 4 15:42:29.401996 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0 Sep 4 15:42:29.402123 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6 Sep 4 15:42:29.403056 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps Sep 4 15:42:29.413387 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0 Sep 4 15:42:29.416418 (udev-worker)[537]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte. Sep 4 15:42:29.418891 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 4 15:42:29.418964 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 4 15:42:29.419336 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 4 15:42:29.419860 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 4 15:42:29.428393 kernel: cryptd: max_cpu_qlen set to 1000 Sep 4 15:42:29.430409 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB) Sep 4 15:42:29.430508 kernel: sd 0:0:0:0: [sda] Write Protect is off Sep 4 15:42:29.431459 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00 Sep 4 15:42:29.433072 kernel: sd 0:0:0:0: [sda] Cache data unavailable Sep 4 15:42:29.433154 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through Sep 4 15:42:29.435385 kernel: libata version 3.00 loaded. Sep 4 15:42:29.441583 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input2 Sep 4 15:42:29.441607 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 4 15:42:29.442725 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Sep 4 15:42:29.454859 kernel: AES CTR mode by8 optimization enabled Sep 4 15:42:29.456406 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 4 15:42:29.457387 kernel: ata_piix 0000:00:07.1: version 2.13 Sep 4 15:42:29.457490 kernel: scsi host1: ata_piix Sep 4 15:42:29.459895 kernel: scsi host2: ata_piix Sep 4 15:42:29.459976 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14 lpm-pol 0 Sep 4 15:42:29.459985 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15 lpm-pol 0 Sep 4 15:42:29.496818 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM. 
Sep 4 15:42:29.502255 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT. Sep 4 15:42:29.507699 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Sep 4 15:42:29.512128 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A. Sep 4 15:42:29.512368 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A. Sep 4 15:42:29.513067 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 4 15:42:29.551388 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 4 15:42:29.629989 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33 Sep 4 15:42:29.635391 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5 Sep 4 15:42:29.662423 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray Sep 4 15:42:29.662609 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Sep 4 15:42:29.675420 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Sep 4 15:42:29.976135 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 4 15:42:29.976502 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 4 15:42:29.976645 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 4 15:42:29.976843 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 4 15:42:29.977495 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 4 15:42:29.987459 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 4 15:42:30.569154 disk-uuid[634]: The operation has completed successfully. Sep 4 15:42:30.569422 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 4 15:42:30.604470 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 4 15:42:30.604697 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 4 15:42:30.621203 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 4 15:42:30.631254 sh[675]: Success Sep 4 15:42:30.644575 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 4 15:42:30.644617 kernel: device-mapper: uevent: version 1.0.3 Sep 4 15:42:30.645791 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 4 15:42:30.653388 kernel: device-mapper: verity: sha256 using shash "sha256-avx2" Sep 4 15:42:30.681199 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 4 15:42:30.682238 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 4 15:42:30.688496 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 4 15:42:30.698382 kernel: BTRFS: device fsid 03d586f6-54f4-4e78-a040-c693154b15e4 devid 1 transid 38 /dev/mapper/usr (254:0) scanned by mount (687) Sep 4 15:42:30.700591 kernel: BTRFS info (device dm-0): first mount of filesystem 03d586f6-54f4-4e78-a040-c693154b15e4 Sep 4 15:42:30.700610 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 4 15:42:30.706399 kernel: BTRFS info (device dm-0): enabling ssd optimizations Sep 4 15:42:30.706414 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 4 15:42:30.707816 kernel: BTRFS info (device dm-0): enabling free space tree Sep 4 15:42:30.718391 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. 
Sep 4 15:42:30.718578 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 4 15:42:30.719120 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments... Sep 4 15:42:30.720445 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 4 15:42:30.759386 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (710) Sep 4 15:42:30.761706 kernel: BTRFS info (device sda6): first mount of filesystem acf6a5d7-9c2b-468d-9430-e8b3ed6a78f4 Sep 4 15:42:30.761727 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 4 15:42:30.765385 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 4 15:42:30.765402 kernel: BTRFS info (device sda6): enabling free space tree Sep 4 15:42:30.769388 kernel: BTRFS info (device sda6): last unmount of filesystem acf6a5d7-9c2b-468d-9430-e8b3ed6a78f4 Sep 4 15:42:30.770672 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 4 15:42:30.771538 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 4 15:42:30.818287 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Sep 4 15:42:30.820400 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 4 15:42:30.898121 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 4 15:42:30.899429 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 4 15:42:30.912096 ignition[730]: Ignition 2.22.0 Sep 4 15:42:30.912102 ignition[730]: Stage: fetch-offline Sep 4 15:42:30.912119 ignition[730]: no configs at "/usr/lib/ignition/base.d" Sep 4 15:42:30.912124 ignition[730]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Sep 4 15:42:30.912171 ignition[730]: parsed url from cmdline: "" Sep 4 15:42:30.912172 ignition[730]: no config URL provided Sep 4 15:42:30.912175 ignition[730]: reading system config file "/usr/lib/ignition/user.ign" Sep 4 15:42:30.912179 ignition[730]: no config at "/usr/lib/ignition/user.ign" Sep 4 15:42:30.912571 ignition[730]: config successfully fetched Sep 4 15:42:30.912589 ignition[730]: parsing config with SHA512: 60b5ffc9d87029eed7eca09a03a35b19b930d920625ddb452b6ad9d5ca5ca2b2a862daac0c1e486308448a655630a78dbe833e19f084ba14d7d080c060b5812f Sep 4 15:42:30.918036 unknown[730]: fetched base config from "system" Sep 4 15:42:30.919517 unknown[730]: fetched user config from "vmware" Sep 4 15:42:30.919726 ignition[730]: fetch-offline: fetch-offline passed Sep 4 15:42:30.919760 ignition[730]: Ignition finished successfully Sep 4 15:42:30.921142 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 4 15:42:30.928296 systemd-networkd[867]: lo: Link UP Sep 4 15:42:30.928302 systemd-networkd[867]: lo: Gained carrier Sep 4 15:42:30.929008 systemd-networkd[867]: Enumeration completed Sep 4 15:42:30.929051 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 4 15:42:30.929183 systemd[1]: Reached target network.target - Network. Sep 4 15:42:30.929553 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Sep 4 15:42:30.929960 systemd-networkd[867]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network. 
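The fetch-offline stage above reports the fingerprint of the rendered config as a SHA512 digest ("parsing config with SHA512: ...") before noting that it merged the base config from "system" and the user config from "vmware" (i.e. guestinfo, not a file on disk). Purely as orientation, a minimal Python sketch of computing such a fingerprint over a config file follows; the /usr/lib/ignition/user.ign path is the one probed in the log, and the assumption that the digest is taken over the raw bytes of the rendered config is mine, not something the log states.

    # Sketch: SHA512 fingerprint of an Ignition config file, analogous to the
    # "parsing config with SHA512: ..." line above.
    # Assumption: the digest is computed over the raw bytes of the rendered config.
    import hashlib
    import sys

    def config_fingerprint(path: str) -> str:
        with open(path, "rb") as fh:
            return hashlib.sha512(fh.read()).hexdigest()

    if __name__ == "__main__":
        # Path taken from the log; on the VMware platform the user config is
        # actually fetched from guestinfo, so this is illustrative only.
        print(config_fingerprint(sys.argv[1] if len(sys.argv) > 1 else "/usr/lib/ignition/user.ign"))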
Sep 4 15:42:30.933170 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Sep 4 15:42:30.933513 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Sep 4 15:42:30.930576 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 4 15:42:30.933228 systemd-networkd[867]: ens192: Link UP Sep 4 15:42:30.933230 systemd-networkd[867]: ens192: Gained carrier Sep 4 15:42:30.951045 ignition[871]: Ignition 2.22.0 Sep 4 15:42:30.951311 ignition[871]: Stage: kargs Sep 4 15:42:30.951513 ignition[871]: no configs at "/usr/lib/ignition/base.d" Sep 4 15:42:30.951645 ignition[871]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Sep 4 15:42:30.952335 ignition[871]: kargs: kargs passed Sep 4 15:42:30.952358 ignition[871]: Ignition finished successfully Sep 4 15:42:30.953870 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 4 15:42:30.954528 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Sep 4 15:42:30.972343 ignition[878]: Ignition 2.22.0 Sep 4 15:42:30.972594 ignition[878]: Stage: disks Sep 4 15:42:30.972777 ignition[878]: no configs at "/usr/lib/ignition/base.d" Sep 4 15:42:30.972907 ignition[878]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Sep 4 15:42:30.973489 ignition[878]: disks: disks passed Sep 4 15:42:30.973624 ignition[878]: Ignition finished successfully Sep 4 15:42:30.974440 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 4 15:42:30.974777 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 4 15:42:30.974919 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 4 15:42:30.975109 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 4 15:42:30.975298 systemd[1]: Reached target sysinit.target - System Initialization. Sep 4 15:42:30.975475 systemd[1]: Reached target basic.target - Basic System. Sep 4 15:42:30.976133 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 4 15:42:30.995630 systemd-fsck[886]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Sep 4 15:42:30.997302 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 4 15:42:30.997934 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 4 15:42:31.086259 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 4 15:42:31.086493 kernel: EXT4-fs (sda9): mounted filesystem b9579306-9cef-42ea-893b-17169f1ea8af r/w with ordered data mode. Quota mode: none. Sep 4 15:42:31.086603 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 4 15:42:31.087532 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 4 15:42:31.088187 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 4 15:42:31.089603 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Sep 4 15:42:31.089806 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 4 15:42:31.090019 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 4 15:42:31.098539 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 4 15:42:31.099242 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Sep 4 15:42:31.104396 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (894) Sep 4 15:42:31.106745 kernel: BTRFS info (device sda6): first mount of filesystem acf6a5d7-9c2b-468d-9430-e8b3ed6a78f4 Sep 4 15:42:31.106767 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 4 15:42:31.111020 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 4 15:42:31.111039 kernel: BTRFS info (device sda6): enabling free space tree Sep 4 15:42:31.112254 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 4 15:42:31.137563 initrd-setup-root[918]: cut: /sysroot/etc/passwd: No such file or directory Sep 4 15:42:31.140113 initrd-setup-root[925]: cut: /sysroot/etc/group: No such file or directory Sep 4 15:42:31.142574 initrd-setup-root[932]: cut: /sysroot/etc/shadow: No such file or directory Sep 4 15:42:31.144518 initrd-setup-root[939]: cut: /sysroot/etc/gshadow: No such file or directory Sep 4 15:42:31.201529 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 4 15:42:31.202300 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 4 15:42:31.203457 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 4 15:42:31.218523 kernel: BTRFS info (device sda6): last unmount of filesystem acf6a5d7-9c2b-468d-9430-e8b3ed6a78f4 Sep 4 15:42:31.231350 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Sep 4 15:42:31.235753 ignition[1007]: INFO : Ignition 2.22.0 Sep 4 15:42:31.235753 ignition[1007]: INFO : Stage: mount Sep 4 15:42:31.236062 ignition[1007]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 4 15:42:31.236062 ignition[1007]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Sep 4 15:42:31.236281 ignition[1007]: INFO : mount: mount passed Sep 4 15:42:31.236921 ignition[1007]: INFO : Ignition finished successfully Sep 4 15:42:31.237146 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 4 15:42:31.238026 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 4 15:42:31.718689 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 4 15:42:31.719584 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 4 15:42:31.735341 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1018) Sep 4 15:42:31.735387 kernel: BTRFS info (device sda6): first mount of filesystem acf6a5d7-9c2b-468d-9430-e8b3ed6a78f4 Sep 4 15:42:31.735397 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 4 15:42:31.739614 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 4 15:42:31.739632 kernel: BTRFS info (device sda6): enabling free space tree Sep 4 15:42:31.740691 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Sep 4 15:42:31.755397 ignition[1035]: INFO : Ignition 2.22.0 Sep 4 15:42:31.755397 ignition[1035]: INFO : Stage: files Sep 4 15:42:31.755397 ignition[1035]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 4 15:42:31.755397 ignition[1035]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Sep 4 15:42:31.756019 ignition[1035]: DEBUG : files: compiled without relabeling support, skipping Sep 4 15:42:31.756293 ignition[1035]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 4 15:42:31.756293 ignition[1035]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 4 15:42:31.757700 ignition[1035]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 4 15:42:31.757845 ignition[1035]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 4 15:42:31.757982 ignition[1035]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 4 15:42:31.757919 unknown[1035]: wrote ssh authorized keys file for user: core Sep 4 15:42:31.759681 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Sep 4 15:42:31.759904 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 Sep 4 15:42:31.816347 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 4 15:42:32.056524 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Sep 4 15:42:32.056524 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 4 15:42:32.056524 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Sep 4 15:42:32.056524 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 4 15:42:32.056524 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 4 15:42:32.056524 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 4 15:42:32.056524 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 4 15:42:32.056524 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 4 15:42:32.056524 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 4 15:42:32.059101 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 4 15:42:32.059101 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 4 15:42:32.059101 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Sep 4 15:42:32.061468 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link 
"/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Sep 4 15:42:32.061468 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Sep 4 15:42:32.061468 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-x86-64.raw: attempt #1 Sep 4 15:42:32.491894 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 4 15:42:32.680527 systemd-networkd[867]: ens192: Gained IPv6LL Sep 4 15:42:32.802577 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Sep 4 15:42:32.802577 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network" Sep 4 15:42:32.803548 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network" Sep 4 15:42:32.803548 ignition[1035]: INFO : files: op(c): [started] processing unit "prepare-helm.service" Sep 4 15:42:32.803979 ignition[1035]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 4 15:42:32.804424 ignition[1035]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 4 15:42:32.804424 ignition[1035]: INFO : files: op(c): [finished] processing unit "prepare-helm.service" Sep 4 15:42:32.804424 ignition[1035]: INFO : files: op(e): [started] processing unit "coreos-metadata.service" Sep 4 15:42:32.804914 ignition[1035]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Sep 4 15:42:32.804914 ignition[1035]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Sep 4 15:42:32.804914 ignition[1035]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service" Sep 4 15:42:32.804914 ignition[1035]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service" Sep 4 15:42:32.826467 ignition[1035]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service" Sep 4 15:42:32.828421 ignition[1035]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service" Sep 4 15:42:32.828589 ignition[1035]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service" Sep 4 15:42:32.828589 ignition[1035]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service" Sep 4 15:42:32.828589 ignition[1035]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service" Sep 4 15:42:32.829684 ignition[1035]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 4 15:42:32.829684 ignition[1035]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 4 15:42:32.829684 ignition[1035]: INFO : files: files passed Sep 4 15:42:32.829684 ignition[1035]: INFO : Ignition finished successfully Sep 4 15:42:32.830296 systemd[1]: 
Finished ignition-files.service - Ignition (files). Sep 4 15:42:32.831219 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 4 15:42:32.832466 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Sep 4 15:42:32.838552 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 4 15:42:32.838606 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 4 15:42:32.842158 initrd-setup-root-after-ignition[1066]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 4 15:42:32.842158 initrd-setup-root-after-ignition[1066]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 4 15:42:32.843036 initrd-setup-root-after-ignition[1070]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 4 15:42:32.843906 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 4 15:42:32.844401 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 4 15:42:32.845086 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 4 15:42:32.864485 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 4 15:42:32.864553 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 4 15:42:32.864833 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 4 15:42:32.865094 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 4 15:42:32.865298 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 4 15:42:32.865768 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 4 15:42:32.876521 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 4 15:42:32.877465 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 4 15:42:32.888009 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 4 15:42:32.888199 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 4 15:42:32.888458 systemd[1]: Stopped target timers.target - Timer Units. Sep 4 15:42:32.888645 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 4 15:42:32.888715 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 4 15:42:32.889099 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 4 15:42:32.889267 systemd[1]: Stopped target basic.target - Basic System. Sep 4 15:42:32.889457 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 4 15:42:32.889656 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 4 15:42:32.889872 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 4 15:42:32.890093 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 4 15:42:32.890300 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 4 15:42:32.890545 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 4 15:42:32.890762 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 4 15:42:32.890998 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 4 15:42:32.891190 systemd[1]: Stopped target swap.target - Swaps. 
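The files stage logged above writes the helm tarball, several manifests, update.conf, a sysext symlink for the kubernetes extension, and unit presets. The actual config is not shown in the log; as a rough, hypothetical sketch only, a fragment that could produce writes like these is given below as a Python dict. The field names assume the commonly documented Ignition v3-style schema, while the paths and URLs are copied from the log entries.

    # Sketch (assumption: Ignition v3-style schema) of a config fragment that
    # would produce file/link/unit operations like the ones logged above.
    ignition_config_fragment = {
        "ignition": {"version": "3.4.0"},
        "storage": {
            "files": [
                {
                    "path": "/opt/helm-v3.17.3-linux-amd64.tar.gz",
                    "contents": {"source": "https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz"},
                },
                {
                    "path": "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw",
                    "contents": {"source": "https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-x86-64.raw"},
                },
            ],
            "links": [
                {
                    "path": "/etc/extensions/kubernetes.raw",
                    "target": "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw",
                },
            ],
        },
        "systemd": {
            "units": [
                {"name": "prepare-helm.service", "enabled": True},   # "setting preset to enabled"
                {"name": "coreos-metadata.service", "enabled": False},  # "setting preset to disabled"
            ],
        },
    }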
Sep 4 15:42:32.891363 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 4 15:42:32.891453 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 4 15:42:32.891801 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 4 15:42:32.891971 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 4 15:42:32.892175 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 4 15:42:32.892245 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 4 15:42:32.892414 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 4 15:42:32.892474 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 4 15:42:32.892756 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 4 15:42:32.892819 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 4 15:42:32.893048 systemd[1]: Stopped target paths.target - Path Units. Sep 4 15:42:32.893196 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 4 15:42:32.893240 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 4 15:42:32.893436 systemd[1]: Stopped target slices.target - Slice Units. Sep 4 15:42:32.893642 systemd[1]: Stopped target sockets.target - Socket Units. Sep 4 15:42:32.893843 systemd[1]: iscsid.socket: Deactivated successfully. Sep 4 15:42:32.893892 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 4 15:42:32.894051 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 4 15:42:32.894095 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 4 15:42:32.894309 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 4 15:42:32.894399 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 4 15:42:32.894646 systemd[1]: ignition-files.service: Deactivated successfully. Sep 4 15:42:32.894705 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 4 15:42:32.896493 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 4 15:42:32.896712 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 4 15:42:32.896881 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 4 15:42:32.898459 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 4 15:42:32.898591 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 4 15:42:32.898664 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 4 15:42:32.898852 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 4 15:42:32.898915 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 4 15:42:32.901415 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 4 15:42:32.901467 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. 
Sep 4 15:42:32.911620 ignition[1090]: INFO : Ignition 2.22.0 Sep 4 15:42:32.911620 ignition[1090]: INFO : Stage: umount Sep 4 15:42:32.911980 ignition[1090]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 4 15:42:32.911980 ignition[1090]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Sep 4 15:42:32.912201 ignition[1090]: INFO : umount: umount passed Sep 4 15:42:32.912317 ignition[1090]: INFO : Ignition finished successfully Sep 4 15:42:32.913300 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 4 15:42:32.913359 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 4 15:42:32.913802 systemd[1]: Stopped target network.target - Network. Sep 4 15:42:32.914533 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 4 15:42:32.914679 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 4 15:42:32.914910 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 4 15:42:32.915028 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 4 15:42:32.915259 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 4 15:42:32.915422 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 4 15:42:32.915616 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 4 15:42:32.915638 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 4 15:42:32.916017 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 4 15:42:32.916448 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 4 15:42:32.918974 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 4 15:42:32.921449 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 4 15:42:32.921652 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 4 15:42:32.923351 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 4 15:42:32.923717 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 4 15:42:32.923752 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 4 15:42:32.925212 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Sep 4 15:42:32.925355 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 4 15:42:32.925424 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 4 15:42:32.926056 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Sep 4 15:42:32.926230 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 4 15:42:32.926707 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 4 15:42:32.926734 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 4 15:42:32.927412 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 4 15:42:32.928417 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 4 15:42:32.928445 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 4 15:42:32.928637 systemd[1]: afterburn-network-kargs.service: Deactivated successfully. Sep 4 15:42:32.928660 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Sep 4 15:42:32.928827 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 4 15:42:32.928852 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. 
Sep 4 15:42:32.930038 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 4 15:42:32.930071 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 4 15:42:32.930294 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 4 15:42:32.931748 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Sep 4 15:42:32.936852 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 4 15:42:32.936946 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 4 15:42:32.938774 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 4 15:42:32.938818 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 4 15:42:32.939062 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 4 15:42:32.939079 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 4 15:42:32.939228 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 4 15:42:32.939253 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 4 15:42:32.939465 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 4 15:42:32.939488 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 4 15:42:32.939755 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 4 15:42:32.939777 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 4 15:42:32.941243 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 4 15:42:32.941560 systemd[1]: systemd-network-generator.service: Deactivated successfully. Sep 4 15:42:32.941586 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Sep 4 15:42:32.943356 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 4 15:42:32.943404 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 4 15:42:32.943695 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 4 15:42:32.943719 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 4 15:42:32.944188 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 4 15:42:32.944239 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 4 15:42:32.951598 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 4 15:42:32.951667 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 4 15:42:32.990018 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 4 15:42:32.990112 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 4 15:42:32.990382 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 4 15:42:32.990505 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 4 15:42:32.990532 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 4 15:42:32.991124 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 4 15:42:33.015093 systemd[1]: Switching root. Sep 4 15:42:33.058151 systemd-journald[243]: Journal stopped Sep 4 15:42:34.196778 systemd-journald[243]: Received SIGTERM from PID 1 (systemd). 
Sep 4 15:42:34.196800 kernel: SELinux: policy capability network_peer_controls=1 Sep 4 15:42:34.196808 kernel: SELinux: policy capability open_perms=1 Sep 4 15:42:34.196814 kernel: SELinux: policy capability extended_socket_class=1 Sep 4 15:42:34.196819 kernel: SELinux: policy capability always_check_network=0 Sep 4 15:42:34.196826 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 4 15:42:34.196832 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 4 15:42:34.196837 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 4 15:42:34.196843 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 4 15:42:34.196848 kernel: SELinux: policy capability userspace_initial_context=0 Sep 4 15:42:34.196855 systemd[1]: Successfully loaded SELinux policy in 58.391ms. Sep 4 15:42:34.196862 kernel: audit: type=1403 audit(1757000553.640:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 4 15:42:34.196869 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 3.579ms. Sep 4 15:42:34.196877 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 4 15:42:34.196884 systemd[1]: Detected virtualization vmware. Sep 4 15:42:34.196890 systemd[1]: Detected architecture x86-64. Sep 4 15:42:34.196898 systemd[1]: Detected first boot. Sep 4 15:42:34.196905 systemd[1]: Initializing machine ID from random generator. Sep 4 15:42:34.196911 zram_generator::config[1133]: No configuration found. Sep 4 15:42:34.196998 kernel: vmw_vmci 0000:00:07.7: Using capabilities 0xc Sep 4 15:42:34.197009 kernel: Guest personality initialized and is active Sep 4 15:42:34.197016 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Sep 4 15:42:34.197022 kernel: Initialized host personality Sep 4 15:42:34.197029 kernel: NET: Registered PF_VSOCK protocol family Sep 4 15:42:34.197036 systemd[1]: Populated /etc with preset unit settings. Sep 4 15:42:34.197044 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Sep 4 15:42:34.197051 systemd[1]: COREOS_CUSTOM_PUBLIC_IPV4=$(ip addr show ens192 | grep -v "inet 10." | grep -Po "inet \K[\d.]+")" > ${OUTPUT}" Sep 4 15:42:34.197058 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Sep 4 15:42:34.197064 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 4 15:42:34.197071 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 4 15:42:34.197079 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 4 15:42:34.197086 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 4 15:42:34.197092 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 4 15:42:34.197099 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 4 15:42:34.197105 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 4 15:42:34.197112 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 4 15:42:34.197119 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. 
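The coreos-metadata.service drop-in quoted in the escape-sequence warning above derives COREOS_CUSTOM_PRIVATE_IPV4 and COREOS_CUSTOM_PUBLIC_IPV4 by grepping "ip addr show ens192" for addresses that do or do not start with "inet 10.". A minimal Python sketch of the same classification follows; the interface name and the 10.0.0.0/8 convention are taken from the quoted command, everything else (function name, use of the iproute2 JSON output) is illustrative.

    # Sketch: classify an interface's IPv4 addresses the way the quoted drop-in
    # does -- anything in 10.0.0.0/8 counts as "private", the rest as "public".
    import ipaddress
    import json
    import subprocess

    def interface_ipv4(ifname: str = "ens192") -> dict:
        out = subprocess.run(
            ["ip", "-json", "addr", "show", ifname],
            capture_output=True, text=True, check=True,
        ).stdout
        private, public = None, None
        for entry in json.loads(out):
            for addr in entry.get("addr_info", []):
                if addr.get("family") != "inet":
                    continue
                ip = ipaddress.ip_address(addr["local"])
                if ip in ipaddress.ip_network("10.0.0.0/8"):
                    private = str(ip)
                else:
                    public = str(ip)
        return {"COREOS_CUSTOM_PRIVATE_IPV4": private, "COREOS_CUSTOM_PUBLIC_IPV4": public}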
Sep 4 15:42:34.197127 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 4 15:42:34.197133 systemd[1]: Created slice user.slice - User and Session Slice. Sep 4 15:42:34.197141 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 4 15:42:34.197149 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 4 15:42:34.197156 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Sep 4 15:42:34.197163 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 4 15:42:34.197170 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 4 15:42:34.197177 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 4 15:42:34.197184 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Sep 4 15:42:34.197191 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 4 15:42:34.197198 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 4 15:42:34.197204 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 4 15:42:34.197211 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 4 15:42:34.197218 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 4 15:42:34.197225 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 4 15:42:34.197232 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 4 15:42:34.197240 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 4 15:42:34.197246 systemd[1]: Reached target slices.target - Slice Units. Sep 4 15:42:34.197253 systemd[1]: Reached target swap.target - Swaps. Sep 4 15:42:34.197259 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 4 15:42:34.197266 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 4 15:42:34.197274 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Sep 4 15:42:34.197281 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 4 15:42:34.197288 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 4 15:42:34.197295 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 4 15:42:34.197302 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 4 15:42:34.197309 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 4 15:42:34.197315 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 4 15:42:34.197322 systemd[1]: Mounting media.mount - External Media Directory... Sep 4 15:42:34.197330 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 4 15:42:34.197337 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 4 15:42:34.197344 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 4 15:42:34.197351 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 4 15:42:34.197358 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). 
Sep 4 15:42:34.197365 systemd[1]: Reached target machines.target - Containers. Sep 4 15:42:34.199388 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Sep 4 15:42:34.199401 systemd[1]: Starting ignition-delete-config.service - Ignition (delete config)... Sep 4 15:42:34.199411 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 4 15:42:34.199418 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Sep 4 15:42:34.199425 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 4 15:42:34.199432 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 4 15:42:34.199440 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 4 15:42:34.199447 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 4 15:42:34.199454 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 4 15:42:34.199461 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 4 15:42:34.199469 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 4 15:42:34.199476 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 4 15:42:34.199483 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 4 15:42:34.199490 systemd[1]: Stopped systemd-fsck-usr.service. Sep 4 15:42:34.199497 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 4 15:42:34.199504 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 4 15:42:34.199511 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 4 15:42:34.199518 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 4 15:42:34.199525 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 4 15:42:34.199534 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Sep 4 15:42:34.199541 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 4 15:42:34.199548 systemd[1]: verity-setup.service: Deactivated successfully. Sep 4 15:42:34.199555 systemd[1]: Stopped verity-setup.service. Sep 4 15:42:34.199562 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 4 15:42:34.199569 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 4 15:42:34.199576 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 4 15:42:34.199583 systemd[1]: Mounted media.mount - External Media Directory. Sep 4 15:42:34.199591 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 4 15:42:34.199598 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 4 15:42:34.199605 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 4 15:42:34.199612 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 4 15:42:34.199620 systemd[1]: modprobe@configfs.service: Deactivated successfully. 
Sep 4 15:42:34.199627 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 4 15:42:34.199633 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 4 15:42:34.199655 systemd-journald[1226]: Collecting audit messages is disabled. Sep 4 15:42:34.199673 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 4 15:42:34.199680 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 4 15:42:34.199687 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 4 15:42:34.199694 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 4 15:42:34.199702 systemd-journald[1226]: Journal started Sep 4 15:42:34.199719 systemd-journald[1226]: Runtime Journal (/run/log/journal/330aef6b32ee4e4ea92c5bb17f732978) is 4.8M, max 38.8M, 34M free. Sep 4 15:42:34.036519 systemd[1]: Queued start job for default target multi-user.target. Sep 4 15:42:34.047420 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Sep 4 15:42:34.047654 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 4 15:42:34.202139 jq[1203]: true Sep 4 15:42:34.203394 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 4 15:42:34.205010 systemd[1]: Started systemd-journald.service - Journal Service. Sep 4 15:42:34.205032 jq[1246]: true Sep 4 15:42:34.212382 kernel: fuse: init (API version 7.41) Sep 4 15:42:34.212714 systemd[1]: modprobe@fuse.service: Deactivated successfully. Sep 4 15:42:34.212849 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 4 15:42:34.213150 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Sep 4 15:42:34.213738 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 4 15:42:34.215848 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 4 15:42:34.219541 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 4 15:42:34.221993 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 4 15:42:34.224409 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Sep 4 15:42:34.224541 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 4 15:42:34.224560 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 4 15:42:34.225231 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Sep 4 15:42:34.233005 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 4 15:42:34.233191 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 4 15:42:34.234446 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 4 15:42:34.236503 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Sep 4 15:42:34.236643 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 4 15:42:34.239138 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Sep 4 15:42:34.249900 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... 
Sep 4 15:42:34.251623 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 4 15:42:34.256041 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 4 15:42:34.257136 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 4 15:42:34.257689 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Sep 4 15:42:34.269495 systemd-journald[1226]: Time spent on flushing to /var/log/journal/330aef6b32ee4e4ea92c5bb17f732978 is 24.493ms for 1752 entries. Sep 4 15:42:34.269495 systemd-journald[1226]: System Journal (/var/log/journal/330aef6b32ee4e4ea92c5bb17f732978) is 8M, max 584.8M, 576.8M free. Sep 4 15:42:34.397688 systemd-journald[1226]: Received client request to flush runtime journal. Sep 4 15:42:34.397727 kernel: ACPI: bus type drm_connector registered Sep 4 15:42:34.397742 kernel: loop: module loaded Sep 4 15:42:34.397753 kernel: loop0: detected capacity change from 0 to 229808 Sep 4 15:42:34.271910 ignition[1250]: Ignition 2.22.0 Sep 4 15:42:34.339326 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 4 15:42:34.272074 ignition[1250]: deleting config from guestinfo properties Sep 4 15:42:34.339821 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 4 15:42:34.340939 ignition[1250]: Successfully deleted config Sep 4 15:42:34.340610 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 4 15:42:34.340978 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 4 15:42:34.341325 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 4 15:42:34.342001 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 4 15:42:34.347462 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Sep 4 15:42:34.347612 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 4 15:42:34.347846 systemd[1]: Finished ignition-delete-config.service - Ignition (delete config). Sep 4 15:42:34.376838 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 4 15:42:34.399407 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 4 15:42:34.404083 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Sep 4 15:42:34.420038 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 4 15:42:34.426254 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 4 15:42:34.426381 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 4 15:42:34.441451 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 4 15:42:34.446385 kernel: loop1: detected capacity change from 0 to 2960 Sep 4 15:42:34.454034 systemd-tmpfiles[1301]: ACLs are not supported, ignoring. Sep 4 15:42:34.454228 systemd-tmpfiles[1301]: ACLs are not supported, ignoring. Sep 4 15:42:34.457050 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. 
Sep 4 15:42:34.473945 kernel: loop2: detected capacity change from 0 to 128016 Sep 4 15:42:34.506645 kernel: loop3: detected capacity change from 0 to 110984 Sep 4 15:42:34.537607 kernel: loop4: detected capacity change from 0 to 229808 Sep 4 15:42:34.568413 kernel: loop5: detected capacity change from 0 to 2960 Sep 4 15:42:34.592098 kernel: loop6: detected capacity change from 0 to 128016 Sep 4 15:42:34.808395 kernel: loop7: detected capacity change from 0 to 110984 Sep 4 15:42:34.981641 (sd-merge)[1308]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-vmware'. Sep 4 15:42:34.981911 (sd-merge)[1308]: Merged extensions into '/usr'. Sep 4 15:42:34.985974 systemd[1]: Reload requested from client PID 1281 ('systemd-sysext') (unit systemd-sysext.service)... Sep 4 15:42:34.985984 systemd[1]: Reloading... Sep 4 15:42:35.067499 zram_generator::config[1334]: No configuration found. Sep 4 15:42:35.155579 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Sep 4 15:42:35.200118 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 4 15:42:35.200364 systemd[1]: Reloading finished in 214 ms. Sep 4 15:42:35.216285 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 4 15:42:35.216625 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 4 15:42:35.223120 systemd[1]: Starting ensure-sysext.service... Sep 4 15:42:35.225436 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 4 15:42:35.226347 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 4 15:42:35.240882 systemd[1]: Reload requested from client PID 1390 ('systemctl') (unit ensure-sysext.service)... Sep 4 15:42:35.240892 systemd[1]: Reloading... Sep 4 15:42:35.243151 ldconfig[1271]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 4 15:42:35.244323 systemd-tmpfiles[1391]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Sep 4 15:42:35.244342 systemd-tmpfiles[1391]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Sep 4 15:42:35.245132 systemd-tmpfiles[1391]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 4 15:42:35.245293 systemd-tmpfiles[1391]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 4 15:42:35.245787 systemd-tmpfiles[1391]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 4 15:42:35.245948 systemd-tmpfiles[1391]: ACLs are not supported, ignoring. Sep 4 15:42:35.245981 systemd-tmpfiles[1391]: ACLs are not supported, ignoring. Sep 4 15:42:35.248140 systemd-tmpfiles[1391]: Detected autofs mount point /boot during canonicalization of boot. Sep 4 15:42:35.248146 systemd-tmpfiles[1391]: Skipping /boot Sep 4 15:42:35.251820 systemd-tmpfiles[1391]: Detected autofs mount point /boot during canonicalization of boot. Sep 4 15:42:35.251827 systemd-tmpfiles[1391]: Skipping /boot Sep 4 15:42:35.259940 systemd-udevd[1392]: Using default interface naming scheme 'v255'. Sep 4 15:42:35.294891 zram_generator::config[1430]: No configuration found. 
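The (sd-merge) entries above show systemd-sysext picking up the 'containerd-flatcar', 'docker-flatcar', 'kubernetes', and 'oem-vmware' extensions and merging them into /usr, which triggers the reload that follows. A small sketch that merely enumerates candidate extension images and symlinks is given below; the search-directory list is an assumption based on how systemd-sysext is commonly configured, not something stated in the log.

    # Sketch: list sysext images/symlinks to eyeball what systemd-sysext would
    # consider for merging. The directory list is an assumption.
    from pathlib import Path

    SEARCH_DIRS = ["/etc/extensions", "/run/extensions", "/var/lib/extensions"]

    def list_extensions() -> None:
        for d in map(Path, SEARCH_DIRS):
            if not d.is_dir():
                continue
            for entry in sorted(d.iterdir()):
                target = entry.resolve() if entry.is_symlink() else entry
                print(f"{entry}  ->  {target}")

    if __name__ == "__main__":
        list_extensions()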
Sep 4 15:42:35.416397 kernel: mousedev: PS/2 mouse device common for all mice Sep 4 15:42:35.430212 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Sep 4 15:42:35.436395 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Sep 4 15:42:35.463402 kernel: ACPI: button: Power Button [PWRF] Sep 4 15:42:35.489870 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Sep 4 15:42:35.489930 systemd[1]: Reloading finished in 248 ms. Sep 4 15:42:35.495193 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 4 15:42:35.495777 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 4 15:42:35.501502 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 4 15:42:35.520647 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Sep 4 15:42:35.521882 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 4 15:42:35.524503 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 4 15:42:35.526786 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 4 15:42:35.532987 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 4 15:42:35.535242 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 4 15:42:35.538656 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 4 15:42:35.541199 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 4 15:42:35.543354 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 4 15:42:35.546919 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 4 15:42:35.556615 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 4 15:42:35.561696 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 4 15:42:35.561949 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 4 15:42:35.562065 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 4 15:42:35.562174 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 4 15:42:35.568354 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 4 15:42:35.574347 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 4 15:42:35.577631 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 4 15:42:35.577942 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 4 15:42:35.580890 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 4 15:42:35.582525 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... 
Sep 4 15:42:35.584519 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 4 15:42:35.584937 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 4 15:42:35.585056 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 4 15:42:35.588261 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 4 15:42:35.588487 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 4 15:42:35.589058 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 4 15:42:35.589516 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 4 15:42:35.589915 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 4 15:42:35.590237 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 4 15:42:35.592874 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 4 15:42:35.593830 systemd[1]: Finished ensure-sysext.service. Sep 4 15:42:35.596843 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 4 15:42:35.602385 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Sep 4 15:42:35.605504 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 4 15:42:35.605874 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 4 15:42:35.606012 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 4 15:42:35.608235 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 4 15:42:35.608617 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 4 15:42:35.610229 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 4 15:42:35.615620 kernel: piix4_smbus 0000:00:07.3: SMBus Host Controller not enabled! Sep 4 15:42:35.619922 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 4 15:42:35.620178 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 4 15:42:35.627546 augenrules[1560]: No rules Sep 4 15:42:35.628892 systemd[1]: audit-rules.service: Deactivated successfully. Sep 4 15:42:35.629338 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 4 15:42:35.632698 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 4 15:42:35.651414 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 4 15:42:35.711904 (udev-worker)[1422]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte. Sep 4 15:42:35.729981 systemd-resolved[1519]: Positive Trust Anchors: Sep 4 15:42:35.729995 systemd-resolved[1519]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 4 15:42:35.730019 systemd-resolved[1519]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 4 15:42:35.746691 systemd-resolved[1519]: Defaulting to hostname 'linux'. Sep 4 15:42:35.749854 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 4 15:42:35.750025 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 4 15:42:35.759320 systemd-networkd[1518]: lo: Link UP Sep 4 15:42:35.759326 systemd-networkd[1518]: lo: Gained carrier Sep 4 15:42:35.760225 systemd-networkd[1518]: Enumeration completed Sep 4 15:42:35.760294 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 4 15:42:35.760536 systemd[1]: Reached target network.target - Network. Sep 4 15:42:35.760823 systemd-networkd[1518]: ens192: Configuring with /etc/systemd/network/00-vmware.network. Sep 4 15:42:35.762333 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Sep 4 15:42:35.763492 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Sep 4 15:42:35.763634 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Sep 4 15:42:35.763527 systemd-networkd[1518]: ens192: Link UP Sep 4 15:42:35.763831 systemd-networkd[1518]: ens192: Gained carrier Sep 4 15:42:35.766529 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 4 15:42:35.767478 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Sep 4 15:42:35.767631 systemd[1]: Reached target sysinit.target - System Initialization. Sep 4 15:42:35.767770 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 4 15:42:35.767887 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 4 15:42:35.767991 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Sep 4 15:42:35.768093 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 4 15:42:35.768192 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 4 15:42:35.768206 systemd[1]: Reached target paths.target - Path Units. Sep 4 15:42:35.768285 systemd[1]: Reached target time-set.target - System Time Set. Sep 4 15:42:35.768533 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 4 15:42:35.769671 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 4 15:42:35.769773 systemd[1]: Reached target timers.target - Timer Units. Sep 4 15:42:35.770764 systemd-timesyncd[1552]: Network configuration changed, trying to establish connection. Sep 4 15:42:35.780311 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 4 15:42:35.783298 systemd[1]: Starting docker.socket - Docker Socket for the API... 
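Note: at this point ens192 has been matched by /etc/systemd/network/00-vmware.network and systemd-resolved is running with its built-in root trust anchor and the negative trust anchors listed above. A short sketch of how that state can be checked interactively (standard systemd CLIs; the link and file names are the ones in the log, the query name is just a placeholder):

    networkctl status ens192      # link state, addresses, and which .network file matched
    resolvectl status             # per-link DNS configuration and trust-anchor handling
    resolvectl query example.org  # exercise the resolver end to end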
Sep 4 15:42:35.786607 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Sep 4 15:42:35.786858 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Sep 4 15:42:35.786970 systemd[1]: Reached target ssh-access.target - SSH Access Available. Sep 4 15:42:35.794852 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 4 15:42:35.797711 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 4 15:42:35.798857 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 4 15:44:08.340555 systemd-resolved[1519]: Clock change detected. Flushing caches. Sep 4 15:44:08.340637 systemd-timesyncd[1552]: Contacted time server 23.186.168.125:123 (0.flatcar.pool.ntp.org). Sep 4 15:44:08.340667 systemd-timesyncd[1552]: Initial clock synchronization to Thu 2025-09-04 15:44:08.340523 UTC. Sep 4 15:44:08.352115 systemd[1]: Reached target sockets.target - Socket Units. Sep 4 15:44:08.352270 systemd[1]: Reached target basic.target - Basic System. Sep 4 15:44:08.352404 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 4 15:44:08.352424 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 4 15:44:08.354905 systemd[1]: Starting containerd.service - containerd container runtime... Sep 4 15:44:08.356589 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 4 15:44:08.359364 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 4 15:44:08.363327 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 4 15:44:08.367046 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 4 15:44:08.367183 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 4 15:44:08.370497 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Sep 4 15:44:08.373343 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 4 15:44:08.375171 jq[1598]: false Sep 4 15:44:08.376285 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 4 15:44:08.382040 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 4 15:44:08.384310 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 4 15:44:08.387388 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 4 15:44:08.389741 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 4 15:44:08.390403 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Sep 4 15:44:08.391043 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 4 15:44:08.392319 systemd[1]: Starting update-engine.service - Update Engine... Sep 4 15:44:08.394011 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 4 15:44:08.395007 systemd[1]: Starting vgauthd.service - VGAuth Service for open-vm-tools... Sep 4 15:44:08.396252 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. 
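Note: the jump from 15:42 to 15:44 in the timestamps is the initial clock synchronization recorded above; systemd-timesyncd reached 0.flatcar.pool.ntp.org and systemd-resolved flushed its caches in response. The same information is available interactively (standard timedatectl subcommands; output is host-specific):

    timedatectl status           # whether NTP is active and the clock is synchronized
    timedatectl timesync-status  # the server in use, poll interval, and measured offset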
Sep 4 15:44:08.401850 extend-filesystems[1599]: Found /dev/sda6 Sep 4 15:44:08.406539 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 4 15:44:08.406967 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 4 15:44:08.407455 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 4 15:44:08.411644 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 4 15:44:08.411800 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 4 15:44:08.419475 jq[1611]: true Sep 4 15:44:08.420141 google_oslogin_nss_cache[1600]: oslogin_cache_refresh[1600]: Refreshing passwd entry cache Sep 4 15:44:08.421207 extend-filesystems[1599]: Found /dev/sda9 Sep 4 15:44:08.421562 oslogin_cache_refresh[1600]: Refreshing passwd entry cache Sep 4 15:44:08.423329 extend-filesystems[1599]: Checking size of /dev/sda9 Sep 4 15:44:08.435223 (ntainerd)[1628]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 4 15:44:08.436409 systemd[1]: Started vgauthd.service - VGAuth Service for open-vm-tools. Sep 4 15:44:08.437761 systemd[1]: Starting vmtoolsd.service - Service for virtual machines hosted on VMware... Sep 4 15:44:08.446757 update_engine[1610]: I20250904 15:44:08.445698 1610 main.cc:92] Flatcar Update Engine starting Sep 4 15:44:08.445774 oslogin_cache_refresh[1600]: Failure getting users, quitting Sep 4 15:44:08.446953 google_oslogin_nss_cache[1600]: oslogin_cache_refresh[1600]: Failure getting users, quitting Sep 4 15:44:08.446953 google_oslogin_nss_cache[1600]: oslogin_cache_refresh[1600]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 4 15:44:08.446953 google_oslogin_nss_cache[1600]: oslogin_cache_refresh[1600]: Refreshing group entry cache Sep 4 15:44:08.445785 oslogin_cache_refresh[1600]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 4 15:44:08.445816 oslogin_cache_refresh[1600]: Refreshing group entry cache Sep 4 15:44:08.455273 jq[1631]: true Sep 4 15:44:08.453190 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 4 15:44:08.452765 oslogin_cache_refresh[1600]: Failure getting groups, quitting Sep 4 15:44:08.455510 google_oslogin_nss_cache[1600]: oslogin_cache_refresh[1600]: Failure getting groups, quitting Sep 4 15:44:08.455510 google_oslogin_nss_cache[1600]: oslogin_cache_refresh[1600]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 4 15:44:08.455538 extend-filesystems[1599]: Old size kept for /dev/sda9 Sep 4 15:44:08.454265 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 4 15:44:08.452773 oslogin_cache_refresh[1600]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 4 15:44:08.486668 unknown[1638]: Pref_Init: Using '/etc/vmware-tools/vgauth.conf' as preferences filepath Sep 4 15:44:08.487168 unknown[1638]: Core dump limit set to -1 Sep 4 15:44:08.507903 systemd-logind[1608]: Watching system buttons on /dev/input/event2 (Power Button) Sep 4 15:44:08.508082 systemd-logind[1608]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Sep 4 15:44:08.508603 systemd-logind[1608]: New seat seat0. 
Sep 4 15:44:08.565710 sshd_keygen[1646]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 4 15:44:08.595190 dbus-daemon[1596]: [system] SELinux support is enabled Sep 4 15:44:08.598743 update_engine[1610]: I20250904 15:44:08.598703 1610 update_check_scheduler.cc:74] Next update check in 11m31s Sep 4 15:44:08.825512 containerd[1628]: time="2025-09-04T15:44:08Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 4 15:44:08.826252 containerd[1628]: time="2025-09-04T15:44:08.825864102Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 4 15:44:08.830947 containerd[1628]: time="2025-09-04T15:44:08.830932057Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="4.619µs" Sep 4 15:44:08.831670 containerd[1628]: time="2025-09-04T15:44:08.830985814Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 4 15:44:08.831670 containerd[1628]: time="2025-09-04T15:44:08.830999837Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 4 15:44:08.831670 containerd[1628]: time="2025-09-04T15:44:08.831071087Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 4 15:44:08.831670 containerd[1628]: time="2025-09-04T15:44:08.831080714Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 4 15:44:08.831670 containerd[1628]: time="2025-09-04T15:44:08.831126978Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 4 15:44:08.831670 containerd[1628]: time="2025-09-04T15:44:08.831163185Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 4 15:44:08.831670 containerd[1628]: time="2025-09-04T15:44:08.831170325Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 4 15:44:08.831670 containerd[1628]: time="2025-09-04T15:44:08.831291320Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 4 15:44:08.831670 containerd[1628]: time="2025-09-04T15:44:08.831301530Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 4 15:44:08.831670 containerd[1628]: time="2025-09-04T15:44:08.831307922Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 4 15:44:08.831670 containerd[1628]: time="2025-09-04T15:44:08.831312270Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 4 15:44:08.831670 containerd[1628]: time="2025-09-04T15:44:08.831357049Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 4 15:44:08.831845 containerd[1628]: time="2025-09-04T15:44:08.831469940Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs 
type=io.containerd.snapshotter.v1 Sep 4 15:44:08.831845 containerd[1628]: time="2025-09-04T15:44:08.831484759Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 4 15:44:08.831845 containerd[1628]: time="2025-09-04T15:44:08.831490862Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 4 15:44:08.831845 containerd[1628]: time="2025-09-04T15:44:08.831508091Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 4 15:44:08.831845 containerd[1628]: time="2025-09-04T15:44:08.831689193Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 4 15:44:08.831845 containerd[1628]: time="2025-09-04T15:44:08.831736659Z" level=info msg="metadata content store policy set" policy=shared Sep 4 15:44:08.874142 containerd[1628]: time="2025-09-04T15:44:08.874113645Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 4 15:44:08.874254 containerd[1628]: time="2025-09-04T15:44:08.874242159Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 4 15:44:08.874299 containerd[1628]: time="2025-09-04T15:44:08.874291088Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 4 15:44:08.874340 containerd[1628]: time="2025-09-04T15:44:08.874332837Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 4 15:44:08.874447 containerd[1628]: time="2025-09-04T15:44:08.874435174Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 4 15:44:08.874484 containerd[1628]: time="2025-09-04T15:44:08.874477094Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 4 15:44:08.874517 containerd[1628]: time="2025-09-04T15:44:08.874510701Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 4 15:44:08.874560 containerd[1628]: time="2025-09-04T15:44:08.874551974Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 4 15:44:08.874620 containerd[1628]: time="2025-09-04T15:44:08.874612129Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 4 15:44:08.874655 containerd[1628]: time="2025-09-04T15:44:08.874648067Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 4 15:44:08.874688 containerd[1628]: time="2025-09-04T15:44:08.874680865Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 4 15:44:08.874727 containerd[1628]: time="2025-09-04T15:44:08.874719213Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 4 15:44:08.874834 containerd[1628]: time="2025-09-04T15:44:08.874824657Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 4 15:44:08.874874 containerd[1628]: time="2025-09-04T15:44:08.874866944Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 4 15:44:08.876224 containerd[1628]: 
time="2025-09-04T15:44:08.874910483Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 4 15:44:08.876224 containerd[1628]: time="2025-09-04T15:44:08.874921282Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 4 15:44:08.876224 containerd[1628]: time="2025-09-04T15:44:08.874930323Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 4 15:44:08.876224 containerd[1628]: time="2025-09-04T15:44:08.874937004Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 4 15:44:08.876224 containerd[1628]: time="2025-09-04T15:44:08.874947759Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 4 15:44:08.876224 containerd[1628]: time="2025-09-04T15:44:08.874953886Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 4 15:44:08.876224 containerd[1628]: time="2025-09-04T15:44:08.874960327Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 4 15:44:08.876224 containerd[1628]: time="2025-09-04T15:44:08.874966819Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 4 15:44:08.876224 containerd[1628]: time="2025-09-04T15:44:08.874972610Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 4 15:44:08.876224 containerd[1628]: time="2025-09-04T15:44:08.875011776Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 4 15:44:08.876224 containerd[1628]: time="2025-09-04T15:44:08.875020964Z" level=info msg="Start snapshots syncer" Sep 4 15:44:08.876224 containerd[1628]: time="2025-09-04T15:44:08.875041574Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 4 15:44:08.876453 containerd[1628]: time="2025-09-04T15:44:08.875199180Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 4 15:44:08.876453 containerd[1628]: time="2025-09-04T15:44:08.875245264Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 4 15:44:08.876554 containerd[1628]: time="2025-09-04T15:44:08.875285965Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 4 15:44:08.876554 containerd[1628]: time="2025-09-04T15:44:08.875342770Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 4 15:44:08.876554 containerd[1628]: time="2025-09-04T15:44:08.875355592Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 4 15:44:08.876554 containerd[1628]: time="2025-09-04T15:44:08.875362479Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 4 15:44:08.876554 containerd[1628]: time="2025-09-04T15:44:08.875368707Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 4 15:44:08.876554 containerd[1628]: time="2025-09-04T15:44:08.875375081Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 4 15:44:08.876554 containerd[1628]: time="2025-09-04T15:44:08.875384634Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 4 15:44:08.876554 containerd[1628]: time="2025-09-04T15:44:08.875392860Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 4 15:44:08.876554 containerd[1628]: time="2025-09-04T15:44:08.875407086Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 4 15:44:08.876554 containerd[1628]: 
time="2025-09-04T15:44:08.875413665Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 4 15:44:08.876554 containerd[1628]: time="2025-09-04T15:44:08.875420378Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 4 15:44:08.876554 containerd[1628]: time="2025-09-04T15:44:08.875441915Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 4 15:44:08.876554 containerd[1628]: time="2025-09-04T15:44:08.875451002Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 4 15:44:08.876554 containerd[1628]: time="2025-09-04T15:44:08.875456449Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 4 15:44:08.876758 containerd[1628]: time="2025-09-04T15:44:08.875461917Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 4 15:44:08.876758 containerd[1628]: time="2025-09-04T15:44:08.875466880Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 4 15:44:08.876758 containerd[1628]: time="2025-09-04T15:44:08.875472296Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 4 15:44:08.876758 containerd[1628]: time="2025-09-04T15:44:08.875478592Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 4 15:44:08.876758 containerd[1628]: time="2025-09-04T15:44:08.875488551Z" level=info msg="runtime interface created" Sep 4 15:44:08.876758 containerd[1628]: time="2025-09-04T15:44:08.875491607Z" level=info msg="created NRI interface" Sep 4 15:44:08.876758 containerd[1628]: time="2025-09-04T15:44:08.875496340Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 4 15:44:08.876758 containerd[1628]: time="2025-09-04T15:44:08.875502141Z" level=info msg="Connect containerd service" Sep 4 15:44:08.876758 containerd[1628]: time="2025-09-04T15:44:08.875526096Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 4 15:44:08.876758 containerd[1628]: time="2025-09-04T15:44:08.876249805Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 4 15:44:08.948634 systemd[1]: Started systemd-logind.service - User Login Management. Sep 4 15:44:08.949286 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 4 15:44:08.951017 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Sep 4 15:44:08.951211 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Sep 4 15:44:08.951504 systemd[1]: motdgen.service: Deactivated successfully. Sep 4 15:44:08.951675 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 4 15:44:08.951972 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 4 15:44:08.952268 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 4 15:44:08.953276 systemd[1]: Started vmtoolsd.service - Service for virtual machines hosted on VMware. 
Sep 4 15:44:08.963347 dbus-daemon[1596]: [system] Successfully activated service 'org.freedesktop.systemd1' Sep 4 15:44:08.964193 tar[1617]: linux-amd64/LICENSE Sep 4 15:44:08.965735 tar[1617]: linux-amd64/helm Sep 4 15:44:08.970891 systemd[1]: Started update-engine.service - Update Engine. Sep 4 15:44:08.976730 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 4 15:44:08.976874 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 4 15:44:08.976959 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 4 15:44:08.977303 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 4 15:44:08.977498 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 4 15:44:08.986926 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 4 15:44:08.997280 systemd[1]: issuegen.service: Deactivated successfully. Sep 4 15:44:08.997418 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 4 15:44:08.999471 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 4 15:44:09.030938 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 4 15:44:09.032848 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 4 15:44:09.033976 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Sep 4 15:44:09.034362 systemd[1]: Reached target getty.target - Login Prompts. Sep 4 15:44:09.048837 containerd[1628]: time="2025-09-04T15:44:09.048742100Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 4 15:44:09.049335 containerd[1628]: time="2025-09-04T15:44:09.049324941Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 4 15:44:09.049388 containerd[1628]: time="2025-09-04T15:44:09.049285301Z" level=info msg="Start subscribing containerd event" Sep 4 15:44:09.049439 containerd[1628]: time="2025-09-04T15:44:09.049422513Z" level=info msg="Start recovering state" Sep 4 15:44:09.049782 containerd[1628]: time="2025-09-04T15:44:09.049689429Z" level=info msg="Start event monitor" Sep 4 15:44:09.049782 containerd[1628]: time="2025-09-04T15:44:09.049700052Z" level=info msg="Start cni network conf syncer for default" Sep 4 15:44:09.049782 containerd[1628]: time="2025-09-04T15:44:09.049704541Z" level=info msg="Start streaming server" Sep 4 15:44:09.049782 containerd[1628]: time="2025-09-04T15:44:09.049708967Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 4 15:44:09.049782 containerd[1628]: time="2025-09-04T15:44:09.049716522Z" level=info msg="runtime interface starting up..." Sep 4 15:44:09.049782 containerd[1628]: time="2025-09-04T15:44:09.049719913Z" level=info msg="starting plugins..." Sep 4 15:44:09.049782 containerd[1628]: time="2025-09-04T15:44:09.049728374Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 4 15:44:09.050234 containerd[1628]: time="2025-09-04T15:44:09.049910646Z" level=info msg="containerd successfully booted in 0.224664s" Sep 4 15:44:09.050052 systemd[1]: Started containerd.service - containerd container runtime. 
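Note: update-engine has scheduled its next update check ("Next update check in 11m31s") and locksmithd, the cluster reboot manager started just above, coordinates any post-update reboot. On Flatcar both expose their state through small CLIs; a sketch, assuming the stock tool names:

    update_engine_client -status   # current operation, e.g. UPDATE_STATUS_IDLE, and any staged version
    locksmithctl status            # reboot-lock semaphore state (available slots and current holders)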
Sep 4 15:44:09.071578 bash[1702]: Updated "/home/core/.ssh/authorized_keys" Sep 4 15:44:09.072036 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 4 15:44:09.072999 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Sep 4 15:44:09.079548 locksmithd[1690]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 4 15:44:09.218999 tar[1617]: linux-amd64/README.md Sep 4 15:44:09.232448 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 4 15:44:10.020406 systemd-networkd[1518]: ens192: Gained IPv6LL Sep 4 15:44:10.021688 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 4 15:44:10.022571 systemd[1]: Reached target network-online.target - Network is Online. Sep 4 15:44:10.023871 systemd[1]: Starting coreos-metadata.service - VMware metadata agent... Sep 4 15:44:10.025531 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 15:44:10.035175 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 4 15:44:10.063756 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 4 15:44:10.076112 systemd[1]: coreos-metadata.service: Deactivated successfully. Sep 4 15:44:10.076309 systemd[1]: Finished coreos-metadata.service - VMware metadata agent. Sep 4 15:44:10.076887 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 4 15:44:10.976949 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 15:44:10.977349 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 4 15:44:10.978210 systemd[1]: Startup finished in 2.727s (kernel) + 5.023s (initrd) + 4.855s (userspace) = 12.605s. Sep 4 15:44:10.983943 (kubelet)[1802]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 4 15:44:11.004328 login[1711]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 4 15:44:11.005416 login[1712]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 4 15:44:11.011876 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 4 15:44:11.013265 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 4 15:44:11.020116 systemd-logind[1608]: New session 1 of user core. Sep 4 15:44:11.023276 systemd-logind[1608]: New session 2 of user core. Sep 4 15:44:11.031595 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 4 15:44:11.035295 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 4 15:44:11.044089 (systemd)[1809]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 4 15:44:11.046107 systemd-logind[1608]: New session c1 of user core. Sep 4 15:44:11.146707 systemd[1809]: Queued start job for default target default.target. Sep 4 15:44:11.156091 systemd[1809]: Created slice app.slice - User Application Slice. Sep 4 15:44:11.156108 systemd[1809]: Reached target paths.target - Paths. Sep 4 15:44:11.156133 systemd[1809]: Reached target timers.target - Timers. Sep 4 15:44:11.157042 systemd[1809]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 4 15:44:11.166570 systemd[1809]: Listening on dbus.socket - D-Bus User Message Bus Socket. 
Sep 4 15:44:11.166733 systemd[1809]: Reached target sockets.target - Sockets. Sep 4 15:44:11.166806 systemd[1809]: Reached target basic.target - Basic System. Sep 4 15:44:11.166881 systemd[1809]: Reached target default.target - Main User Target. Sep 4 15:44:11.166894 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 4 15:44:11.167294 systemd[1809]: Startup finished in 115ms. Sep 4 15:44:11.171290 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 4 15:44:11.171839 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 4 15:44:11.649781 kubelet[1802]: E0904 15:44:11.649744 1802 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 4 15:44:11.651548 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 4 15:44:11.651637 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 4 15:44:11.651993 systemd[1]: kubelet.service: Consumed 640ms CPU time, 265.8M memory peak. Sep 4 15:44:21.901959 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 4 15:44:21.903117 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 15:44:22.367328 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 15:44:22.369905 (kubelet)[1854]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 4 15:44:22.390907 kubelet[1854]: E0904 15:44:22.390861 1854 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 4 15:44:22.393303 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 4 15:44:22.393386 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 4 15:44:22.393684 systemd[1]: kubelet.service: Consumed 93ms CPU time, 110.3M memory peak. Sep 4 15:44:32.643721 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 4 15:44:32.644938 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 15:44:32.969562 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 15:44:32.975610 (kubelet)[1869]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 4 15:44:33.043031 kubelet[1869]: E0904 15:44:33.042989 1869 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 4 15:44:33.044784 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 4 15:44:33.044877 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 4 15:44:33.045271 systemd[1]: kubelet.service: Consumed 96ms CPU time, 108.5M memory peak. 
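Note: the three kubelet failures above (15:44:11, 15:44:22, 15:44:33) are all the same condition: /var/lib/kubelet/config.yaml does not exist yet, so the process exits and systemd schedules another restart. That file is normally written during node bootstrap (for example by kubeadm init or kubeadm join), after which the unit stays up. A minimal sketch for watching this from a shell, using the unit and path names from the log:

    systemctl status kubelet.service                 # restart counter and last exit status
    test -f /var/lib/kubelet/config.yaml && echo "config present" || echo "config not written yet"
    journalctl -u kubelet.service -n 20 --no-pager   # the same "failed to load Kubelet config file" lines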
Sep 4 15:44:39.063976 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 4 15:44:39.065181 systemd[1]: Started sshd@0-139.178.70.109:22-139.178.89.65:48702.service - OpenSSH per-connection server daemon (139.178.89.65:48702). Sep 4 15:44:39.105933 sshd[1877]: Accepted publickey for core from 139.178.89.65 port 48702 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs Sep 4 15:44:39.106820 sshd-session[1877]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:44:39.110259 systemd-logind[1608]: New session 3 of user core. Sep 4 15:44:39.118316 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 4 15:44:39.172450 systemd[1]: Started sshd@1-139.178.70.109:22-139.178.89.65:48708.service - OpenSSH per-connection server daemon (139.178.89.65:48708). Sep 4 15:44:39.212849 sshd[1883]: Accepted publickey for core from 139.178.89.65 port 48708 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs Sep 4 15:44:39.213462 sshd-session[1883]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:44:39.216243 systemd-logind[1608]: New session 4 of user core. Sep 4 15:44:39.222353 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 4 15:44:39.269934 sshd[1886]: Connection closed by 139.178.89.65 port 48708 Sep 4 15:44:39.270712 sshd-session[1883]: pam_unix(sshd:session): session closed for user core Sep 4 15:44:39.274823 systemd[1]: sshd@1-139.178.70.109:22-139.178.89.65:48708.service: Deactivated successfully. Sep 4 15:44:39.275805 systemd[1]: session-4.scope: Deactivated successfully. Sep 4 15:44:39.276596 systemd-logind[1608]: Session 4 logged out. Waiting for processes to exit. Sep 4 15:44:39.278195 systemd[1]: Started sshd@2-139.178.70.109:22-139.178.89.65:48718.service - OpenSSH per-connection server daemon (139.178.89.65:48718). Sep 4 15:44:39.280463 systemd-logind[1608]: Removed session 4. Sep 4 15:44:39.320995 sshd[1892]: Accepted publickey for core from 139.178.89.65 port 48718 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs Sep 4 15:44:39.321691 sshd-session[1892]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:44:39.325910 systemd-logind[1608]: New session 5 of user core. Sep 4 15:44:39.332327 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 4 15:44:39.378635 sshd[1895]: Connection closed by 139.178.89.65 port 48718 Sep 4 15:44:39.379045 sshd-session[1892]: pam_unix(sshd:session): session closed for user core Sep 4 15:44:39.390882 systemd[1]: sshd@2-139.178.70.109:22-139.178.89.65:48718.service: Deactivated successfully. Sep 4 15:44:39.391912 systemd[1]: session-5.scope: Deactivated successfully. Sep 4 15:44:39.392876 systemd-logind[1608]: Session 5 logged out. Waiting for processes to exit. Sep 4 15:44:39.394000 systemd[1]: Started sshd@3-139.178.70.109:22-139.178.89.65:48732.service - OpenSSH per-connection server daemon (139.178.89.65:48732). Sep 4 15:44:39.395985 systemd-logind[1608]: Removed session 5. Sep 4 15:44:39.435498 sshd[1901]: Accepted publickey for core from 139.178.89.65 port 48732 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs Sep 4 15:44:39.436375 sshd-session[1901]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:44:39.439574 systemd-logind[1608]: New session 6 of user core. Sep 4 15:44:39.449313 systemd[1]: Started session-6.scope - Session 6 of User core. 
Sep 4 15:44:39.497004 sshd[1904]: Connection closed by 139.178.89.65 port 48732 Sep 4 15:44:39.497278 sshd-session[1901]: pam_unix(sshd:session): session closed for user core Sep 4 15:44:39.507750 systemd[1]: sshd@3-139.178.70.109:22-139.178.89.65:48732.service: Deactivated successfully. Sep 4 15:44:39.508575 systemd[1]: session-6.scope: Deactivated successfully. Sep 4 15:44:39.509293 systemd-logind[1608]: Session 6 logged out. Waiting for processes to exit. Sep 4 15:44:39.511046 systemd[1]: Started sshd@4-139.178.70.109:22-139.178.89.65:48748.service - OpenSSH per-connection server daemon (139.178.89.65:48748). Sep 4 15:44:39.511700 systemd-logind[1608]: Removed session 6. Sep 4 15:44:39.541855 sshd[1910]: Accepted publickey for core from 139.178.89.65 port 48748 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs Sep 4 15:44:39.542578 sshd-session[1910]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:44:39.545054 systemd-logind[1608]: New session 7 of user core. Sep 4 15:44:39.555334 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 4 15:44:39.611395 sudo[1914]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 4 15:44:39.611558 sudo[1914]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 4 15:44:39.620495 sudo[1914]: pam_unix(sudo:session): session closed for user root Sep 4 15:44:39.621152 sshd[1913]: Connection closed by 139.178.89.65 port 48748 Sep 4 15:44:39.621460 sshd-session[1910]: pam_unix(sshd:session): session closed for user core Sep 4 15:44:39.631775 systemd[1]: sshd@4-139.178.70.109:22-139.178.89.65:48748.service: Deactivated successfully. Sep 4 15:44:39.633144 systemd[1]: session-7.scope: Deactivated successfully. Sep 4 15:44:39.633874 systemd-logind[1608]: Session 7 logged out. Waiting for processes to exit. Sep 4 15:44:39.636127 systemd[1]: Started sshd@5-139.178.70.109:22-139.178.89.65:48754.service - OpenSSH per-connection server daemon (139.178.89.65:48754). Sep 4 15:44:39.636636 systemd-logind[1608]: Removed session 7. Sep 4 15:44:39.672003 sshd[1920]: Accepted publickey for core from 139.178.89.65 port 48754 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs Sep 4 15:44:39.672828 sshd-session[1920]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:44:39.675455 systemd-logind[1608]: New session 8 of user core. Sep 4 15:44:39.685340 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 4 15:44:39.733483 sudo[1925]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 4 15:44:39.733652 sudo[1925]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 4 15:44:39.736510 sudo[1925]: pam_unix(sudo:session): session closed for user root Sep 4 15:44:39.739771 sudo[1924]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 4 15:44:39.739937 sudo[1924]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 4 15:44:39.746196 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 4 15:44:39.769682 augenrules[1947]: No rules Sep 4 15:44:39.770349 systemd[1]: audit-rules.service: Deactivated successfully. Sep 4 15:44:39.770550 systemd[1]: Finished audit-rules.service - Load Audit Rules. 
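Note: "augenrules[1947]: No rules" above reflects that /etc/audit/rules.d contained no rule files when audit-rules.service was restarted; the preceding sudo removed 80-selinux.rules and 99-default.rules. The equivalent manual checks (standard auditd tooling):

    ls /etc/audit/rules.d/   # rule fragments that augenrules concatenates
    auditctl -l              # rules currently loaded in the kernel; prints "No rules" when empty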
Sep 4 15:44:39.771312 sudo[1924]: pam_unix(sudo:session): session closed for user root Sep 4 15:44:39.772192 sshd[1923]: Connection closed by 139.178.89.65 port 48754 Sep 4 15:44:39.772447 sshd-session[1920]: pam_unix(sshd:session): session closed for user core Sep 4 15:44:39.777733 systemd[1]: sshd@5-139.178.70.109:22-139.178.89.65:48754.service: Deactivated successfully. Sep 4 15:44:39.778825 systemd[1]: session-8.scope: Deactivated successfully. Sep 4 15:44:39.779543 systemd-logind[1608]: Session 8 logged out. Waiting for processes to exit. Sep 4 15:44:39.781522 systemd[1]: Started sshd@6-139.178.70.109:22-139.178.89.65:48756.service - OpenSSH per-connection server daemon (139.178.89.65:48756). Sep 4 15:44:39.782370 systemd-logind[1608]: Removed session 8. Sep 4 15:44:39.815127 sshd[1956]: Accepted publickey for core from 139.178.89.65 port 48756 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs Sep 4 15:44:39.815903 sshd-session[1956]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:44:39.818486 systemd-logind[1608]: New session 9 of user core. Sep 4 15:44:39.826336 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 4 15:44:39.875013 sudo[1960]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 4 15:44:39.876037 sudo[1960]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 4 15:44:40.346928 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 4 15:44:40.357466 (dockerd)[1978]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 4 15:44:40.917911 dockerd[1978]: time="2025-09-04T15:44:40.917870585Z" level=info msg="Starting up" Sep 4 15:44:40.918534 dockerd[1978]: time="2025-09-04T15:44:40.918520929Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 4 15:44:40.925943 dockerd[1978]: time="2025-09-04T15:44:40.925910176Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Sep 4 15:44:41.141955 dockerd[1978]: time="2025-09-04T15:44:41.141914127Z" level=info msg="Loading containers: start." Sep 4 15:44:41.192241 kernel: Initializing XFRM netlink socket Sep 4 15:44:41.501670 systemd-networkd[1518]: docker0: Link UP Sep 4 15:44:41.503559 dockerd[1978]: time="2025-09-04T15:44:41.503506442Z" level=info msg="Loading containers: done." 
Sep 4 15:44:41.514243 dockerd[1978]: time="2025-09-04T15:44:41.514166228Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 4 15:44:41.514243 dockerd[1978]: time="2025-09-04T15:44:41.514243007Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Sep 4 15:44:41.514355 dockerd[1978]: time="2025-09-04T15:44:41.514310402Z" level=info msg="Initializing buildkit" Sep 4 15:44:41.524162 dockerd[1978]: time="2025-09-04T15:44:41.524133031Z" level=info msg="Completed buildkit initialization" Sep 4 15:44:41.528447 dockerd[1978]: time="2025-09-04T15:44:41.528408419Z" level=info msg="Daemon has completed initialization" Sep 4 15:44:41.528653 dockerd[1978]: time="2025-09-04T15:44:41.528489382Z" level=info msg="API listen on /run/docker.sock" Sep 4 15:44:41.528794 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 4 15:44:43.193597 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Sep 4 15:44:43.195322 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 15:44:43.919116 containerd[1628]: time="2025-09-04T15:44:43.919057326Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.4\"" Sep 4 15:44:44.244190 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 15:44:44.249541 (kubelet)[2196]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 4 15:44:44.332548 kubelet[2196]: E0904 15:44:44.332512 2196 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 4 15:44:44.334132 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 4 15:44:44.334283 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 4 15:44:44.334671 systemd[1]: kubelet.service: Consumed 115ms CPU time, 110M memory peak. Sep 4 15:44:44.981243 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount415127600.mount: Deactivated successfully. 
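Note: dockerd has completed initialization and is answering on /run/docker.sock with the overlay2 storage driver mentioned in the warning above. A quick sketch for verifying the daemon from the host (standard docker CLI and API ping endpoint; the format selectors are illustrative):

    docker info --format '{{.ServerVersion}} {{.Driver}}'                # expected to print something like "28.0.4 overlay2"
    curl --silent --unix-socket /run/docker.sock http://localhost/_ping  # the API answers "OK" when healthy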
Sep 4 15:44:46.635237 containerd[1628]: time="2025-09-04T15:44:46.634851076Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:44:46.644515 containerd[1628]: time="2025-09-04T15:44:46.644490517Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.4: active requests=0, bytes read=30078664" Sep 4 15:44:46.650559 containerd[1628]: time="2025-09-04T15:44:46.650540838Z" level=info msg="ImageCreate event name:\"sha256:1f41885d0a91155d5a5e670b2862eed338c7f12b0e8a5bbc88b1ab4a2d505ae8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:44:46.656233 containerd[1628]: time="2025-09-04T15:44:46.656039912Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:0d441d0d347145b3f02f20cb313239cdae86067643d7f70803fab8bac2d28876\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:44:46.656446 containerd[1628]: time="2025-09-04T15:44:46.656434233Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.4\" with image id \"sha256:1f41885d0a91155d5a5e670b2862eed338c7f12b0e8a5bbc88b1ab4a2d505ae8\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.4\", repo digest \"registry.k8s.io/kube-apiserver@sha256:0d441d0d347145b3f02f20cb313239cdae86067643d7f70803fab8bac2d28876\", size \"30075464\" in 2.737352145s" Sep 4 15:44:46.656497 containerd[1628]: time="2025-09-04T15:44:46.656489049Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.4\" returns image reference \"sha256:1f41885d0a91155d5a5e670b2862eed338c7f12b0e8a5bbc88b1ab4a2d505ae8\"" Sep 4 15:44:46.656912 containerd[1628]: time="2025-09-04T15:44:46.656900925Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.4\"" Sep 4 15:44:48.482233 containerd[1628]: time="2025-09-04T15:44:48.481789108Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:44:48.488817 containerd[1628]: time="2025-09-04T15:44:48.488799071Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.4: active requests=0, bytes read=26018066" Sep 4 15:44:48.494399 containerd[1628]: time="2025-09-04T15:44:48.494379739Z" level=info msg="ImageCreate event name:\"sha256:358ab71c1a1ea4846ad0b3dff0d9db6b124236b64bc8a6b79dc874f65dc0d492\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:44:48.500269 containerd[1628]: time="2025-09-04T15:44:48.500251745Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:bd22c2af2f30a8f818568b4d5fe131098fdd38267e9e07872cfc33e8f5876bc3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:44:48.500939 containerd[1628]: time="2025-09-04T15:44:48.500852987Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.4\" with image id \"sha256:358ab71c1a1ea4846ad0b3dff0d9db6b124236b64bc8a6b79dc874f65dc0d492\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.4\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:bd22c2af2f30a8f818568b4d5fe131098fdd38267e9e07872cfc33e8f5876bc3\", size \"27646961\" in 1.843900001s" Sep 4 15:44:48.500939 containerd[1628]: time="2025-09-04T15:44:48.500874414Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.4\" returns image reference \"sha256:358ab71c1a1ea4846ad0b3dff0d9db6b124236b64bc8a6b79dc874f65dc0d492\"" Sep 4 15:44:48.501528 containerd[1628]: 
time="2025-09-04T15:44:48.501484008Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.4\"" Sep 4 15:44:49.962586 containerd[1628]: time="2025-09-04T15:44:49.962547411Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:44:49.967422 containerd[1628]: time="2025-09-04T15:44:49.967391301Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.4: active requests=0, bytes read=20153911" Sep 4 15:44:49.975937 containerd[1628]: time="2025-09-04T15:44:49.975905990Z" level=info msg="ImageCreate event name:\"sha256:ab4ad8a84c3c69c18494ef32fa087b32f7c44d71e6acba463d2c7dda798c3d66\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:44:49.981324 containerd[1628]: time="2025-09-04T15:44:49.981292569Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:71533e5a960e2955a54164905e92dac516ec874a23e0bf31304db82650101a4a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:44:49.982021 containerd[1628]: time="2025-09-04T15:44:49.981929111Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.4\" with image id \"sha256:ab4ad8a84c3c69c18494ef32fa087b32f7c44d71e6acba463d2c7dda798c3d66\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.4\", repo digest \"registry.k8s.io/kube-scheduler@sha256:71533e5a960e2955a54164905e92dac516ec874a23e0bf31304db82650101a4a\", size \"21782824\" in 1.480345297s" Sep 4 15:44:49.982021 containerd[1628]: time="2025-09-04T15:44:49.981953831Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.4\" returns image reference \"sha256:ab4ad8a84c3c69c18494ef32fa087b32f7c44d71e6acba463d2c7dda798c3d66\"" Sep 4 15:44:49.982459 containerd[1628]: time="2025-09-04T15:44:49.982293849Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.4\"" Sep 4 15:44:51.236917 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1808202119.mount: Deactivated successfully. 
Sep 4 15:44:51.652756 containerd[1628]: time="2025-09-04T15:44:51.652651137Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:44:51.659167 containerd[1628]: time="2025-09-04T15:44:51.659146794Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.4: active requests=0, bytes read=31899626" Sep 4 15:44:51.666768 containerd[1628]: time="2025-09-04T15:44:51.666734169Z" level=info msg="ImageCreate event name:\"sha256:1b2ea5e018dbbbd2efb8e5c540a6d3c463d77f250d3904429402ee057f09c64e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:44:51.676897 containerd[1628]: time="2025-09-04T15:44:51.676847195Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:bb04e9247da3aaeb96406b4d530a79fc865695b6807353dd1a28871df0d7f837\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:44:51.677350 containerd[1628]: time="2025-09-04T15:44:51.677242235Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.4\" with image id \"sha256:1b2ea5e018dbbbd2efb8e5c540a6d3c463d77f250d3904429402ee057f09c64e\", repo tag \"registry.k8s.io/kube-proxy:v1.33.4\", repo digest \"registry.k8s.io/kube-proxy@sha256:bb04e9247da3aaeb96406b4d530a79fc865695b6807353dd1a28871df0d7f837\", size \"31898645\" in 1.694930359s" Sep 4 15:44:51.677350 containerd[1628]: time="2025-09-04T15:44:51.677270930Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.4\" returns image reference \"sha256:1b2ea5e018dbbbd2efb8e5c540a6d3c463d77f250d3904429402ee057f09c64e\"" Sep 4 15:44:51.677990 containerd[1628]: time="2025-09-04T15:44:51.677800892Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Sep 4 15:44:52.337966 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount277324405.mount: Deactivated successfully. 
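Note: the image pulls logged above go through containerd's CRI plugin, so the references land in the k8s.io namespace rather than the default one. A hedged sketch for listing them with the tools that ship alongside containerd (socket path as printed earlier in the log; crictl assumed to be installed):

    ctr --namespace k8s.io images ls                                          # raw containerd view of the pulled references
    crictl --runtime-endpoint unix:///run/containerd/containerd.sock images   # CRI view of the same images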
Sep 4 15:44:53.088148 containerd[1628]: time="2025-09-04T15:44:53.087336680Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:44:53.088148 containerd[1628]: time="2025-09-04T15:44:53.087806834Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=20942238" Sep 4 15:44:53.088148 containerd[1628]: time="2025-09-04T15:44:53.088112066Z" level=info msg="ImageCreate event name:\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:44:53.089951 containerd[1628]: time="2025-09-04T15:44:53.089915086Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:44:53.090560 containerd[1628]: time="2025-09-04T15:44:53.090543884Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"20939036\" in 1.412723427s" Sep 4 15:44:53.090602 containerd[1628]: time="2025-09-04T15:44:53.090563330Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\"" Sep 4 15:44:53.091030 containerd[1628]: time="2025-09-04T15:44:53.090997284Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 4 15:44:53.380611 update_engine[1610]: I20250904 15:44:53.380334 1610 update_attempter.cc:509] Updating boot flags... Sep 4 15:44:53.719324 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount89007397.mount: Deactivated successfully. 
Sep 4 15:44:53.785141 containerd[1628]: time="2025-09-04T15:44:53.784731303Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 4 15:44:53.792390 containerd[1628]: time="2025-09-04T15:44:53.792361121Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Sep 4 15:44:53.801899 containerd[1628]: time="2025-09-04T15:44:53.801856528Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 4 15:44:53.808421 containerd[1628]: time="2025-09-04T15:44:53.808390312Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 4 15:44:53.808915 containerd[1628]: time="2025-09-04T15:44:53.808848646Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 717.766654ms" Sep 4 15:44:53.808915 containerd[1628]: time="2025-09-04T15:44:53.808868204Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Sep 4 15:44:53.809374 containerd[1628]: time="2025-09-04T15:44:53.809356579Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" Sep 4 15:44:54.444055 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Sep 4 15:44:54.445652 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 15:44:54.462479 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2984999266.mount: Deactivated successfully. Sep 4 15:44:55.153496 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 15:44:55.156244 (kubelet)[2366]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 4 15:44:55.484174 kubelet[2366]: E0904 15:44:55.484135 2366 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 4 15:44:55.486171 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 4 15:44:55.486392 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 4 15:44:55.486904 systemd[1]: kubelet.service: Consumed 113ms CPU time, 107.1M memory peak. 
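The kubelet exits here because /var/lib/kubelet/config.yaml does not exist yet; under the usual kubeadm bootstrap flow that file is only written when kubeadm init/join runs, so the repeated failures and systemd restart counter at this stage are expected rather than fatal. A minimal sketch of spotting this condition from a saved journal dump (the dump file name and helper are illustrative, not part of the log above):

    from pathlib import Path

    CONFIG = Path("/var/lib/kubelet/config.yaml")   # path named in the error above
    MARKER = "open /var/lib/kubelet/config.yaml: no such file or directory"

    def kubelet_waiting_for_bootstrap(journal_text: str) -> bool:
        """True when the kubelet failure seen above is just the not-yet-written config file."""
        return MARKER in journal_text and not CONFIG.exists()

    # Usage (journal.txt is an assumed dump of the log shown here):
    # print(kubelet_waiting_for_bootstrap(Path("journal.txt").read_text()))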
Sep 4 15:44:57.746634 containerd[1628]: time="2025-09-04T15:44:57.746518104Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:44:57.747165 containerd[1628]: time="2025-09-04T15:44:57.747026079Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=58377871" Sep 4 15:44:57.748509 containerd[1628]: time="2025-09-04T15:44:57.748489358Z" level=info msg="ImageCreate event name:\"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:44:57.750241 containerd[1628]: time="2025-09-04T15:44:57.750226315Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:44:57.751278 containerd[1628]: time="2025-09-04T15:44:57.751052926Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"58938593\" in 3.941676138s" Sep 4 15:44:57.751278 containerd[1628]: time="2025-09-04T15:44:57.751073158Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\"" Sep 4 15:45:00.131777 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 15:45:00.131892 systemd[1]: kubelet.service: Consumed 113ms CPU time, 107.1M memory peak. Sep 4 15:45:00.134487 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 15:45:00.156503 systemd[1]: Reload requested from client PID 2450 ('systemctl') (unit session-9.scope)... Sep 4 15:45:00.156515 systemd[1]: Reloading... Sep 4 15:45:00.223245 zram_generator::config[2493]: No configuration found. Sep 4 15:45:00.301555 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Sep 4 15:45:00.369895 systemd[1]: Reloading finished in 213 ms. Sep 4 15:45:00.394550 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Sep 4 15:45:00.394705 systemd[1]: kubelet.service: Failed with result 'signal'. Sep 4 15:45:00.395055 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 15:45:00.396568 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 15:45:00.770990 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 15:45:00.776425 (kubelet)[2561]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 4 15:45:00.819230 kubelet[2561]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 4 15:45:00.819230 kubelet[2561]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. 
Sep 4 15:45:00.819230 kubelet[2561]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 4 15:45:00.835530 kubelet[2561]: I0904 15:45:00.835461 2561 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 4 15:45:01.316172 kubelet[2561]: I0904 15:45:01.315445 2561 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Sep 4 15:45:01.316172 kubelet[2561]: I0904 15:45:01.315467 2561 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 4 15:45:01.316172 kubelet[2561]: I0904 15:45:01.315608 2561 server.go:956] "Client rotation is on, will bootstrap in background" Sep 4 15:45:01.350069 kubelet[2561]: I0904 15:45:01.349912 2561 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 4 15:45:01.354791 kubelet[2561]: E0904 15:45:01.354764 2561 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://139.178.70.109:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.70.109:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Sep 4 15:45:01.365849 kubelet[2561]: I0904 15:45:01.365826 2561 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 4 15:45:01.372463 kubelet[2561]: I0904 15:45:01.372267 2561 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 4 15:45:01.377424 kubelet[2561]: I0904 15:45:01.377373 2561 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 4 15:45:01.380057 kubelet[2561]: I0904 15:45:01.377541 2561 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 4 15:45:01.380389 kubelet[2561]: I0904 15:45:01.380228 2561 topology_manager.go:138] "Creating topology manager with none policy" Sep 4 15:45:01.380389 kubelet[2561]: I0904 15:45:01.380242 2561 container_manager_linux.go:303] "Creating device plugin manager" Sep 4 15:45:01.381026 kubelet[2561]: I0904 15:45:01.381017 2561 state_mem.go:36] "Initialized new in-memory state store" Sep 4 15:45:01.383791 kubelet[2561]: I0904 15:45:01.383687 2561 kubelet.go:480] "Attempting to sync node with API server" Sep 4 15:45:01.383791 kubelet[2561]: I0904 15:45:01.383711 2561 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 4 15:45:01.383791 kubelet[2561]: I0904 15:45:01.383732 2561 kubelet.go:386] "Adding apiserver pod source" Sep 4 15:45:01.383791 kubelet[2561]: I0904 15:45:01.383742 2561 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 4 15:45:01.389614 kubelet[2561]: E0904 15:45:01.389386 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://139.178.70.109:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.109:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Sep 4 15:45:01.389614 kubelet[2561]: I0904 15:45:01.389482 2561 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 4 15:45:01.391471 kubelet[2561]: I0904 15:45:01.391365 2561 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Sep 4 
15:45:01.392270 kubelet[2561]: W0904 15:45:01.392259 2561 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Sep 4 15:45:01.400671 kubelet[2561]: E0904 15:45:01.400642 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://139.178.70.109:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.109:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Sep 4 15:45:01.404275 kubelet[2561]: I0904 15:45:01.404247 2561 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 4 15:45:01.404344 kubelet[2561]: I0904 15:45:01.404308 2561 server.go:1289] "Started kubelet" Sep 4 15:45:01.404603 kubelet[2561]: I0904 15:45:01.404568 2561 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 4 15:45:01.405213 kubelet[2561]: I0904 15:45:01.405199 2561 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 4 15:45:01.405313 kubelet[2561]: I0904 15:45:01.405288 2561 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Sep 4 15:45:01.406592 kubelet[2561]: I0904 15:45:01.406502 2561 server.go:317] "Adding debug handlers to kubelet server" Sep 4 15:45:01.411126 kubelet[2561]: I0904 15:45:01.410679 2561 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 4 15:45:01.412248 kubelet[2561]: E0904 15:45:01.410316 2561 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.109:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.109:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.18621ed4355053eb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-09-04 15:45:01.404271595 +0000 UTC m=+0.625750075,LastTimestamp:2025-09-04 15:45:01.404271595 +0000 UTC m=+0.625750075,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Sep 4 15:45:01.412601 kubelet[2561]: I0904 15:45:01.412511 2561 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 4 15:45:01.417416 kubelet[2561]: E0904 15:45:01.417393 2561 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 15:45:01.417416 kubelet[2561]: I0904 15:45:01.417419 2561 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 4 15:45:01.417835 kubelet[2561]: I0904 15:45:01.417535 2561 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 4 15:45:01.417835 kubelet[2561]: I0904 15:45:01.417571 2561 reconciler.go:26] "Reconciler: start to sync state" Sep 4 15:45:01.417835 kubelet[2561]: E0904 15:45:01.417806 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://139.178.70.109:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.109:6443: connect: connection refused" logger="UnhandledError" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Sep 4 15:45:01.417996 kubelet[2561]: E0904 15:45:01.417976 2561 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.109:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.109:6443: connect: connection refused" interval="200ms" Sep 4 15:45:01.418342 kubelet[2561]: I0904 15:45:01.418329 2561 factory.go:223] Registration of the systemd container factory successfully Sep 4 15:45:01.418486 kubelet[2561]: I0904 15:45:01.418378 2561 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 4 15:45:01.419409 kubelet[2561]: E0904 15:45:01.419352 2561 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 4 15:45:01.419447 kubelet[2561]: I0904 15:45:01.419432 2561 factory.go:223] Registration of the containerd container factory successfully Sep 4 15:45:01.428068 kubelet[2561]: I0904 15:45:01.428035 2561 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Sep 4 15:45:01.429803 kubelet[2561]: I0904 15:45:01.429788 2561 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Sep 4 15:45:01.429888 kubelet[2561]: I0904 15:45:01.429883 2561 status_manager.go:230] "Starting to sync pod status with apiserver" Sep 4 15:45:01.429933 kubelet[2561]: I0904 15:45:01.429928 2561 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 4 15:45:01.430121 kubelet[2561]: I0904 15:45:01.429959 2561 kubelet.go:2436] "Starting kubelet main sync loop" Sep 4 15:45:01.430121 kubelet[2561]: E0904 15:45:01.429983 2561 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 4 15:45:01.437084 kubelet[2561]: E0904 15:45:01.437062 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://139.178.70.109:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.109:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Sep 4 15:45:01.446016 kubelet[2561]: I0904 15:45:01.445790 2561 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 4 15:45:01.446016 kubelet[2561]: I0904 15:45:01.445803 2561 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 4 15:45:01.446016 kubelet[2561]: I0904 15:45:01.445812 2561 state_mem.go:36] "Initialized new in-memory state store" Sep 4 15:45:01.446872 kubelet[2561]: I0904 15:45:01.446865 2561 policy_none.go:49] "None policy: Start" Sep 4 15:45:01.446919 kubelet[2561]: I0904 15:45:01.446914 2561 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 4 15:45:01.446953 kubelet[2561]: I0904 15:45:01.446948 2561 state_mem.go:35] "Initializing new in-memory state store" Sep 4 15:45:01.452403 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 4 15:45:01.460017 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. 
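The nodeConfig=... dump logged at kubelet start above is plain JSON embedded in the journal line, so the default hard-eviction thresholds (memory.available 100Mi plus percentage thresholds for nodefs/imagefs space and inodes) can be recovered from a captured line. A minimal sketch, assuming that line has been saved into a Python string; the helper name is illustrative:

    import json

    def eviction_thresholds(journal_line: str):
        """Pull the HardEvictionThresholds list out of the kubelet nodeConfig dump above."""
        start = journal_line.index("nodeConfig=") + len("nodeConfig=")
        # raw_decode parses the JSON object and ignores the journal text that follows it.
        cfg, _ = json.JSONDecoder().raw_decode(journal_line[start:])
        return [(t["Signal"], t["Value"]) for t in cfg["HardEvictionThresholds"]]

    # For the line above this yields memory.available < 100Mi and the
    # nodefs/imagefs availability and inode percentage thresholds.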
Sep 4 15:45:01.463259 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 4 15:45:01.482077 kubelet[2561]: E0904 15:45:01.482042 2561 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Sep 4 15:45:01.482601 kubelet[2561]: I0904 15:45:01.482490 2561 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 4 15:45:01.482601 kubelet[2561]: I0904 15:45:01.482503 2561 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 4 15:45:01.483071 kubelet[2561]: I0904 15:45:01.482750 2561 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 4 15:45:01.483793 kubelet[2561]: E0904 15:45:01.483784 2561 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 4 15:45:01.483901 kubelet[2561]: E0904 15:45:01.483875 2561 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Sep 4 15:45:01.545757 systemd[1]: Created slice kubepods-burstable-pod92f21450b3c3468ef38a61f981bbed1a.slice - libcontainer container kubepods-burstable-pod92f21450b3c3468ef38a61f981bbed1a.slice. Sep 4 15:45:01.554076 kubelet[2561]: E0904 15:45:01.554057 2561 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 4 15:45:01.556822 systemd[1]: Created slice kubepods-burstable-pod8de7187202bee21b84740a213836f615.slice - libcontainer container kubepods-burstable-pod8de7187202bee21b84740a213836f615.slice. Sep 4 15:45:01.558464 kubelet[2561]: E0904 15:45:01.558447 2561 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 4 15:45:01.560414 systemd[1]: Created slice kubepods-burstable-podd75e6f6978d9f275ea19380916c9cccd.slice - libcontainer container kubepods-burstable-podd75e6f6978d9f275ea19380916c9cccd.slice. 
Sep 4 15:45:01.562028 kubelet[2561]: E0904 15:45:01.562017 2561 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 4 15:45:01.584098 kubelet[2561]: I0904 15:45:01.584036 2561 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 4 15:45:01.585131 kubelet[2561]: E0904 15:45:01.585117 2561 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.109:6443/api/v1/nodes\": dial tcp 139.178.70.109:6443: connect: connection refused" node="localhost" Sep 4 15:45:01.618567 kubelet[2561]: I0904 15:45:01.618544 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/92f21450b3c3468ef38a61f981bbed1a-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"92f21450b3c3468ef38a61f981bbed1a\") " pod="kube-system/kube-apiserver-localhost" Sep 4 15:45:01.618883 kubelet[2561]: I0904 15:45:01.618854 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/92f21450b3c3468ef38a61f981bbed1a-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"92f21450b3c3468ef38a61f981bbed1a\") " pod="kube-system/kube-apiserver-localhost" Sep 4 15:45:01.618883 kubelet[2561]: E0904 15:45:01.618829 2561 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.109:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.109:6443: connect: connection refused" interval="400ms" Sep 4 15:45:01.618883 kubelet[2561]: I0904 15:45:01.618869 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Sep 4 15:45:01.619015 kubelet[2561]: I0904 15:45:01.618892 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Sep 4 15:45:01.619015 kubelet[2561]: I0904 15:45:01.618917 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Sep 4 15:45:01.619015 kubelet[2561]: I0904 15:45:01.618931 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Sep 4 15:45:01.619015 kubelet[2561]: I0904 15:45:01.618941 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" 
(UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Sep 4 15:45:01.619015 kubelet[2561]: I0904 15:45:01.618958 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d75e6f6978d9f275ea19380916c9cccd-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"d75e6f6978d9f275ea19380916c9cccd\") " pod="kube-system/kube-scheduler-localhost" Sep 4 15:45:01.619095 kubelet[2561]: I0904 15:45:01.618973 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/92f21450b3c3468ef38a61f981bbed1a-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"92f21450b3c3468ef38a61f981bbed1a\") " pod="kube-system/kube-apiserver-localhost" Sep 4 15:45:01.786401 kubelet[2561]: I0904 15:45:01.786371 2561 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 4 15:45:01.786662 kubelet[2561]: E0904 15:45:01.786642 2561 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.109:6443/api/v1/nodes\": dial tcp 139.178.70.109:6443: connect: connection refused" node="localhost" Sep 4 15:45:01.856592 containerd[1628]: time="2025-09-04T15:45:01.856512769Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:92f21450b3c3468ef38a61f981bbed1a,Namespace:kube-system,Attempt:0,}" Sep 4 15:45:01.865239 containerd[1628]: time="2025-09-04T15:45:01.865113436Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:8de7187202bee21b84740a213836f615,Namespace:kube-system,Attempt:0,}" Sep 4 15:45:01.865239 containerd[1628]: time="2025-09-04T15:45:01.865113339Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:d75e6f6978d9f275ea19380916c9cccd,Namespace:kube-system,Attempt:0,}" Sep 4 15:45:01.944933 containerd[1628]: time="2025-09-04T15:45:01.943662619Z" level=info msg="connecting to shim fff0d6d519fd0695e4b9fe9348de61c5d865fe4750b344277a5f5ddbe37358ad" address="unix:///run/containerd/s/633a8cdeec74e4e0de35244608b6f8c335d5250278de9e721b8799b9215fe397" namespace=k8s.io protocol=ttrpc version=3 Sep 4 15:45:01.946783 containerd[1628]: time="2025-09-04T15:45:01.946756909Z" level=info msg="connecting to shim b8ab93b770ade83cc6a6616f1238788c7a1c60891a509d3487b78dc9b6c86d5a" address="unix:///run/containerd/s/9641fce9e5690f9b5dcd0eca290e5f2030643c6cea69162f9d3f4f22af66bad1" namespace=k8s.io protocol=ttrpc version=3 Sep 4 15:45:01.949193 containerd[1628]: time="2025-09-04T15:45:01.949170405Z" level=info msg="connecting to shim ffc9b4a09f79bd8139668b49b03019e311e332c8614e09cf8a5783480754c61f" address="unix:///run/containerd/s/4ccb3459e5090c0ac6ab0c5bef31fc22439b346e5acbfb7f2ca1abb839098bc7" namespace=k8s.io protocol=ttrpc version=3 Sep 4 15:45:02.019999 kubelet[2561]: E0904 15:45:02.019971 2561 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.109:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.109:6443: connect: connection refused" interval="800ms" Sep 4 15:45:02.188292 kubelet[2561]: I0904 15:45:02.188209 2561 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 4 15:45:02.188595 
kubelet[2561]: E0904 15:45:02.188478 2561 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.109:6443/api/v1/nodes\": dial tcp 139.178.70.109:6443: connect: connection refused" node="localhost" Sep 4 15:45:02.246424 systemd[1]: Started cri-containerd-b8ab93b770ade83cc6a6616f1238788c7a1c60891a509d3487b78dc9b6c86d5a.scope - libcontainer container b8ab93b770ade83cc6a6616f1238788c7a1c60891a509d3487b78dc9b6c86d5a. Sep 4 15:45:02.248257 systemd[1]: Started cri-containerd-ffc9b4a09f79bd8139668b49b03019e311e332c8614e09cf8a5783480754c61f.scope - libcontainer container ffc9b4a09f79bd8139668b49b03019e311e332c8614e09cf8a5783480754c61f. Sep 4 15:45:02.250480 systemd[1]: Started cri-containerd-fff0d6d519fd0695e4b9fe9348de61c5d865fe4750b344277a5f5ddbe37358ad.scope - libcontainer container fff0d6d519fd0695e4b9fe9348de61c5d865fe4750b344277a5f5ddbe37358ad. Sep 4 15:45:02.389598 kubelet[2561]: E0904 15:45:02.389563 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://139.178.70.109:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.109:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Sep 4 15:45:02.396423 containerd[1628]: time="2025-09-04T15:45:02.396365534Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:d75e6f6978d9f275ea19380916c9cccd,Namespace:kube-system,Attempt:0,} returns sandbox id \"b8ab93b770ade83cc6a6616f1238788c7a1c60891a509d3487b78dc9b6c86d5a\"" Sep 4 15:45:02.413663 containerd[1628]: time="2025-09-04T15:45:02.413629491Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:92f21450b3c3468ef38a61f981bbed1a,Namespace:kube-system,Attempt:0,} returns sandbox id \"ffc9b4a09f79bd8139668b49b03019e311e332c8614e09cf8a5783480754c61f\"" Sep 4 15:45:02.420734 containerd[1628]: time="2025-09-04T15:45:02.420613579Z" level=info msg="CreateContainer within sandbox \"b8ab93b770ade83cc6a6616f1238788c7a1c60891a509d3487b78dc9b6c86d5a\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 4 15:45:02.447348 containerd[1628]: time="2025-09-04T15:45:02.447321494Z" level=info msg="CreateContainer within sandbox \"ffc9b4a09f79bd8139668b49b03019e311e332c8614e09cf8a5783480754c61f\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 4 15:45:02.449692 containerd[1628]: time="2025-09-04T15:45:02.449636397Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:8de7187202bee21b84740a213836f615,Namespace:kube-system,Attempt:0,} returns sandbox id \"fff0d6d519fd0695e4b9fe9348de61c5d865fe4750b344277a5f5ddbe37358ad\"" Sep 4 15:45:02.482993 containerd[1628]: time="2025-09-04T15:45:02.482955494Z" level=info msg="CreateContainer within sandbox \"fff0d6d519fd0695e4b9fe9348de61c5d865fe4750b344277a5f5ddbe37358ad\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 4 15:45:02.593618 containerd[1628]: time="2025-09-04T15:45:02.593578616Z" level=info msg="Container 384b8f57c621998a9224b07d37ff41a7af20d7ca190c8a10fc824ae9dbce00d7: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:45:02.617144 containerd[1628]: time="2025-09-04T15:45:02.617121925Z" level=info msg="Container 6f163710296b490a31b5f5ca4127e13eb2ee614adb0667a3ac7708b1674c42b5: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:45:02.676639 containerd[1628]: 
time="2025-09-04T15:45:02.676578381Z" level=info msg="Container 741187828c1876584c7db9bfe52cdfde6ce24d8458c51bc39eccb034a86694a9: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:45:02.724205 containerd[1628]: time="2025-09-04T15:45:02.724128257Z" level=info msg="CreateContainer within sandbox \"ffc9b4a09f79bd8139668b49b03019e311e332c8614e09cf8a5783480754c61f\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"6f163710296b490a31b5f5ca4127e13eb2ee614adb0667a3ac7708b1674c42b5\"" Sep 4 15:45:02.724922 containerd[1628]: time="2025-09-04T15:45:02.724905576Z" level=info msg="StartContainer for \"6f163710296b490a31b5f5ca4127e13eb2ee614adb0667a3ac7708b1674c42b5\"" Sep 4 15:45:02.725664 containerd[1628]: time="2025-09-04T15:45:02.725648085Z" level=info msg="connecting to shim 6f163710296b490a31b5f5ca4127e13eb2ee614adb0667a3ac7708b1674c42b5" address="unix:///run/containerd/s/4ccb3459e5090c0ac6ab0c5bef31fc22439b346e5acbfb7f2ca1abb839098bc7" protocol=ttrpc version=3 Sep 4 15:45:02.727794 containerd[1628]: time="2025-09-04T15:45:02.727715385Z" level=info msg="CreateContainer within sandbox \"b8ab93b770ade83cc6a6616f1238788c7a1c60891a509d3487b78dc9b6c86d5a\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"384b8f57c621998a9224b07d37ff41a7af20d7ca190c8a10fc824ae9dbce00d7\"" Sep 4 15:45:02.728187 containerd[1628]: time="2025-09-04T15:45:02.728177552Z" level=info msg="StartContainer for \"384b8f57c621998a9224b07d37ff41a7af20d7ca190c8a10fc824ae9dbce00d7\"" Sep 4 15:45:02.728978 containerd[1628]: time="2025-09-04T15:45:02.728755425Z" level=info msg="connecting to shim 384b8f57c621998a9224b07d37ff41a7af20d7ca190c8a10fc824ae9dbce00d7" address="unix:///run/containerd/s/9641fce9e5690f9b5dcd0eca290e5f2030643c6cea69162f9d3f4f22af66bad1" protocol=ttrpc version=3 Sep 4 15:45:02.730921 containerd[1628]: time="2025-09-04T15:45:02.730903037Z" level=info msg="CreateContainer within sandbox \"fff0d6d519fd0695e4b9fe9348de61c5d865fe4750b344277a5f5ddbe37358ad\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"741187828c1876584c7db9bfe52cdfde6ce24d8458c51bc39eccb034a86694a9\"" Sep 4 15:45:02.732539 containerd[1628]: time="2025-09-04T15:45:02.732524623Z" level=info msg="StartContainer for \"741187828c1876584c7db9bfe52cdfde6ce24d8458c51bc39eccb034a86694a9\"" Sep 4 15:45:02.733150 containerd[1628]: time="2025-09-04T15:45:02.733135506Z" level=info msg="connecting to shim 741187828c1876584c7db9bfe52cdfde6ce24d8458c51bc39eccb034a86694a9" address="unix:///run/containerd/s/633a8cdeec74e4e0de35244608b6f8c335d5250278de9e721b8799b9215fe397" protocol=ttrpc version=3 Sep 4 15:45:02.747379 systemd[1]: Started cri-containerd-6f163710296b490a31b5f5ca4127e13eb2ee614adb0667a3ac7708b1674c42b5.scope - libcontainer container 6f163710296b490a31b5f5ca4127e13eb2ee614adb0667a3ac7708b1674c42b5. Sep 4 15:45:02.752328 systemd[1]: Started cri-containerd-384b8f57c621998a9224b07d37ff41a7af20d7ca190c8a10fc824ae9dbce00d7.scope - libcontainer container 384b8f57c621998a9224b07d37ff41a7af20d7ca190c8a10fc824ae9dbce00d7. Sep 4 15:45:02.753650 systemd[1]: Started cri-containerd-741187828c1876584c7db9bfe52cdfde6ce24d8458c51bc39eccb034a86694a9.scope - libcontainer container 741187828c1876584c7db9bfe52cdfde6ce24d8458c51bc39eccb034a86694a9. 
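Each control-plane static pod follows the same CRI sequence in the entries above: RunPodSandbox returns a sandbox id, containerd connects to a per-sandbox shim over a unix socket, CreateContainer within that sandbox returns a container id, and StartContainer runs it inside a cri-containerd-<id>.scope unit. A minimal sketch of pairing those events back up from a saved journal dump; the regexes are illustrative and keyed to the escaped quoting of the msg fields as they appear above:

    import re

    # "... &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"<64 hex>\""
    CREATED = re.compile(r'Name:(kube-[a-z-]+),Attempt:0,\} returns container id \W*([0-9a-f]{64})')
    # "StartContainer for \"<64 hex>\" returns successfully"
    STARTED = re.compile(r'StartContainer for \W*([0-9a-f]{64})\W* returns successfully')

    def control_plane_containers(journal_text: str) -> dict:
        """Map each static-pod container name to whether its StartContainer succeeded."""
        created = {cid: name for name, cid in CREATED.findall(journal_text)}
        started = set(STARTED.findall(journal_text))
        return {name: cid in started for cid, name in created.items()}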
Sep 4 15:45:02.807995 containerd[1628]: time="2025-09-04T15:45:02.807906630Z" level=info msg="StartContainer for \"384b8f57c621998a9224b07d37ff41a7af20d7ca190c8a10fc824ae9dbce00d7\" returns successfully" Sep 4 15:45:02.815243 containerd[1628]: time="2025-09-04T15:45:02.815180084Z" level=info msg="StartContainer for \"6f163710296b490a31b5f5ca4127e13eb2ee614adb0667a3ac7708b1674c42b5\" returns successfully" Sep 4 15:45:02.821572 kubelet[2561]: E0904 15:45:02.821531 2561 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.109:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.109:6443: connect: connection refused" interval="1.6s" Sep 4 15:45:02.823729 containerd[1628]: time="2025-09-04T15:45:02.823711607Z" level=info msg="StartContainer for \"741187828c1876584c7db9bfe52cdfde6ce24d8458c51bc39eccb034a86694a9\" returns successfully" Sep 4 15:45:02.880540 kubelet[2561]: E0904 15:45:02.880516 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://139.178.70.109:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.109:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Sep 4 15:45:02.881700 kubelet[2561]: E0904 15:45:02.881683 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://139.178.70.109:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.109:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Sep 4 15:45:02.975645 kubelet[2561]: E0904 15:45:02.975571 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://139.178.70.109:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.109:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Sep 4 15:45:02.989638 kubelet[2561]: I0904 15:45:02.989608 2561 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 4 15:45:02.990514 kubelet[2561]: E0904 15:45:02.990493 2561 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.109:6443/api/v1/nodes\": dial tcp 139.178.70.109:6443: connect: connection refused" node="localhost" Sep 4 15:45:03.449479 kubelet[2561]: E0904 15:45:03.449380 2561 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 4 15:45:03.449863 kubelet[2561]: E0904 15:45:03.449747 2561 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 4 15:45:03.454104 kubelet[2561]: E0904 15:45:03.453126 2561 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 4 15:45:04.441630 kubelet[2561]: E0904 15:45:04.441608 2561 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Sep 4 15:45:04.453451 kubelet[2561]: E0904 15:45:04.453432 2561 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node 
\"localhost\" not found" node="localhost" Sep 4 15:45:04.453664 kubelet[2561]: E0904 15:45:04.453620 2561 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 4 15:45:04.592096 kubelet[2561]: I0904 15:45:04.592068 2561 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 4 15:45:04.749757 kubelet[2561]: I0904 15:45:04.749728 2561 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Sep 4 15:45:04.749757 kubelet[2561]: E0904 15:45:04.749758 2561 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found" Sep 4 15:45:04.781676 kubelet[2561]: E0904 15:45:04.781647 2561 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 15:45:04.791900 kubelet[2561]: E0904 15:45:04.791639 2561 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 4 15:45:04.882505 kubelet[2561]: E0904 15:45:04.882479 2561 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 15:45:04.983305 kubelet[2561]: E0904 15:45:04.983269 2561 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 15:45:05.084145 kubelet[2561]: E0904 15:45:05.083891 2561 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 15:45:05.184407 kubelet[2561]: E0904 15:45:05.184378 2561 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 15:45:05.284900 kubelet[2561]: E0904 15:45:05.284864 2561 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 15:45:05.385707 kubelet[2561]: E0904 15:45:05.385456 2561 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 15:45:05.454197 kubelet[2561]: E0904 15:45:05.454059 2561 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 4 15:45:05.455758 kubelet[2561]: E0904 15:45:05.455654 2561 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 4 15:45:05.485947 kubelet[2561]: E0904 15:45:05.485921 2561 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 15:45:05.618389 kubelet[2561]: I0904 15:45:05.618286 2561 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 4 15:45:05.625227 kubelet[2561]: I0904 15:45:05.625184 2561 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 4 15:45:05.629100 kubelet[2561]: I0904 15:45:05.628969 2561 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 4 15:45:06.357166 systemd[1]: Reload requested from client PID 2830 ('systemctl') (unit session-9.scope)... Sep 4 15:45:06.357413 systemd[1]: Reloading... 
Sep 4 15:45:06.398639 kubelet[2561]: I0904 15:45:06.398615 2561 apiserver.go:52] "Watching apiserver" Sep 4 15:45:06.412244 zram_generator::config[2873]: No configuration found. Sep 4 15:45:06.418039 kubelet[2561]: I0904 15:45:06.418006 2561 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 4 15:45:06.454529 kubelet[2561]: I0904 15:45:06.454386 2561 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 4 15:45:06.454854 kubelet[2561]: I0904 15:45:06.454784 2561 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 4 15:45:06.473502 kubelet[2561]: E0904 15:45:06.473368 2561 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Sep 4 15:45:06.473502 kubelet[2561]: E0904 15:45:06.473428 2561 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Sep 4 15:45:06.499294 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Sep 4 15:45:06.576638 systemd[1]: Reloading finished in 218 ms. Sep 4 15:45:06.611211 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 15:45:06.623587 systemd[1]: kubelet.service: Deactivated successfully. Sep 4 15:45:06.623880 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 15:45:06.623990 systemd[1]: kubelet.service: Consumed 778ms CPU time, 128.6M memory peak. Sep 4 15:45:06.626411 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 15:45:06.908940 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 15:45:06.916626 (kubelet)[2941]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 4 15:45:07.036562 kubelet[2941]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 4 15:45:07.036562 kubelet[2941]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 4 15:45:07.036562 kubelet[2941]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 4 15:45:07.036562 kubelet[2941]: I0904 15:45:07.036291 2941 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 4 15:45:07.040032 kubelet[2941]: I0904 15:45:07.040019 2941 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Sep 4 15:45:07.040115 kubelet[2941]: I0904 15:45:07.040109 2941 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 4 15:45:07.040262 kubelet[2941]: I0904 15:45:07.040255 2941 server.go:956] "Client rotation is on, will bootstrap in background" Sep 4 15:45:07.057331 kubelet[2941]: I0904 15:45:07.057307 2941 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Sep 4 15:45:07.070516 kubelet[2941]: I0904 15:45:07.070492 2941 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 4 15:45:07.078871 kubelet[2941]: I0904 15:45:07.078839 2941 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 4 15:45:07.080604 kubelet[2941]: I0904 15:45:07.080574 2941 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Sep 4 15:45:07.086832 kubelet[2941]: I0904 15:45:07.086794 2941 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 4 15:45:07.086926 kubelet[2941]: I0904 15:45:07.086824 2941 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 4 15:45:07.086926 kubelet[2941]: I0904 15:45:07.086923 2941 topology_manager.go:138] "Creating topology manager with none policy" Sep 4 15:45:07.087045 kubelet[2941]: I0904 15:45:07.086930 2941 container_manager_linux.go:303] "Creating device plugin manager" Sep 4 15:45:07.087045 kubelet[2941]: I0904 15:45:07.086974 2941 state_mem.go:36] "Initialized new in-memory state store" Sep 4 15:45:07.087112 kubelet[2941]: I0904 15:45:07.087104 2941 
kubelet.go:480] "Attempting to sync node with API server" Sep 4 15:45:07.087142 kubelet[2941]: I0904 15:45:07.087113 2941 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 4 15:45:07.087142 kubelet[2941]: I0904 15:45:07.087127 2941 kubelet.go:386] "Adding apiserver pod source" Sep 4 15:45:07.093269 kubelet[2941]: I0904 15:45:07.092968 2941 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 4 15:45:07.114670 kubelet[2941]: I0904 15:45:07.114646 2941 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 4 15:45:07.121842 kubelet[2941]: I0904 15:45:07.121817 2941 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Sep 4 15:45:07.123928 kubelet[2941]: I0904 15:45:07.123908 2941 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 4 15:45:07.124253 kubelet[2941]: I0904 15:45:07.123938 2941 server.go:1289] "Started kubelet" Sep 4 15:45:07.124253 kubelet[2941]: I0904 15:45:07.124123 2941 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 4 15:45:07.124253 kubelet[2941]: I0904 15:45:07.124095 2941 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Sep 4 15:45:07.124344 kubelet[2941]: I0904 15:45:07.124284 2941 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 4 15:45:07.126936 kubelet[2941]: I0904 15:45:07.125466 2941 server.go:317] "Adding debug handlers to kubelet server" Sep 4 15:45:07.126936 kubelet[2941]: I0904 15:45:07.126512 2941 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 4 15:45:07.127055 kubelet[2941]: I0904 15:45:07.127025 2941 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 4 15:45:07.136020 kubelet[2941]: I0904 15:45:07.135998 2941 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 4 15:45:07.136101 kubelet[2941]: I0904 15:45:07.136068 2941 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 4 15:45:07.136322 kubelet[2941]: I0904 15:45:07.136167 2941 reconciler.go:26] "Reconciler: start to sync state" Sep 4 15:45:07.137569 kubelet[2941]: I0904 15:45:07.137549 2941 factory.go:223] Registration of the systemd container factory successfully Sep 4 15:45:07.137727 kubelet[2941]: I0904 15:45:07.137706 2941 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 4 15:45:07.139189 kubelet[2941]: E0904 15:45:07.138835 2941 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 4 15:45:07.139189 kubelet[2941]: I0904 15:45:07.139076 2941 factory.go:223] Registration of the containerd container factory successfully Sep 4 15:45:07.144273 kubelet[2941]: I0904 15:45:07.144247 2941 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Sep 4 15:45:07.150043 kubelet[2941]: I0904 15:45:07.150025 2941 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Sep 4 15:45:07.150137 kubelet[2941]: I0904 15:45:07.150131 2941 status_manager.go:230] "Starting to sync pod status with apiserver" Sep 4 15:45:07.150188 kubelet[2941]: I0904 15:45:07.150183 2941 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 4 15:45:07.150412 kubelet[2941]: I0904 15:45:07.150238 2941 kubelet.go:2436] "Starting kubelet main sync loop" Sep 4 15:45:07.150412 kubelet[2941]: E0904 15:45:07.150266 2941 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 4 15:45:07.181310 kubelet[2941]: I0904 15:45:07.181254 2941 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 4 15:45:07.181409 kubelet[2941]: I0904 15:45:07.181398 2941 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 4 15:45:07.181460 kubelet[2941]: I0904 15:45:07.181455 2941 state_mem.go:36] "Initialized new in-memory state store" Sep 4 15:45:07.181567 kubelet[2941]: I0904 15:45:07.181560 2941 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 4 15:45:07.181608 kubelet[2941]: I0904 15:45:07.181597 2941 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 4 15:45:07.181638 kubelet[2941]: I0904 15:45:07.181634 2941 policy_none.go:49] "None policy: Start" Sep 4 15:45:07.181667 kubelet[2941]: I0904 15:45:07.181663 2941 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 4 15:45:07.181701 kubelet[2941]: I0904 15:45:07.181697 2941 state_mem.go:35] "Initializing new in-memory state store" Sep 4 15:45:07.181781 kubelet[2941]: I0904 15:45:07.181775 2941 state_mem.go:75] "Updated machine memory state" Sep 4 15:45:07.184520 kubelet[2941]: E0904 15:45:07.184507 2941 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Sep 4 15:45:07.185016 kubelet[2941]: I0904 15:45:07.184991 2941 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 4 15:45:07.185016 kubelet[2941]: I0904 15:45:07.185003 2941 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 4 15:45:07.185567 kubelet[2941]: I0904 15:45:07.185554 2941 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 4 15:45:07.186352 kubelet[2941]: E0904 15:45:07.186338 2941 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Sep 4 15:45:07.250912 kubelet[2941]: I0904 15:45:07.250754 2941 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 4 15:45:07.250912 kubelet[2941]: I0904 15:45:07.250828 2941 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 4 15:45:07.250912 kubelet[2941]: I0904 15:45:07.250755 2941 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 4 15:45:07.253895 kubelet[2941]: E0904 15:45:07.253874 2941 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Sep 4 15:45:07.254183 kubelet[2941]: E0904 15:45:07.254157 2941 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost" Sep 4 15:45:07.254462 kubelet[2941]: E0904 15:45:07.254447 2941 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Sep 4 15:45:07.287376 kubelet[2941]: I0904 15:45:07.287362 2941 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 4 15:45:07.306814 kubelet[2941]: I0904 15:45:07.306757 2941 kubelet_node_status.go:124] "Node was previously registered" node="localhost" Sep 4 15:45:07.306988 kubelet[2941]: I0904 15:45:07.306923 2941 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Sep 4 15:45:07.437493 kubelet[2941]: I0904 15:45:07.437311 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Sep 4 15:45:07.437493 kubelet[2941]: I0904 15:45:07.437338 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Sep 4 15:45:07.437493 kubelet[2941]: I0904 15:45:07.437351 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/92f21450b3c3468ef38a61f981bbed1a-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"92f21450b3c3468ef38a61f981bbed1a\") " pod="kube-system/kube-apiserver-localhost" Sep 4 15:45:07.437493 kubelet[2941]: I0904 15:45:07.437363 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Sep 4 15:45:07.437493 kubelet[2941]: I0904 15:45:07.437372 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: 
\"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Sep 4 15:45:07.437633 kubelet[2941]: I0904 15:45:07.437384 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d75e6f6978d9f275ea19380916c9cccd-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"d75e6f6978d9f275ea19380916c9cccd\") " pod="kube-system/kube-scheduler-localhost" Sep 4 15:45:07.437633 kubelet[2941]: I0904 15:45:07.437402 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/92f21450b3c3468ef38a61f981bbed1a-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"92f21450b3c3468ef38a61f981bbed1a\") " pod="kube-system/kube-apiserver-localhost" Sep 4 15:45:07.437633 kubelet[2941]: I0904 15:45:07.437414 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/92f21450b3c3468ef38a61f981bbed1a-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"92f21450b3c3468ef38a61f981bbed1a\") " pod="kube-system/kube-apiserver-localhost" Sep 4 15:45:07.437633 kubelet[2941]: I0904 15:45:07.437430 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Sep 4 15:45:08.110250 kubelet[2941]: I0904 15:45:08.110169 2941 apiserver.go:52] "Watching apiserver" Sep 4 15:45:08.136347 kubelet[2941]: I0904 15:45:08.136323 2941 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 4 15:45:08.168852 kubelet[2941]: I0904 15:45:08.167682 2941 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 4 15:45:08.169254 kubelet[2941]: I0904 15:45:08.169193 2941 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 4 15:45:08.173481 kubelet[2941]: E0904 15:45:08.173449 2941 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Sep 4 15:45:08.175418 kubelet[2941]: E0904 15:45:08.175388 2941 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Sep 4 15:45:08.184006 kubelet[2941]: I0904 15:45:08.183967 2941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=3.183943192 podStartE2EDuration="3.183943192s" podCreationTimestamp="2025-09-04 15:45:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 15:45:08.183232694 +0000 UTC m=+1.212924413" watchObservedRunningTime="2025-09-04 15:45:08.183943192 +0000 UTC m=+1.213634911" Sep 4 15:45:08.189860 kubelet[2941]: I0904 15:45:08.189740 2941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=3.189716245 podStartE2EDuration="3.189716245s" podCreationTimestamp="2025-09-04 15:45:05 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 15:45:08.18954378 +0000 UTC m=+1.219235509" watchObservedRunningTime="2025-09-04 15:45:08.189716245 +0000 UTC m=+1.219407967" Sep 4 15:45:08.196141 kubelet[2941]: I0904 15:45:08.195886 2941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=3.1958745 podStartE2EDuration="3.1958745s" podCreationTimestamp="2025-09-04 15:45:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 15:45:08.195497021 +0000 UTC m=+1.225188749" watchObservedRunningTime="2025-09-04 15:45:08.1958745 +0000 UTC m=+1.225566229" Sep 4 15:45:12.821962 kubelet[2941]: I0904 15:45:12.821859 2941 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 4 15:45:12.823070 containerd[1628]: time="2025-09-04T15:45:12.822836094Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 4 15:45:12.823312 kubelet[2941]: I0904 15:45:12.822952 2941 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 4 15:45:13.876537 systemd[1]: Created slice kubepods-besteffort-pod02d8787f_5fde_4a78_b0ad_31f7e1f4f8c5.slice - libcontainer container kubepods-besteffort-pod02d8787f_5fde_4a78_b0ad_31f7e1f4f8c5.slice. Sep 4 15:45:13.880697 kubelet[2941]: I0904 15:45:13.880673 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/02d8787f-5fde-4a78-b0ad-31f7e1f4f8c5-kube-proxy\") pod \"kube-proxy-6l5xw\" (UID: \"02d8787f-5fde-4a78-b0ad-31f7e1f4f8c5\") " pod="kube-system/kube-proxy-6l5xw" Sep 4 15:45:13.881076 kubelet[2941]: I0904 15:45:13.880698 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/02d8787f-5fde-4a78-b0ad-31f7e1f4f8c5-lib-modules\") pod \"kube-proxy-6l5xw\" (UID: \"02d8787f-5fde-4a78-b0ad-31f7e1f4f8c5\") " pod="kube-system/kube-proxy-6l5xw" Sep 4 15:45:13.881076 kubelet[2941]: I0904 15:45:13.880723 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/02d8787f-5fde-4a78-b0ad-31f7e1f4f8c5-xtables-lock\") pod \"kube-proxy-6l5xw\" (UID: \"02d8787f-5fde-4a78-b0ad-31f7e1f4f8c5\") " pod="kube-system/kube-proxy-6l5xw" Sep 4 15:45:13.881076 kubelet[2941]: I0904 15:45:13.880733 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7v9s\" (UniqueName: \"kubernetes.io/projected/02d8787f-5fde-4a78-b0ad-31f7e1f4f8c5-kube-api-access-t7v9s\") pod \"kube-proxy-6l5xw\" (UID: \"02d8787f-5fde-4a78-b0ad-31f7e1f4f8c5\") " pod="kube-system/kube-proxy-6l5xw" Sep 4 15:45:14.026829 systemd[1]: Created slice kubepods-besteffort-pod2f85c1ba_e571_49e8_9fee_04be8c431e4d.slice - libcontainer container kubepods-besteffort-pod2f85c1ba_e571_49e8_9fee_04be8c431e4d.slice. 
Sep 4 15:45:14.081854 kubelet[2941]: I0904 15:45:14.081794 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/2f85c1ba-e571-49e8-9fee-04be8c431e4d-var-lib-calico\") pod \"tigera-operator-755d956888-w4k24\" (UID: \"2f85c1ba-e571-49e8-9fee-04be8c431e4d\") " pod="tigera-operator/tigera-operator-755d956888-w4k24" Sep 4 15:45:14.081854 kubelet[2941]: I0904 15:45:14.081824 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgzh2\" (UniqueName: \"kubernetes.io/projected/2f85c1ba-e571-49e8-9fee-04be8c431e4d-kube-api-access-qgzh2\") pod \"tigera-operator-755d956888-w4k24\" (UID: \"2f85c1ba-e571-49e8-9fee-04be8c431e4d\") " pod="tigera-operator/tigera-operator-755d956888-w4k24" Sep 4 15:45:14.183818 containerd[1628]: time="2025-09-04T15:45:14.183746390Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-6l5xw,Uid:02d8787f-5fde-4a78-b0ad-31f7e1f4f8c5,Namespace:kube-system,Attempt:0,}" Sep 4 15:45:14.201494 containerd[1628]: time="2025-09-04T15:45:14.200483729Z" level=info msg="connecting to shim eaff391a3d7b6f1f0cc23085dc782c89ec70e70e224ed1ceb1bf62fd4d19aec2" address="unix:///run/containerd/s/74a078146d06b397534474597cd47487f289e854cd512c1935aaf25164b5af82" namespace=k8s.io protocol=ttrpc version=3 Sep 4 15:45:14.221386 systemd[1]: Started cri-containerd-eaff391a3d7b6f1f0cc23085dc782c89ec70e70e224ed1ceb1bf62fd4d19aec2.scope - libcontainer container eaff391a3d7b6f1f0cc23085dc782c89ec70e70e224ed1ceb1bf62fd4d19aec2. Sep 4 15:45:14.237494 containerd[1628]: time="2025-09-04T15:45:14.237457333Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-6l5xw,Uid:02d8787f-5fde-4a78-b0ad-31f7e1f4f8c5,Namespace:kube-system,Attempt:0,} returns sandbox id \"eaff391a3d7b6f1f0cc23085dc782c89ec70e70e224ed1ceb1bf62fd4d19aec2\"" Sep 4 15:45:14.240741 containerd[1628]: time="2025-09-04T15:45:14.240722773Z" level=info msg="CreateContainer within sandbox \"eaff391a3d7b6f1f0cc23085dc782c89ec70e70e224ed1ceb1bf62fd4d19aec2\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 4 15:45:14.247019 containerd[1628]: time="2025-09-04T15:45:14.246960313Z" level=info msg="Container 945e705a04eb46149145c7407f8fca27d986e76e76966c2d0f72573385045217: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:45:14.252739 containerd[1628]: time="2025-09-04T15:45:14.252718148Z" level=info msg="CreateContainer within sandbox \"eaff391a3d7b6f1f0cc23085dc782c89ec70e70e224ed1ceb1bf62fd4d19aec2\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"945e705a04eb46149145c7407f8fca27d986e76e76966c2d0f72573385045217\"" Sep 4 15:45:14.253352 containerd[1628]: time="2025-09-04T15:45:14.253335124Z" level=info msg="StartContainer for \"945e705a04eb46149145c7407f8fca27d986e76e76966c2d0f72573385045217\"" Sep 4 15:45:14.255036 containerd[1628]: time="2025-09-04T15:45:14.255006743Z" level=info msg="connecting to shim 945e705a04eb46149145c7407f8fca27d986e76e76966c2d0f72573385045217" address="unix:///run/containerd/s/74a078146d06b397534474597cd47487f289e854cd512c1935aaf25164b5af82" protocol=ttrpc version=3 Sep 4 15:45:14.268314 systemd[1]: Started cri-containerd-945e705a04eb46149145c7407f8fca27d986e76e76966c2d0f72573385045217.scope - libcontainer container 945e705a04eb46149145c7407f8fca27d986e76e76966c2d0f72573385045217. 
Sep 4 15:45:14.289782 containerd[1628]: time="2025-09-04T15:45:14.289748243Z" level=info msg="StartContainer for \"945e705a04eb46149145c7407f8fca27d986e76e76966c2d0f72573385045217\" returns successfully" Sep 4 15:45:14.331669 containerd[1628]: time="2025-09-04T15:45:14.331641866Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-w4k24,Uid:2f85c1ba-e571-49e8-9fee-04be8c431e4d,Namespace:tigera-operator,Attempt:0,}" Sep 4 15:45:14.342551 containerd[1628]: time="2025-09-04T15:45:14.342518488Z" level=info msg="connecting to shim 3d9eb6c93585a4349917fd747d0f63ca4ad6fa9d648fc4129abda3613c60af6f" address="unix:///run/containerd/s/bc5186d8ca98e9a3fbb6f82b8d9ed100cba323b87ee5a9745285ba051df9e1ac" namespace=k8s.io protocol=ttrpc version=3 Sep 4 15:45:14.359301 systemd[1]: Started cri-containerd-3d9eb6c93585a4349917fd747d0f63ca4ad6fa9d648fc4129abda3613c60af6f.scope - libcontainer container 3d9eb6c93585a4349917fd747d0f63ca4ad6fa9d648fc4129abda3613c60af6f. Sep 4 15:45:14.395450 containerd[1628]: time="2025-09-04T15:45:14.395383315Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-w4k24,Uid:2f85c1ba-e571-49e8-9fee-04be8c431e4d,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"3d9eb6c93585a4349917fd747d0f63ca4ad6fa9d648fc4129abda3613c60af6f\"" Sep 4 15:45:14.396572 containerd[1628]: time="2025-09-04T15:45:14.396556614Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 4 15:45:14.990376 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount435787429.mount: Deactivated successfully. Sep 4 15:45:15.641514 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1708040542.mount: Deactivated successfully. Sep 4 15:45:16.068939 containerd[1628]: time="2025-09-04T15:45:16.068552075Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:45:16.072171 containerd[1628]: time="2025-09-04T15:45:16.072142694Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609" Sep 4 15:45:16.081240 containerd[1628]: time="2025-09-04T15:45:16.081061361Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:45:16.082038 containerd[1628]: time="2025-09-04T15:45:16.082020253Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:45:16.082484 containerd[1628]: time="2025-09-04T15:45:16.082470388Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 1.685897221s" Sep 4 15:45:16.082577 containerd[1628]: time="2025-09-04T15:45:16.082528130Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\"" Sep 4 15:45:16.085319 containerd[1628]: time="2025-09-04T15:45:16.085294080Z" level=info msg="CreateContainer within sandbox \"3d9eb6c93585a4349917fd747d0f63ca4ad6fa9d648fc4129abda3613c60af6f\" for container 
&ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 4 15:45:16.091235 containerd[1628]: time="2025-09-04T15:45:16.091197044Z" level=info msg="Container c5756decd3d9b7eae883f60f94a904127dc706ab0ae7028546f5a58db2f4ae13: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:45:16.093627 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3505843722.mount: Deactivated successfully. Sep 4 15:45:16.095910 containerd[1628]: time="2025-09-04T15:45:16.095888104Z" level=info msg="CreateContainer within sandbox \"3d9eb6c93585a4349917fd747d0f63ca4ad6fa9d648fc4129abda3613c60af6f\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"c5756decd3d9b7eae883f60f94a904127dc706ab0ae7028546f5a58db2f4ae13\"" Sep 4 15:45:16.096952 containerd[1628]: time="2025-09-04T15:45:16.096288380Z" level=info msg="StartContainer for \"c5756decd3d9b7eae883f60f94a904127dc706ab0ae7028546f5a58db2f4ae13\"" Sep 4 15:45:16.096952 containerd[1628]: time="2025-09-04T15:45:16.096762420Z" level=info msg="connecting to shim c5756decd3d9b7eae883f60f94a904127dc706ab0ae7028546f5a58db2f4ae13" address="unix:///run/containerd/s/bc5186d8ca98e9a3fbb6f82b8d9ed100cba323b87ee5a9745285ba051df9e1ac" protocol=ttrpc version=3 Sep 4 15:45:16.117380 systemd[1]: Started cri-containerd-c5756decd3d9b7eae883f60f94a904127dc706ab0ae7028546f5a58db2f4ae13.scope - libcontainer container c5756decd3d9b7eae883f60f94a904127dc706ab0ae7028546f5a58db2f4ae13. Sep 4 15:45:16.137613 containerd[1628]: time="2025-09-04T15:45:16.137281514Z" level=info msg="StartContainer for \"c5756decd3d9b7eae883f60f94a904127dc706ab0ae7028546f5a58db2f4ae13\" returns successfully" Sep 4 15:45:16.188663 kubelet[2941]: I0904 15:45:16.188623 2941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-6l5xw" podStartSLOduration=3.188610354 podStartE2EDuration="3.188610354s" podCreationTimestamp="2025-09-04 15:45:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 15:45:15.181926731 +0000 UTC m=+8.211618454" watchObservedRunningTime="2025-09-04 15:45:16.188610354 +0000 UTC m=+9.218302078" Sep 4 15:45:17.163847 kubelet[2941]: I0904 15:45:17.163792 2941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-w4k24" podStartSLOduration=1.4770863269999999 podStartE2EDuration="3.16377239s" podCreationTimestamp="2025-09-04 15:45:14 +0000 UTC" firstStartedPulling="2025-09-04 15:45:14.396357662 +0000 UTC m=+7.426049378" lastFinishedPulling="2025-09-04 15:45:16.083043725 +0000 UTC m=+9.112735441" observedRunningTime="2025-09-04 15:45:16.18952877 +0000 UTC m=+9.219220501" watchObservedRunningTime="2025-09-04 15:45:17.16377239 +0000 UTC m=+10.193464114" Sep 4 15:45:21.203623 sudo[1960]: pam_unix(sudo:session): session closed for user root Sep 4 15:45:21.206142 sshd[1959]: Connection closed by 139.178.89.65 port 48756 Sep 4 15:45:21.207037 sshd-session[1956]: pam_unix(sshd:session): session closed for user core Sep 4 15:45:21.210761 systemd[1]: sshd@6-139.178.70.109:22-139.178.89.65:48756.service: Deactivated successfully. Sep 4 15:45:21.213365 systemd[1]: session-9.scope: Deactivated successfully. Sep 4 15:45:21.213885 systemd[1]: session-9.scope: Consumed 3.296s CPU time, 157.6M memory peak. Sep 4 15:45:21.217590 systemd-logind[1608]: Session 9 logged out. Waiting for processes to exit. Sep 4 15:45:21.219832 systemd-logind[1608]: Removed session 9. 
Sep 4 15:45:23.364814 systemd[1]: Created slice kubepods-besteffort-podb9ed3e6e_370b_4816_9263_ec36d958340b.slice - libcontainer container kubepods-besteffort-podb9ed3e6e_370b_4816_9263_ec36d958340b.slice. Sep 4 15:45:23.440200 kubelet[2941]: I0904 15:45:23.440160 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/b9ed3e6e-370b-4816-9263-ec36d958340b-typha-certs\") pod \"calico-typha-6f9658bc94-jtjps\" (UID: \"b9ed3e6e-370b-4816-9263-ec36d958340b\") " pod="calico-system/calico-typha-6f9658bc94-jtjps" Sep 4 15:45:23.440642 kubelet[2941]: I0904 15:45:23.440566 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2t45\" (UniqueName: \"kubernetes.io/projected/b9ed3e6e-370b-4816-9263-ec36d958340b-kube-api-access-r2t45\") pod \"calico-typha-6f9658bc94-jtjps\" (UID: \"b9ed3e6e-370b-4816-9263-ec36d958340b\") " pod="calico-system/calico-typha-6f9658bc94-jtjps" Sep 4 15:45:23.440642 kubelet[2941]: I0904 15:45:23.440598 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9ed3e6e-370b-4816-9263-ec36d958340b-tigera-ca-bundle\") pod \"calico-typha-6f9658bc94-jtjps\" (UID: \"b9ed3e6e-370b-4816-9263-ec36d958340b\") " pod="calico-system/calico-typha-6f9658bc94-jtjps" Sep 4 15:45:23.667582 systemd[1]: Created slice kubepods-besteffort-pod8dfcb5bb_b020_40ee_9eb0_9fbeb825bff8.slice - libcontainer container kubepods-besteffort-pod8dfcb5bb_b020_40ee_9eb0_9fbeb825bff8.slice. Sep 4 15:45:23.672674 containerd[1628]: time="2025-09-04T15:45:23.672636160Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6f9658bc94-jtjps,Uid:b9ed3e6e-370b-4816-9263-ec36d958340b,Namespace:calico-system,Attempt:0,}" Sep 4 15:45:23.716487 containerd[1628]: time="2025-09-04T15:45:23.716433253Z" level=info msg="connecting to shim 41b46bafc2cc4059bbc6b1f7dd824396b4370fcaf327b70205aabf76f853deb7" address="unix:///run/containerd/s/08c3d40073a5f36f7c58707573680b9c116e043b0444850715ab81a6c90417a0" namespace=k8s.io protocol=ttrpc version=3 Sep 4 15:45:23.739393 systemd[1]: Started cri-containerd-41b46bafc2cc4059bbc6b1f7dd824396b4370fcaf327b70205aabf76f853deb7.scope - libcontainer container 41b46bafc2cc4059bbc6b1f7dd824396b4370fcaf327b70205aabf76f853deb7. 
Sep 4 15:45:23.743006 kubelet[2941]: I0904 15:45:23.742955 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/8dfcb5bb-b020-40ee-9eb0-9fbeb825bff8-xtables-lock\") pod \"calico-node-6mb89\" (UID: \"8dfcb5bb-b020-40ee-9eb0-9fbeb825bff8\") " pod="calico-system/calico-node-6mb89" Sep 4 15:45:23.743006 kubelet[2941]: I0904 15:45:23.742999 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8dfcb5bb-b020-40ee-9eb0-9fbeb825bff8-lib-modules\") pod \"calico-node-6mb89\" (UID: \"8dfcb5bb-b020-40ee-9eb0-9fbeb825bff8\") " pod="calico-system/calico-node-6mb89" Sep 4 15:45:23.743123 kubelet[2941]: I0904 15:45:23.743022 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/8dfcb5bb-b020-40ee-9eb0-9fbeb825bff8-flexvol-driver-host\") pod \"calico-node-6mb89\" (UID: \"8dfcb5bb-b020-40ee-9eb0-9fbeb825bff8\") " pod="calico-system/calico-node-6mb89" Sep 4 15:45:23.743123 kubelet[2941]: I0904 15:45:23.743035 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8dfcb5bb-b020-40ee-9eb0-9fbeb825bff8-tigera-ca-bundle\") pod \"calico-node-6mb89\" (UID: \"8dfcb5bb-b020-40ee-9eb0-9fbeb825bff8\") " pod="calico-system/calico-node-6mb89" Sep 4 15:45:23.743123 kubelet[2941]: I0904 15:45:23.743059 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/8dfcb5bb-b020-40ee-9eb0-9fbeb825bff8-var-run-calico\") pod \"calico-node-6mb89\" (UID: \"8dfcb5bb-b020-40ee-9eb0-9fbeb825bff8\") " pod="calico-system/calico-node-6mb89" Sep 4 15:45:23.743123 kubelet[2941]: I0904 15:45:23.743073 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/8dfcb5bb-b020-40ee-9eb0-9fbeb825bff8-var-lib-calico\") pod \"calico-node-6mb89\" (UID: \"8dfcb5bb-b020-40ee-9eb0-9fbeb825bff8\") " pod="calico-system/calico-node-6mb89" Sep 4 15:45:23.743123 kubelet[2941]: I0904 15:45:23.743088 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/8dfcb5bb-b020-40ee-9eb0-9fbeb825bff8-node-certs\") pod \"calico-node-6mb89\" (UID: \"8dfcb5bb-b020-40ee-9eb0-9fbeb825bff8\") " pod="calico-system/calico-node-6mb89" Sep 4 15:45:23.744013 kubelet[2941]: I0904 15:45:23.743099 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/8dfcb5bb-b020-40ee-9eb0-9fbeb825bff8-cni-log-dir\") pod \"calico-node-6mb89\" (UID: \"8dfcb5bb-b020-40ee-9eb0-9fbeb825bff8\") " pod="calico-system/calico-node-6mb89" Sep 4 15:45:23.744013 kubelet[2941]: I0904 15:45:23.743107 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/8dfcb5bb-b020-40ee-9eb0-9fbeb825bff8-policysync\") pod \"calico-node-6mb89\" (UID: \"8dfcb5bb-b020-40ee-9eb0-9fbeb825bff8\") " pod="calico-system/calico-node-6mb89" Sep 4 15:45:23.744013 kubelet[2941]: I0904 15:45:23.743118 2941 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8gq5\" (UniqueName: \"kubernetes.io/projected/8dfcb5bb-b020-40ee-9eb0-9fbeb825bff8-kube-api-access-g8gq5\") pod \"calico-node-6mb89\" (UID: \"8dfcb5bb-b020-40ee-9eb0-9fbeb825bff8\") " pod="calico-system/calico-node-6mb89" Sep 4 15:45:23.744013 kubelet[2941]: I0904 15:45:23.743145 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/8dfcb5bb-b020-40ee-9eb0-9fbeb825bff8-cni-net-dir\") pod \"calico-node-6mb89\" (UID: \"8dfcb5bb-b020-40ee-9eb0-9fbeb825bff8\") " pod="calico-system/calico-node-6mb89" Sep 4 15:45:23.744013 kubelet[2941]: I0904 15:45:23.743157 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/8dfcb5bb-b020-40ee-9eb0-9fbeb825bff8-cni-bin-dir\") pod \"calico-node-6mb89\" (UID: \"8dfcb5bb-b020-40ee-9eb0-9fbeb825bff8\") " pod="calico-system/calico-node-6mb89" Sep 4 15:45:23.796146 containerd[1628]: time="2025-09-04T15:45:23.795666342Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6f9658bc94-jtjps,Uid:b9ed3e6e-370b-4816-9263-ec36d958340b,Namespace:calico-system,Attempt:0,} returns sandbox id \"41b46bafc2cc4059bbc6b1f7dd824396b4370fcaf327b70205aabf76f853deb7\"" Sep 4 15:45:23.796707 containerd[1628]: time="2025-09-04T15:45:23.796676558Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 4 15:45:23.851231 kubelet[2941]: E0904 15:45:23.845534 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:23.851231 kubelet[2941]: W0904 15:45:23.845551 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:23.851231 kubelet[2941]: E0904 15:45:23.845583 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:23.851231 kubelet[2941]: E0904 15:45:23.845699 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:23.851231 kubelet[2941]: W0904 15:45:23.845706 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:23.851231 kubelet[2941]: E0904 15:45:23.845713 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:23.851231 kubelet[2941]: E0904 15:45:23.845831 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:23.851231 kubelet[2941]: W0904 15:45:23.845837 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:23.851231 kubelet[2941]: E0904 15:45:23.845843 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:45:23.851231 kubelet[2941]: E0904 15:45:23.845993 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:23.851549 kubelet[2941]: W0904 15:45:23.845999 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:23.851549 kubelet[2941]: E0904 15:45:23.846006 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:23.851549 kubelet[2941]: E0904 15:45:23.846448 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:23.851549 kubelet[2941]: W0904 15:45:23.846455 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:23.851549 kubelet[2941]: E0904 15:45:23.846462 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:23.851549 kubelet[2941]: E0904 15:45:23.846565 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:23.851549 kubelet[2941]: W0904 15:45:23.846571 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:23.851549 kubelet[2941]: E0904 15:45:23.846603 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:23.851549 kubelet[2941]: E0904 15:45:23.846707 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:23.851549 kubelet[2941]: W0904 15:45:23.846713 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:23.851757 kubelet[2941]: E0904 15:45:23.846720 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:23.851757 kubelet[2941]: E0904 15:45:23.846871 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:23.851757 kubelet[2941]: W0904 15:45:23.846877 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:23.851757 kubelet[2941]: E0904 15:45:23.846884 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:45:23.851757 kubelet[2941]: E0904 15:45:23.847046 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:23.851757 kubelet[2941]: W0904 15:45:23.847053 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:23.851757 kubelet[2941]: E0904 15:45:23.847061 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:23.851757 kubelet[2941]: E0904 15:45:23.847191 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:23.851757 kubelet[2941]: W0904 15:45:23.847197 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:23.851757 kubelet[2941]: E0904 15:45:23.847203 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:23.851963 kubelet[2941]: E0904 15:45:23.847365 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:23.851963 kubelet[2941]: W0904 15:45:23.847945 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:23.851963 kubelet[2941]: E0904 15:45:23.848109 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:23.851963 kubelet[2941]: E0904 15:45:23.848599 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:23.851963 kubelet[2941]: W0904 15:45:23.848611 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:23.851963 kubelet[2941]: E0904 15:45:23.848724 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:23.851963 kubelet[2941]: E0904 15:45:23.849129 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:23.851963 kubelet[2941]: W0904 15:45:23.849141 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:23.851963 kubelet[2941]: E0904 15:45:23.849149 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:45:23.851963 kubelet[2941]: E0904 15:45:23.849469 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:23.852158 kubelet[2941]: W0904 15:45:23.849476 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:23.852158 kubelet[2941]: E0904 15:45:23.849579 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:23.852158 kubelet[2941]: E0904 15:45:23.849909 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:23.852158 kubelet[2941]: W0904 15:45:23.849915 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:23.852158 kubelet[2941]: E0904 15:45:23.849923 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:23.852158 kubelet[2941]: E0904 15:45:23.850261 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:23.852158 kubelet[2941]: W0904 15:45:23.850268 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:23.852158 kubelet[2941]: E0904 15:45:23.850275 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:23.852158 kubelet[2941]: E0904 15:45:23.850577 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:23.852158 kubelet[2941]: W0904 15:45:23.850585 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:23.852525 kubelet[2941]: E0904 15:45:23.850595 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:23.852525 kubelet[2941]: E0904 15:45:23.850985 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:23.852525 kubelet[2941]: W0904 15:45:23.850992 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:23.852525 kubelet[2941]: E0904 15:45:23.850998 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:45:23.852525 kubelet[2941]: E0904 15:45:23.851256 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:23.852525 kubelet[2941]: W0904 15:45:23.851262 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:23.852525 kubelet[2941]: E0904 15:45:23.851269 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:23.852525 kubelet[2941]: E0904 15:45:23.851805 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:23.852525 kubelet[2941]: W0904 15:45:23.851813 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:23.852525 kubelet[2941]: E0904 15:45:23.851823 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:23.855309 kubelet[2941]: E0904 15:45:23.855023 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:23.855309 kubelet[2941]: W0904 15:45:23.855047 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:23.855309 kubelet[2941]: E0904 15:45:23.855059 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:23.859434 kubelet[2941]: E0904 15:45:23.859402 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:23.859434 kubelet[2941]: W0904 15:45:23.859416 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:23.859434 kubelet[2941]: E0904 15:45:23.859429 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:45:23.945523 kubelet[2941]: E0904 15:45:23.945493 2941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-48dj6" podUID="6fe34a63-17a4-4039-80b1-075eaa32bbb7" Sep 4 15:45:23.971499 containerd[1628]: time="2025-09-04T15:45:23.971469618Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-6mb89,Uid:8dfcb5bb-b020-40ee-9eb0-9fbeb825bff8,Namespace:calico-system,Attempt:0,}" Sep 4 15:45:24.038115 containerd[1628]: time="2025-09-04T15:45:24.037929419Z" level=info msg="connecting to shim cd5a22eb8970b2ef459cfd1e1ec1f8a090e43dfaa27b16ed0565c1ec5348a08f" address="unix:///run/containerd/s/fec43038c72f9d26840a5188372090ae1ff44ffe0750d581493fd4320da66e76" namespace=k8s.io protocol=ttrpc version=3 Sep 4 15:45:24.038212 kubelet[2941]: E0904 15:45:24.038010 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.038212 kubelet[2941]: W0904 15:45:24.038028 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.038212 kubelet[2941]: E0904 15:45:24.038046 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.038602 kubelet[2941]: E0904 15:45:24.038440 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.038602 kubelet[2941]: W0904 15:45:24.038449 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.038602 kubelet[2941]: E0904 15:45:24.038458 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.038854 kubelet[2941]: E0904 15:45:24.038664 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.038854 kubelet[2941]: W0904 15:45:24.038671 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.038854 kubelet[2941]: E0904 15:45:24.038678 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:45:24.039778 kubelet[2941]: E0904 15:45:24.039694 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.039778 kubelet[2941]: W0904 15:45:24.039706 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.039778 kubelet[2941]: E0904 15:45:24.039715 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.040059 kubelet[2941]: E0904 15:45:24.039927 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.040059 kubelet[2941]: W0904 15:45:24.039935 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.040059 kubelet[2941]: E0904 15:45:24.039942 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.041525 kubelet[2941]: E0904 15:45:24.041153 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.041525 kubelet[2941]: W0904 15:45:24.041164 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.041525 kubelet[2941]: E0904 15:45:24.041174 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.041525 kubelet[2941]: E0904 15:45:24.041376 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.041525 kubelet[2941]: W0904 15:45:24.041383 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.041525 kubelet[2941]: E0904 15:45:24.041391 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.042082 kubelet[2941]: E0904 15:45:24.041921 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.042082 kubelet[2941]: W0904 15:45:24.041929 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.042082 kubelet[2941]: E0904 15:45:24.041936 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:45:24.042444 kubelet[2941]: E0904 15:45:24.042376 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.042444 kubelet[2941]: W0904 15:45:24.042384 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.042444 kubelet[2941]: E0904 15:45:24.042393 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.042802 kubelet[2941]: E0904 15:45:24.042648 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.042802 kubelet[2941]: W0904 15:45:24.042658 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.042802 kubelet[2941]: E0904 15:45:24.042667 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.043304 kubelet[2941]: E0904 15:45:24.043063 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.043304 kubelet[2941]: W0904 15:45:24.043071 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.043304 kubelet[2941]: E0904 15:45:24.043081 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.043793 kubelet[2941]: E0904 15:45:24.043600 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.043793 kubelet[2941]: W0904 15:45:24.043610 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.043793 kubelet[2941]: E0904 15:45:24.043620 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.044097 kubelet[2941]: E0904 15:45:24.044009 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.044305 kubelet[2941]: W0904 15:45:24.044143 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.044305 kubelet[2941]: E0904 15:45:24.044156 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:45:24.044920 kubelet[2941]: E0904 15:45:24.044758 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.044920 kubelet[2941]: W0904 15:45:24.044771 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.044920 kubelet[2941]: E0904 15:45:24.044781 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.045297 kubelet[2941]: E0904 15:45:24.045286 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.045509 kubelet[2941]: W0904 15:45:24.045354 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.045509 kubelet[2941]: E0904 15:45:24.045366 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.049612 kubelet[2941]: E0904 15:45:24.048070 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.049612 kubelet[2941]: W0904 15:45:24.048093 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.049612 kubelet[2941]: E0904 15:45:24.048113 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.049612 kubelet[2941]: E0904 15:45:24.048603 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.049612 kubelet[2941]: W0904 15:45:24.048611 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.049612 kubelet[2941]: E0904 15:45:24.048621 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.049612 kubelet[2941]: E0904 15:45:24.048735 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.049612 kubelet[2941]: W0904 15:45:24.048740 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.049612 kubelet[2941]: E0904 15:45:24.048747 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:45:24.049612 kubelet[2941]: E0904 15:45:24.048834 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.050051 kubelet[2941]: W0904 15:45:24.048840 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.050051 kubelet[2941]: E0904 15:45:24.048846 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.050051 kubelet[2941]: E0904 15:45:24.049007 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.050051 kubelet[2941]: W0904 15:45:24.049013 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.050051 kubelet[2941]: E0904 15:45:24.049021 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.050051 kubelet[2941]: E0904 15:45:24.049404 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.050051 kubelet[2941]: W0904 15:45:24.049410 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.050051 kubelet[2941]: E0904 15:45:24.049418 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.050051 kubelet[2941]: I0904 15:45:24.049446 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6fe34a63-17a4-4039-80b1-075eaa32bbb7-kubelet-dir\") pod \"csi-node-driver-48dj6\" (UID: \"6fe34a63-17a4-4039-80b1-075eaa32bbb7\") " pod="calico-system/csi-node-driver-48dj6" Sep 4 15:45:24.050853 kubelet[2941]: E0904 15:45:24.050773 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.050853 kubelet[2941]: W0904 15:45:24.050789 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.050853 kubelet[2941]: E0904 15:45:24.050803 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:45:24.050853 kubelet[2941]: I0904 15:45:24.050826 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6fe34a63-17a4-4039-80b1-075eaa32bbb7-registration-dir\") pod \"csi-node-driver-48dj6\" (UID: \"6fe34a63-17a4-4039-80b1-075eaa32bbb7\") " pod="calico-system/csi-node-driver-48dj6" Sep 4 15:45:24.051242 kubelet[2941]: E0904 15:45:24.051013 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.051242 kubelet[2941]: W0904 15:45:24.051025 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.051242 kubelet[2941]: E0904 15:45:24.051037 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.051338 kubelet[2941]: E0904 15:45:24.051326 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.051338 kubelet[2941]: W0904 15:45:24.051335 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.051500 kubelet[2941]: E0904 15:45:24.051343 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.051599 kubelet[2941]: E0904 15:45:24.051584 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.051599 kubelet[2941]: W0904 15:45:24.051594 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.052228 kubelet[2941]: E0904 15:45:24.051603 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.052228 kubelet[2941]: I0904 15:45:24.051626 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6fe34a63-17a4-4039-80b1-075eaa32bbb7-socket-dir\") pod \"csi-node-driver-48dj6\" (UID: \"6fe34a63-17a4-4039-80b1-075eaa32bbb7\") " pod="calico-system/csi-node-driver-48dj6" Sep 4 15:45:24.052549 kubelet[2941]: E0904 15:45:24.052530 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.052549 kubelet[2941]: W0904 15:45:24.052542 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.052744 kubelet[2941]: E0904 15:45:24.052550 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:45:24.052744 kubelet[2941]: I0904 15:45:24.052695 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/6fe34a63-17a4-4039-80b1-075eaa32bbb7-varrun\") pod \"csi-node-driver-48dj6\" (UID: \"6fe34a63-17a4-4039-80b1-075eaa32bbb7\") " pod="calico-system/csi-node-driver-48dj6" Sep 4 15:45:24.052914 kubelet[2941]: E0904 15:45:24.052764 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.052914 kubelet[2941]: W0904 15:45:24.052771 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.052914 kubelet[2941]: E0904 15:45:24.052779 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.053552 kubelet[2941]: E0904 15:45:24.053461 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.053552 kubelet[2941]: W0904 15:45:24.053471 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.053552 kubelet[2941]: E0904 15:45:24.053478 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.053797 kubelet[2941]: E0904 15:45:24.053565 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.053797 kubelet[2941]: W0904 15:45:24.053573 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.053797 kubelet[2941]: E0904 15:45:24.053578 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.053797 kubelet[2941]: I0904 15:45:24.053597 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjfmv\" (UniqueName: \"kubernetes.io/projected/6fe34a63-17a4-4039-80b1-075eaa32bbb7-kube-api-access-jjfmv\") pod \"csi-node-driver-48dj6\" (UID: \"6fe34a63-17a4-4039-80b1-075eaa32bbb7\") " pod="calico-system/csi-node-driver-48dj6" Sep 4 15:45:24.054486 kubelet[2941]: E0904 15:45:24.054475 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.054716 kubelet[2941]: W0904 15:45:24.054634 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.054716 kubelet[2941]: E0904 15:45:24.054649 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:45:24.055437 kubelet[2941]: E0904 15:45:24.055403 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.055437 kubelet[2941]: W0904 15:45:24.055413 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.055437 kubelet[2941]: E0904 15:45:24.055422 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.058032 kubelet[2941]: E0904 15:45:24.057369 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.058032 kubelet[2941]: W0904 15:45:24.057479 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.058032 kubelet[2941]: E0904 15:45:24.057507 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.058032 kubelet[2941]: E0904 15:45:24.057689 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.058032 kubelet[2941]: W0904 15:45:24.057695 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.058032 kubelet[2941]: E0904 15:45:24.057701 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.058032 kubelet[2941]: E0904 15:45:24.057948 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.058032 kubelet[2941]: W0904 15:45:24.057955 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.058032 kubelet[2941]: E0904 15:45:24.057964 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.058473 kubelet[2941]: E0904 15:45:24.058437 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.058473 kubelet[2941]: W0904 15:45:24.058449 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.058473 kubelet[2941]: E0904 15:45:24.058457 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:45:24.085401 systemd[1]: Started cri-containerd-cd5a22eb8970b2ef459cfd1e1ec1f8a090e43dfaa27b16ed0565c1ec5348a08f.scope - libcontainer container cd5a22eb8970b2ef459cfd1e1ec1f8a090e43dfaa27b16ed0565c1ec5348a08f. Sep 4 15:45:24.152183 containerd[1628]: time="2025-09-04T15:45:24.152123333Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-6mb89,Uid:8dfcb5bb-b020-40ee-9eb0-9fbeb825bff8,Namespace:calico-system,Attempt:0,} returns sandbox id \"cd5a22eb8970b2ef459cfd1e1ec1f8a090e43dfaa27b16ed0565c1ec5348a08f\"" Sep 4 15:45:24.159548 kubelet[2941]: E0904 15:45:24.159530 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.159548 kubelet[2941]: W0904 15:45:24.159549 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.159831 kubelet[2941]: E0904 15:45:24.159562 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.159831 kubelet[2941]: E0904 15:45:24.159665 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.159831 kubelet[2941]: W0904 15:45:24.159670 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.159831 kubelet[2941]: E0904 15:45:24.159675 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.160006 kubelet[2941]: E0904 15:45:24.159933 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.160006 kubelet[2941]: W0904 15:45:24.159941 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.160006 kubelet[2941]: E0904 15:45:24.159949 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.160183 kubelet[2941]: E0904 15:45:24.160054 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.160183 kubelet[2941]: W0904 15:45:24.160113 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.160183 kubelet[2941]: E0904 15:45:24.160119 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:45:24.160351 kubelet[2941]: E0904 15:45:24.160257 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.160351 kubelet[2941]: W0904 15:45:24.160261 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.160351 kubelet[2941]: E0904 15:45:24.160266 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.160514 kubelet[2941]: E0904 15:45:24.160479 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.160514 kubelet[2941]: W0904 15:45:24.160485 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.160514 kubelet[2941]: E0904 15:45:24.160490 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.160699 kubelet[2941]: E0904 15:45:24.160660 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.160699 kubelet[2941]: W0904 15:45:24.160667 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.160699 kubelet[2941]: E0904 15:45:24.160672 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.168877 kubelet[2941]: E0904 15:45:24.160771 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.168877 kubelet[2941]: W0904 15:45:24.160777 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.168877 kubelet[2941]: E0904 15:45:24.160782 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.168877 kubelet[2941]: E0904 15:45:24.160861 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.168877 kubelet[2941]: W0904 15:45:24.160866 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.168877 kubelet[2941]: E0904 15:45:24.160870 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:45:24.168877 kubelet[2941]: E0904 15:45:24.160978 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.168877 kubelet[2941]: W0904 15:45:24.160982 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.168877 kubelet[2941]: E0904 15:45:24.160987 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.168877 kubelet[2941]: E0904 15:45:24.161094 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.169162 kubelet[2941]: W0904 15:45:24.161100 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.169162 kubelet[2941]: E0904 15:45:24.161105 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.169162 kubelet[2941]: E0904 15:45:24.161363 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.169162 kubelet[2941]: W0904 15:45:24.161368 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.169162 kubelet[2941]: E0904 15:45:24.161374 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.169162 kubelet[2941]: E0904 15:45:24.161669 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.169162 kubelet[2941]: W0904 15:45:24.161675 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.169162 kubelet[2941]: E0904 15:45:24.161681 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.169162 kubelet[2941]: E0904 15:45:24.162092 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.169162 kubelet[2941]: W0904 15:45:24.162097 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.169387 kubelet[2941]: E0904 15:45:24.162103 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:45:24.169387 kubelet[2941]: E0904 15:45:24.162498 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.169387 kubelet[2941]: W0904 15:45:24.162504 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.169387 kubelet[2941]: E0904 15:45:24.162510 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.169387 kubelet[2941]: E0904 15:45:24.162769 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.169387 kubelet[2941]: W0904 15:45:24.162780 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.169387 kubelet[2941]: E0904 15:45:24.162787 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.169387 kubelet[2941]: E0904 15:45:24.163051 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.169387 kubelet[2941]: W0904 15:45:24.163057 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.169387 kubelet[2941]: E0904 15:45:24.163063 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.169572 kubelet[2941]: E0904 15:45:24.163418 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.169572 kubelet[2941]: W0904 15:45:24.163424 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.169572 kubelet[2941]: E0904 15:45:24.163430 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.169572 kubelet[2941]: E0904 15:45:24.163538 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.169572 kubelet[2941]: W0904 15:45:24.163543 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.169572 kubelet[2941]: E0904 15:45:24.163548 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:45:24.169572 kubelet[2941]: E0904 15:45:24.163894 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.169572 kubelet[2941]: W0904 15:45:24.163899 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.169572 kubelet[2941]: E0904 15:45:24.163904 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.169572 kubelet[2941]: E0904 15:45:24.164242 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.169763 kubelet[2941]: W0904 15:45:24.164247 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.169763 kubelet[2941]: E0904 15:45:24.164253 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.169763 kubelet[2941]: E0904 15:45:24.164785 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.169763 kubelet[2941]: W0904 15:45:24.164791 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.169763 kubelet[2941]: E0904 15:45:24.164798 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.169763 kubelet[2941]: E0904 15:45:24.165950 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.169763 kubelet[2941]: W0904 15:45:24.165957 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.169763 kubelet[2941]: E0904 15:45:24.165965 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.169763 kubelet[2941]: E0904 15:45:24.166086 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.169763 kubelet[2941]: W0904 15:45:24.166094 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.169941 kubelet[2941]: E0904 15:45:24.166106 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:45:24.169941 kubelet[2941]: E0904 15:45:24.166341 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.169941 kubelet[2941]: W0904 15:45:24.166346 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.169941 kubelet[2941]: E0904 15:45:24.166352 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:24.171404 kubelet[2941]: E0904 15:45:24.171386 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:24.171404 kubelet[2941]: W0904 15:45:24.171401 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:24.171464 kubelet[2941]: E0904 15:45:24.171415 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:26.150843 kubelet[2941]: E0904 15:45:26.150799 2941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-48dj6" podUID="6fe34a63-17a4-4039-80b1-075eaa32bbb7" Sep 4 15:45:27.515844 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3944827887.mount: Deactivated successfully. 
Sep 4 15:45:28.151612 kubelet[2941]: E0904 15:45:28.151331 2941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-48dj6" podUID="6fe34a63-17a4-4039-80b1-075eaa32bbb7" Sep 4 15:45:30.001040 containerd[1628]: time="2025-09-04T15:45:30.001003200Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:45:30.008274 containerd[1628]: time="2025-09-04T15:45:30.008250487Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389" Sep 4 15:45:30.029229 containerd[1628]: time="2025-09-04T15:45:30.029176070Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:45:30.032166 containerd[1628]: time="2025-09-04T15:45:30.032136218Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:45:30.032982 containerd[1628]: time="2025-09-04T15:45:30.032958765Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 6.236235988s" Sep 4 15:45:30.033021 containerd[1628]: time="2025-09-04T15:45:30.032982398Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\"" Sep 4 15:45:30.033751 containerd[1628]: time="2025-09-04T15:45:30.033737493Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 4 15:45:30.047442 containerd[1628]: time="2025-09-04T15:45:30.047415099Z" level=info msg="CreateContainer within sandbox \"41b46bafc2cc4059bbc6b1f7dd824396b4370fcaf327b70205aabf76f853deb7\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 4 15:45:30.052498 containerd[1628]: time="2025-09-04T15:45:30.052443144Z" level=info msg="Container 7982b8682f21a55a7ab7d915d8028d3083e072c6203e9497cf5012bcadf5b288: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:45:30.056559 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount773644809.mount: Deactivated successfully. 
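The repeated "Error syncing pod" entries for csi-node-driver-48dj6 report NetworkReady=false with "cni plugin not initialized": the container runtime has no CNI network configuration yet because calico-node is still coming up. A quick check is whether the CNI configuration directory has been populated; a minimal sketch, assuming the conventional /etc/cni/net.d path (an assumption, not something taken from this log):

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Default CNI configuration directory in common containerd/kubelet
	// setups; adjust if the runtime is configured differently.
	confDir := "/etc/cni/net.d"

	entries, err := os.ReadDir(confDir)
	if err != nil {
		fmt.Println("cannot read", confDir+":", err)
		return
	}
	if len(entries) == 0 {
		// Matches the state behind "cni plugin not initialized": the runtime
		// has no network config to load yet.
		fmt.Println(confDir, "is empty; waiting for calico-node to install its config")
		return
	}
	for _, e := range entries {
		fmt.Println("found CNI config:", filepath.Join(confDir, e.Name()))
	}
}

Once calico-node writes its conflist into that directory, the node's NetworkReady condition turns true and these pod-sync errors stop.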
Sep 4 15:45:30.085163 containerd[1628]: time="2025-09-04T15:45:30.085131030Z" level=info msg="CreateContainer within sandbox \"41b46bafc2cc4059bbc6b1f7dd824396b4370fcaf327b70205aabf76f853deb7\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"7982b8682f21a55a7ab7d915d8028d3083e072c6203e9497cf5012bcadf5b288\"" Sep 4 15:45:30.086688 containerd[1628]: time="2025-09-04T15:45:30.085593956Z" level=info msg="StartContainer for \"7982b8682f21a55a7ab7d915d8028d3083e072c6203e9497cf5012bcadf5b288\"" Sep 4 15:45:30.086688 containerd[1628]: time="2025-09-04T15:45:30.086265690Z" level=info msg="connecting to shim 7982b8682f21a55a7ab7d915d8028d3083e072c6203e9497cf5012bcadf5b288" address="unix:///run/containerd/s/08c3d40073a5f36f7c58707573680b9c116e043b0444850715ab81a6c90417a0" protocol=ttrpc version=3 Sep 4 15:45:30.104359 systemd[1]: Started cri-containerd-7982b8682f21a55a7ab7d915d8028d3083e072c6203e9497cf5012bcadf5b288.scope - libcontainer container 7982b8682f21a55a7ab7d915d8028d3083e072c6203e9497cf5012bcadf5b288. Sep 4 15:45:30.150806 kubelet[2941]: E0904 15:45:30.150780 2941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-48dj6" podUID="6fe34a63-17a4-4039-80b1-075eaa32bbb7" Sep 4 15:45:30.227254 containerd[1628]: time="2025-09-04T15:45:30.227230389Z" level=info msg="StartContainer for \"7982b8682f21a55a7ab7d915d8028d3083e072c6203e9497cf5012bcadf5b288\" returns successfully" Sep 4 15:45:31.218806 kubelet[2941]: I0904 15:45:31.218604 2941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6f9658bc94-jtjps" podStartSLOduration=1.981580353 podStartE2EDuration="8.218595186s" podCreationTimestamp="2025-09-04 15:45:23 +0000 UTC" firstStartedPulling="2025-09-04 15:45:23.796527808 +0000 UTC m=+16.826219525" lastFinishedPulling="2025-09-04 15:45:30.033542639 +0000 UTC m=+23.063234358" observedRunningTime="2025-09-04 15:45:31.218166866 +0000 UTC m=+24.247858589" watchObservedRunningTime="2025-09-04 15:45:31.218595186 +0000 UTC m=+24.248286908" Sep 4 15:45:31.294756 kubelet[2941]: E0904 15:45:31.294695 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:31.294756 kubelet[2941]: W0904 15:45:31.294710 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:31.294756 kubelet[2941]: E0904 15:45:31.294724 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:45:31.295000 kubelet[2941]: E0904 15:45:31.294968 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:31.295000 kubelet[2941]: W0904 15:45:31.294975 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:31.295000 kubelet[2941]: E0904 15:45:31.294981 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:31.295171 kubelet[2941]: E0904 15:45:31.295140 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:31.295171 kubelet[2941]: W0904 15:45:31.295146 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:31.295171 kubelet[2941]: E0904 15:45:31.295151 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:31.295401 kubelet[2941]: E0904 15:45:31.295366 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:31.295401 kubelet[2941]: W0904 15:45:31.295372 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:31.295401 kubelet[2941]: E0904 15:45:31.295377 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:31.295531 kubelet[2941]: E0904 15:45:31.295526 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:31.295589 kubelet[2941]: W0904 15:45:31.295563 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:31.295589 kubelet[2941]: E0904 15:45:31.295571 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:31.295724 kubelet[2941]: E0904 15:45:31.295696 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:31.295724 kubelet[2941]: W0904 15:45:31.295702 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:31.295724 kubelet[2941]: E0904 15:45:31.295707 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:45:31.295874 kubelet[2941]: E0904 15:45:31.295845 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:31.295874 kubelet[2941]: W0904 15:45:31.295850 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:31.295874 kubelet[2941]: E0904 15:45:31.295856 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:31.296032 kubelet[2941]: E0904 15:45:31.296000 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:31.296032 kubelet[2941]: W0904 15:45:31.296005 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:31.296032 kubelet[2941]: E0904 15:45:31.296010 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:31.296181 kubelet[2941]: E0904 15:45:31.296155 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:31.296181 kubelet[2941]: W0904 15:45:31.296160 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:31.296181 kubelet[2941]: E0904 15:45:31.296165 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:31.300596 kubelet[2941]: E0904 15:45:31.296420 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:31.300596 kubelet[2941]: W0904 15:45:31.296426 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:31.300596 kubelet[2941]: E0904 15:45:31.296431 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:31.300596 kubelet[2941]: E0904 15:45:31.296519 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:31.300596 kubelet[2941]: W0904 15:45:31.296524 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:31.300596 kubelet[2941]: E0904 15:45:31.296529 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:45:31.300596 kubelet[2941]: E0904 15:45:31.296606 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:31.300596 kubelet[2941]: W0904 15:45:31.296611 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:31.300596 kubelet[2941]: E0904 15:45:31.296615 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:31.300596 kubelet[2941]: E0904 15:45:31.296700 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:31.300763 kubelet[2941]: W0904 15:45:31.296705 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:31.300763 kubelet[2941]: E0904 15:45:31.296710 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:31.300763 kubelet[2941]: E0904 15:45:31.296792 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:31.300763 kubelet[2941]: W0904 15:45:31.296797 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:31.300763 kubelet[2941]: E0904 15:45:31.296801 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:31.300763 kubelet[2941]: E0904 15:45:31.296879 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:31.300763 kubelet[2941]: W0904 15:45:31.296883 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:31.300763 kubelet[2941]: E0904 15:45:31.296887 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:31.313999 kubelet[2941]: E0904 15:45:31.313870 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:31.313999 kubelet[2941]: W0904 15:45:31.313882 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:31.313999 kubelet[2941]: E0904 15:45:31.313890 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:45:31.314435 kubelet[2941]: E0904 15:45:31.314352 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:31.314435 kubelet[2941]: W0904 15:45:31.314370 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:31.314435 kubelet[2941]: E0904 15:45:31.314378 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:31.314504 kubelet[2941]: E0904 15:45:31.314495 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:31.314504 kubelet[2941]: W0904 15:45:31.314502 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:31.314645 kubelet[2941]: E0904 15:45:31.314510 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:31.314645 kubelet[2941]: E0904 15:45:31.314586 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:31.314645 kubelet[2941]: W0904 15:45:31.314590 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:31.314645 kubelet[2941]: E0904 15:45:31.314596 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:31.315036 kubelet[2941]: E0904 15:45:31.314663 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:31.315036 kubelet[2941]: W0904 15:45:31.314667 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:31.315036 kubelet[2941]: E0904 15:45:31.314672 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:31.315036 kubelet[2941]: E0904 15:45:31.314782 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:31.315036 kubelet[2941]: W0904 15:45:31.314787 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:31.315036 kubelet[2941]: E0904 15:45:31.314792 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:45:31.315555 kubelet[2941]: E0904 15:45:31.315545 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:31.315625 kubelet[2941]: W0904 15:45:31.315609 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:31.315688 kubelet[2941]: E0904 15:45:31.315677 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:31.316031 kubelet[2941]: E0904 15:45:31.315982 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:31.316149 kubelet[2941]: W0904 15:45:31.316076 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:31.316149 kubelet[2941]: E0904 15:45:31.316095 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:31.316301 kubelet[2941]: E0904 15:45:31.316266 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:31.316301 kubelet[2941]: W0904 15:45:31.316273 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:31.316301 kubelet[2941]: E0904 15:45:31.316278 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:31.316467 kubelet[2941]: E0904 15:45:31.316433 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:31.316467 kubelet[2941]: W0904 15:45:31.316439 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:31.316467 kubelet[2941]: E0904 15:45:31.316445 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:31.318008 kubelet[2941]: E0904 15:45:31.316655 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:31.318008 kubelet[2941]: W0904 15:45:31.316660 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:31.318008 kubelet[2941]: E0904 15:45:31.316665 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:45:31.318008 kubelet[2941]: E0904 15:45:31.316749 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:31.318008 kubelet[2941]: W0904 15:45:31.316754 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:31.318008 kubelet[2941]: E0904 15:45:31.316759 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:31.318008 kubelet[2941]: E0904 15:45:31.316860 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:31.318008 kubelet[2941]: W0904 15:45:31.316865 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:31.318008 kubelet[2941]: E0904 15:45:31.316870 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:31.318008 kubelet[2941]: E0904 15:45:31.317030 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:31.318165 kubelet[2941]: W0904 15:45:31.317034 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:31.318165 kubelet[2941]: E0904 15:45:31.317039 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:31.318165 kubelet[2941]: E0904 15:45:31.317118 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:31.318165 kubelet[2941]: W0904 15:45:31.317123 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:31.318165 kubelet[2941]: E0904 15:45:31.317127 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:31.318165 kubelet[2941]: E0904 15:45:31.317234 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:31.318165 kubelet[2941]: W0904 15:45:31.317239 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:31.318165 kubelet[2941]: E0904 15:45:31.317244 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:45:31.318165 kubelet[2941]: E0904 15:45:31.317392 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:31.318165 kubelet[2941]: W0904 15:45:31.317396 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:31.318373 kubelet[2941]: E0904 15:45:31.317401 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:31.318373 kubelet[2941]: E0904 15:45:31.317485 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:31.318373 kubelet[2941]: W0904 15:45:31.317490 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:31.318373 kubelet[2941]: E0904 15:45:31.317494 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:32.151232 kubelet[2941]: E0904 15:45:32.151131 2941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-48dj6" podUID="6fe34a63-17a4-4039-80b1-075eaa32bbb7" Sep 4 15:45:32.213344 kubelet[2941]: I0904 15:45:32.213258 2941 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 4 15:45:32.302952 kubelet[2941]: E0904 15:45:32.302923 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:32.302952 kubelet[2941]: W0904 15:45:32.302942 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:32.302952 kubelet[2941]: E0904 15:45:32.302958 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:32.303574 kubelet[2941]: E0904 15:45:32.303094 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:32.303574 kubelet[2941]: W0904 15:45:32.303103 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:32.303574 kubelet[2941]: E0904 15:45:32.303113 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:45:32.303574 kubelet[2941]: E0904 15:45:32.303254 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:32.303574 kubelet[2941]: W0904 15:45:32.303276 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:32.303574 kubelet[2941]: E0904 15:45:32.303288 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:32.303574 kubelet[2941]: E0904 15:45:32.303427 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:32.303574 kubelet[2941]: W0904 15:45:32.303445 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:32.303574 kubelet[2941]: E0904 15:45:32.303455 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:32.303574 kubelet[2941]: E0904 15:45:32.303562 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:32.303962 kubelet[2941]: W0904 15:45:32.303568 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:32.303962 kubelet[2941]: E0904 15:45:32.303575 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:32.303962 kubelet[2941]: E0904 15:45:32.303666 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:32.303962 kubelet[2941]: W0904 15:45:32.303683 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:32.303962 kubelet[2941]: E0904 15:45:32.303689 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:32.303962 kubelet[2941]: E0904 15:45:32.303790 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:32.303962 kubelet[2941]: W0904 15:45:32.303798 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:32.303962 kubelet[2941]: E0904 15:45:32.303807 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:45:32.303962 kubelet[2941]: E0904 15:45:32.303934 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:32.303962 kubelet[2941]: W0904 15:45:32.303942 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:32.304326 kubelet[2941]: E0904 15:45:32.303951 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:32.304326 kubelet[2941]: E0904 15:45:32.304098 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:32.304326 kubelet[2941]: W0904 15:45:32.304106 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:32.304326 kubelet[2941]: E0904 15:45:32.304114 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:32.304326 kubelet[2941]: E0904 15:45:32.304234 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:32.304326 kubelet[2941]: W0904 15:45:32.304242 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:32.304326 kubelet[2941]: E0904 15:45:32.304251 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:32.304580 kubelet[2941]: E0904 15:45:32.304362 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:32.304580 kubelet[2941]: W0904 15:45:32.304370 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:32.304580 kubelet[2941]: E0904 15:45:32.304379 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:32.304580 kubelet[2941]: E0904 15:45:32.304502 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:32.304580 kubelet[2941]: W0904 15:45:32.304508 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:32.304580 kubelet[2941]: E0904 15:45:32.304514 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:45:32.304827 kubelet[2941]: E0904 15:45:32.304598 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:32.304827 kubelet[2941]: W0904 15:45:32.304603 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:32.304827 kubelet[2941]: E0904 15:45:32.304608 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:32.304827 kubelet[2941]: E0904 15:45:32.304697 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:32.304827 kubelet[2941]: W0904 15:45:32.304712 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:32.304827 kubelet[2941]: E0904 15:45:32.304720 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:32.304827 kubelet[2941]: E0904 15:45:32.304825 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:32.305054 kubelet[2941]: W0904 15:45:32.304830 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:32.305054 kubelet[2941]: E0904 15:45:32.304837 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:32.322228 kubelet[2941]: E0904 15:45:32.322139 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:32.322228 kubelet[2941]: W0904 15:45:32.322155 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:32.322228 kubelet[2941]: E0904 15:45:32.322168 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:32.322500 kubelet[2941]: E0904 15:45:32.322467 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:32.322500 kubelet[2941]: W0904 15:45:32.322474 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:32.322500 kubelet[2941]: E0904 15:45:32.322479 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:45:32.322629 kubelet[2941]: E0904 15:45:32.322615 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:32.322657 kubelet[2941]: W0904 15:45:32.322629 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:32.322657 kubelet[2941]: E0904 15:45:32.322638 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:32.322778 kubelet[2941]: E0904 15:45:32.322765 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:32.322778 kubelet[2941]: W0904 15:45:32.322777 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:32.322828 kubelet[2941]: E0904 15:45:32.322786 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:32.322905 kubelet[2941]: E0904 15:45:32.322888 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:32.322905 kubelet[2941]: W0904 15:45:32.322899 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:32.322956 kubelet[2941]: E0904 15:45:32.322907 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:32.323029 kubelet[2941]: E0904 15:45:32.323020 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:32.323029 kubelet[2941]: W0904 15:45:32.323028 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:32.323072 kubelet[2941]: E0904 15:45:32.323034 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:32.323532 kubelet[2941]: E0904 15:45:32.323521 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:32.323563 kubelet[2941]: W0904 15:45:32.323530 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:32.323563 kubelet[2941]: E0904 15:45:32.323548 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:45:32.323656 kubelet[2941]: E0904 15:45:32.323648 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:32.323683 kubelet[2941]: W0904 15:45:32.323656 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:32.323683 kubelet[2941]: E0904 15:45:32.323662 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:32.323813 kubelet[2941]: E0904 15:45:32.323804 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:32.323813 kubelet[2941]: W0904 15:45:32.323811 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:32.323851 kubelet[2941]: E0904 15:45:32.323818 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:32.323987 kubelet[2941]: E0904 15:45:32.323910 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:32.323987 kubelet[2941]: W0904 15:45:32.323920 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:32.323987 kubelet[2941]: E0904 15:45:32.323926 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:32.324062 kubelet[2941]: E0904 15:45:32.324001 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:32.324062 kubelet[2941]: W0904 15:45:32.324005 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:32.324062 kubelet[2941]: E0904 15:45:32.324010 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:32.324132 kubelet[2941]: E0904 15:45:32.324121 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:32.324132 kubelet[2941]: W0904 15:45:32.324131 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:32.324186 kubelet[2941]: E0904 15:45:32.324138 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:45:32.324332 kubelet[2941]: E0904 15:45:32.324324 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:32.324414 kubelet[2941]: W0904 15:45:32.324359 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:32.324414 kubelet[2941]: E0904 15:45:32.324368 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:32.324490 kubelet[2941]: E0904 15:45:32.324485 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:32.324571 kubelet[2941]: W0904 15:45:32.324522 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:32.324571 kubelet[2941]: E0904 15:45:32.324529 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:32.324654 kubelet[2941]: E0904 15:45:32.324648 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:32.324749 kubelet[2941]: W0904 15:45:32.324681 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:32.324749 kubelet[2941]: E0904 15:45:32.324688 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:32.324818 kubelet[2941]: E0904 15:45:32.324813 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:32.324851 kubelet[2941]: W0904 15:45:32.324845 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:32.324983 kubelet[2941]: E0904 15:45:32.324882 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:32.325074 kubelet[2941]: E0904 15:45:32.325063 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:32.325098 kubelet[2941]: W0904 15:45:32.325074 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:32.325098 kubelet[2941]: E0904 15:45:32.325082 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:45:32.325197 kubelet[2941]: E0904 15:45:32.325186 2941 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:45:32.325197 kubelet[2941]: W0904 15:45:32.325196 2941 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:45:32.325261 kubelet[2941]: E0904 15:45:32.325202 2941 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:45:34.151610 kubelet[2941]: E0904 15:45:34.151299 2941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-48dj6" podUID="6fe34a63-17a4-4039-80b1-075eaa32bbb7" Sep 4 15:45:36.151147 kubelet[2941]: E0904 15:45:36.151116 2941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-48dj6" podUID="6fe34a63-17a4-4039-80b1-075eaa32bbb7" Sep 4 15:45:37.570532 containerd[1628]: time="2025-09-04T15:45:37.570473595Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:45:37.572614 containerd[1628]: time="2025-09-04T15:45:37.572597972Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 4 15:45:37.577355 containerd[1628]: time="2025-09-04T15:45:37.577338755Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:45:37.602774 containerd[1628]: time="2025-09-04T15:45:37.582091163Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:45:37.603030 containerd[1628]: time="2025-09-04T15:45:37.582940016Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 7.549116384s" Sep 4 15:45:37.603030 containerd[1628]: time="2025-09-04T15:45:37.602963027Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 4 15:45:37.605680 containerd[1628]: time="2025-09-04T15:45:37.605544705Z" level=info msg="CreateContainer within sandbox \"cd5a22eb8970b2ef459cfd1e1ec1f8a090e43dfaa27b16ed0565c1ec5348a08f\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 4 15:45:37.610272 containerd[1628]: time="2025-09-04T15:45:37.608322522Z" level=info 
msg="Container bc7325047ea1d9801c73fbd1316ad3d5ca9f58d0244d65be9e6008a028ff9fbf: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:45:37.625589 containerd[1628]: time="2025-09-04T15:45:37.625563460Z" level=info msg="CreateContainer within sandbox \"cd5a22eb8970b2ef459cfd1e1ec1f8a090e43dfaa27b16ed0565c1ec5348a08f\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"bc7325047ea1d9801c73fbd1316ad3d5ca9f58d0244d65be9e6008a028ff9fbf\"" Sep 4 15:45:37.627172 containerd[1628]: time="2025-09-04T15:45:37.627108412Z" level=info msg="StartContainer for \"bc7325047ea1d9801c73fbd1316ad3d5ca9f58d0244d65be9e6008a028ff9fbf\"" Sep 4 15:45:37.628997 containerd[1628]: time="2025-09-04T15:45:37.628920148Z" level=info msg="connecting to shim bc7325047ea1d9801c73fbd1316ad3d5ca9f58d0244d65be9e6008a028ff9fbf" address="unix:///run/containerd/s/fec43038c72f9d26840a5188372090ae1ff44ffe0750d581493fd4320da66e76" protocol=ttrpc version=3 Sep 4 15:45:37.647334 systemd[1]: Started cri-containerd-bc7325047ea1d9801c73fbd1316ad3d5ca9f58d0244d65be9e6008a028ff9fbf.scope - libcontainer container bc7325047ea1d9801c73fbd1316ad3d5ca9f58d0244d65be9e6008a028ff9fbf. Sep 4 15:45:37.671591 containerd[1628]: time="2025-09-04T15:45:37.671563520Z" level=info msg="StartContainer for \"bc7325047ea1d9801c73fbd1316ad3d5ca9f58d0244d65be9e6008a028ff9fbf\" returns successfully" Sep 4 15:45:37.679285 systemd[1]: cri-containerd-bc7325047ea1d9801c73fbd1316ad3d5ca9f58d0244d65be9e6008a028ff9fbf.scope: Deactivated successfully. Sep 4 15:45:37.679742 systemd[1]: cri-containerd-bc7325047ea1d9801c73fbd1316ad3d5ca9f58d0244d65be9e6008a028ff9fbf.scope: Consumed 17ms CPU time, 6.1M memory peak, 4.4M written to disk. Sep 4 15:45:37.686004 containerd[1628]: time="2025-09-04T15:45:37.685982483Z" level=info msg="received exit event container_id:\"bc7325047ea1d9801c73fbd1316ad3d5ca9f58d0244d65be9e6008a028ff9fbf\" id:\"bc7325047ea1d9801c73fbd1316ad3d5ca9f58d0244d65be9e6008a028ff9fbf\" pid:3665 exited_at:{seconds:1757000737 nanos:681401365}" Sep 4 15:45:37.722786 containerd[1628]: time="2025-09-04T15:45:37.722735541Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bc7325047ea1d9801c73fbd1316ad3d5ca9f58d0244d65be9e6008a028ff9fbf\" id:\"bc7325047ea1d9801c73fbd1316ad3d5ca9f58d0244d65be9e6008a028ff9fbf\" pid:3665 exited_at:{seconds:1757000737 nanos:681401365}" Sep 4 15:45:37.745680 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-bc7325047ea1d9801c73fbd1316ad3d5ca9f58d0244d65be9e6008a028ff9fbf-rootfs.mount: Deactivated successfully. 
Sep 4 15:45:38.150769 kubelet[2941]: E0904 15:45:38.150725 2941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-48dj6" podUID="6fe34a63-17a4-4039-80b1-075eaa32bbb7" Sep 4 15:45:38.228630 containerd[1628]: time="2025-09-04T15:45:38.228493806Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 4 15:45:40.150662 kubelet[2941]: E0904 15:45:40.150413 2941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-48dj6" podUID="6fe34a63-17a4-4039-80b1-075eaa32bbb7" Sep 4 15:45:42.151147 kubelet[2941]: E0904 15:45:42.151113 2941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-48dj6" podUID="6fe34a63-17a4-4039-80b1-075eaa32bbb7" Sep 4 15:45:44.150630 kubelet[2941]: E0904 15:45:44.150563 2941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-48dj6" podUID="6fe34a63-17a4-4039-80b1-075eaa32bbb7" Sep 4 15:45:45.066678 kubelet[2941]: I0904 15:45:45.066597 2941 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 4 15:45:46.152999 kubelet[2941]: E0904 15:45:46.152964 2941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-48dj6" podUID="6fe34a63-17a4-4039-80b1-075eaa32bbb7" Sep 4 15:45:48.151150 kubelet[2941]: E0904 15:45:48.151125 2941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-48dj6" podUID="6fe34a63-17a4-4039-80b1-075eaa32bbb7" Sep 4 15:45:48.459301 containerd[1628]: time="2025-09-04T15:45:48.459172784Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:45:48.460340 containerd[1628]: time="2025-09-04T15:45:48.460252694Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 4 15:45:48.460792 containerd[1628]: time="2025-09-04T15:45:48.460771091Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:45:48.462205 containerd[1628]: time="2025-09-04T15:45:48.461934224Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:45:48.462494 containerd[1628]: 
time="2025-09-04T15:45:48.462473597Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 10.233939993s" Sep 4 15:45:48.462533 containerd[1628]: time="2025-09-04T15:45:48.462493648Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 4 15:45:48.466226 containerd[1628]: time="2025-09-04T15:45:48.466183772Z" level=info msg="CreateContainer within sandbox \"cd5a22eb8970b2ef459cfd1e1ec1f8a090e43dfaa27b16ed0565c1ec5348a08f\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 4 15:45:48.472474 containerd[1628]: time="2025-09-04T15:45:48.472453575Z" level=info msg="Container 53d85b9929796e2846689d28b4853a49be5d5cf3f4c7a08cae5d00eb26d75a07: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:45:48.480144 containerd[1628]: time="2025-09-04T15:45:48.480115766Z" level=info msg="CreateContainer within sandbox \"cd5a22eb8970b2ef459cfd1e1ec1f8a090e43dfaa27b16ed0565c1ec5348a08f\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"53d85b9929796e2846689d28b4853a49be5d5cf3f4c7a08cae5d00eb26d75a07\"" Sep 4 15:45:48.481329 containerd[1628]: time="2025-09-04T15:45:48.481280854Z" level=info msg="StartContainer for \"53d85b9929796e2846689d28b4853a49be5d5cf3f4c7a08cae5d00eb26d75a07\"" Sep 4 15:45:48.482733 containerd[1628]: time="2025-09-04T15:45:48.482670106Z" level=info msg="connecting to shim 53d85b9929796e2846689d28b4853a49be5d5cf3f4c7a08cae5d00eb26d75a07" address="unix:///run/containerd/s/fec43038c72f9d26840a5188372090ae1ff44ffe0750d581493fd4320da66e76" protocol=ttrpc version=3 Sep 4 15:45:48.501321 systemd[1]: Started cri-containerd-53d85b9929796e2846689d28b4853a49be5d5cf3f4c7a08cae5d00eb26d75a07.scope - libcontainer container 53d85b9929796e2846689d28b4853a49be5d5cf3f4c7a08cae5d00eb26d75a07. Sep 4 15:45:48.527945 containerd[1628]: time="2025-09-04T15:45:48.527916294Z" level=info msg="StartContainer for \"53d85b9929796e2846689d28b4853a49be5d5cf3f4c7a08cae5d00eb26d75a07\" returns successfully" Sep 4 15:45:50.152383 systemd[1]: cri-containerd-53d85b9929796e2846689d28b4853a49be5d5cf3f4c7a08cae5d00eb26d75a07.scope: Deactivated successfully. Sep 4 15:45:50.152850 systemd[1]: cri-containerd-53d85b9929796e2846689d28b4853a49be5d5cf3f4c7a08cae5d00eb26d75a07.scope: Consumed 305ms CPU time, 164.5M memory peak, 12K read from disk, 171.3M written to disk. 
Sep 4 15:45:50.161974 kubelet[2941]: E0904 15:45:50.161828 2941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-48dj6" podUID="6fe34a63-17a4-4039-80b1-075eaa32bbb7" Sep 4 15:45:50.216119 containerd[1628]: time="2025-09-04T15:45:50.215993317Z" level=info msg="received exit event container_id:\"53d85b9929796e2846689d28b4853a49be5d5cf3f4c7a08cae5d00eb26d75a07\" id:\"53d85b9929796e2846689d28b4853a49be5d5cf3f4c7a08cae5d00eb26d75a07\" pid:3725 exited_at:{seconds:1757000750 nanos:199745944}" Sep 4 15:45:50.216596 containerd[1628]: time="2025-09-04T15:45:50.216584460Z" level=info msg="TaskExit event in podsandbox handler container_id:\"53d85b9929796e2846689d28b4853a49be5d5cf3f4c7a08cae5d00eb26d75a07\" id:\"53d85b9929796e2846689d28b4853a49be5d5cf3f4c7a08cae5d00eb26d75a07\" pid:3725 exited_at:{seconds:1757000750 nanos:199745944}" Sep 4 15:45:50.222483 kubelet[2941]: I0904 15:45:50.222305 2941 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 4 15:45:50.242872 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-53d85b9929796e2846689d28b4853a49be5d5cf3f4c7a08cae5d00eb26d75a07-rootfs.mount: Deactivated successfully. Sep 4 15:45:50.329728 containerd[1628]: time="2025-09-04T15:45:50.329700017Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 4 15:45:50.335485 systemd[1]: Created slice kubepods-burstable-pod4021cb54_f775_42c6_8c2d_347373397b4c.slice - libcontainer container kubepods-burstable-pod4021cb54_f775_42c6_8c2d_347373397b4c.slice. Sep 4 15:45:50.345297 systemd[1]: Created slice kubepods-besteffort-poddc9999d0_33a9_4740_82f2_aed68797fc6a.slice - libcontainer container kubepods-besteffort-poddc9999d0_33a9_4740_82f2_aed68797fc6a.slice. Sep 4 15:45:50.353465 systemd[1]: Created slice kubepods-burstable-pod414a4df1_cdb6_4410_bde7_af50171f20b1.slice - libcontainer container kubepods-burstable-pod414a4df1_cdb6_4410_bde7_af50171f20b1.slice. Sep 4 15:45:50.360241 systemd[1]: Created slice kubepods-besteffort-pode7c6b0b2_eb4f_4eb1_bf74_9724b7e76c0c.slice - libcontainer container kubepods-besteffort-pode7c6b0b2_eb4f_4eb1_bf74_9724b7e76c0c.slice. Sep 4 15:45:50.364365 systemd[1]: Created slice kubepods-besteffort-podf83fd5a3_b207_4864_a1a1_a47b4a06a29b.slice - libcontainer container kubepods-besteffort-podf83fd5a3_b207_4864_a1a1_a47b4a06a29b.slice. Sep 4 15:45:50.371397 systemd[1]: Created slice kubepods-besteffort-pod019b852b_6bcc_4d05_aa9a_2ce02951a1bd.slice - libcontainer container kubepods-besteffort-pod019b852b_6bcc_4d05_aa9a_2ce02951a1bd.slice. Sep 4 15:45:50.378891 systemd[1]: Created slice kubepods-besteffort-pod10bcd7e2_385d_4877_9718_a3ecf232751c.slice - libcontainer container kubepods-besteffort-pod10bcd7e2_385d_4877_9718_a3ecf232751c.slice. 
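Editor's note: the kubepods-*.slice units created above follow kubelet's systemd cgroup naming for burstable and besteffort pods, embedding the QoS class and the pod UID with its dashes mapped to underscores. A small sketch reproducing the coredns slice from this log:

# Sketch of the slice-name pattern visible in the systemd messages above.
def pod_slice(uid: str, qos: str) -> str:
    return f"kubepods-{qos}-pod{uid.replace('-', '_')}.slice"

# Matches the slice created for coredns-674b8bbfcf-rqkgl
# (pod UID 4021cb54-f775-42c6-8c2d-347373397b4c, QoS class "burstable").
assert pod_slice("4021cb54-f775-42c6-8c2d-347373397b4c", "burstable") == (
    "kubepods-burstable-pod4021cb54_f775_42c6_8c2d_347373397b4c.slice"
)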
Sep 4 15:45:50.465753 kubelet[2941]: I0904 15:45:50.465647 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2vc4\" (UniqueName: \"kubernetes.io/projected/dc9999d0-33a9-4740-82f2-aed68797fc6a-kube-api-access-k2vc4\") pod \"whisker-6ff7c7f4f9-b67mx\" (UID: \"dc9999d0-33a9-4740-82f2-aed68797fc6a\") " pod="calico-system/whisker-6ff7c7f4f9-b67mx" Sep 4 15:45:50.465753 kubelet[2941]: I0904 15:45:50.465683 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t7vr\" (UniqueName: \"kubernetes.io/projected/f83fd5a3-b207-4864-a1a1-a47b4a06a29b-kube-api-access-8t7vr\") pod \"calico-kube-controllers-5bc74776b8-jcx9h\" (UID: \"f83fd5a3-b207-4864-a1a1-a47b4a06a29b\") " pod="calico-system/calico-kube-controllers-5bc74776b8-jcx9h" Sep 4 15:45:50.465753 kubelet[2941]: I0904 15:45:50.465698 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr7jz\" (UniqueName: \"kubernetes.io/projected/4021cb54-f775-42c6-8c2d-347373397b4c-kube-api-access-nr7jz\") pod \"coredns-674b8bbfcf-rqkgl\" (UID: \"4021cb54-f775-42c6-8c2d-347373397b4c\") " pod="kube-system/coredns-674b8bbfcf-rqkgl" Sep 4 15:45:50.465753 kubelet[2941]: I0904 15:45:50.465707 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9j92\" (UniqueName: \"kubernetes.io/projected/414a4df1-cdb6-4410-bde7-af50171f20b1-kube-api-access-k9j92\") pod \"coredns-674b8bbfcf-wm96q\" (UID: \"414a4df1-cdb6-4410-bde7-af50171f20b1\") " pod="kube-system/coredns-674b8bbfcf-wm96q" Sep 4 15:45:50.465753 kubelet[2941]: I0904 15:45:50.465717 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzmpx\" (UniqueName: \"kubernetes.io/projected/10bcd7e2-385d-4877-9718-a3ecf232751c-kube-api-access-kzmpx\") pod \"calico-apiserver-5696449574-g89ks\" (UID: \"10bcd7e2-385d-4877-9718-a3ecf232751c\") " pod="calico-apiserver/calico-apiserver-5696449574-g89ks" Sep 4 15:45:50.468663 kubelet[2941]: I0904 15:45:50.465736 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/019b852b-6bcc-4d05-aa9a-2ce02951a1bd-config\") pod \"goldmane-54d579b49d-vbl6h\" (UID: \"019b852b-6bcc-4d05-aa9a-2ce02951a1bd\") " pod="calico-system/goldmane-54d579b49d-vbl6h" Sep 4 15:45:50.468663 kubelet[2941]: I0904 15:45:50.465748 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/019b852b-6bcc-4d05-aa9a-2ce02951a1bd-goldmane-key-pair\") pod \"goldmane-54d579b49d-vbl6h\" (UID: \"019b852b-6bcc-4d05-aa9a-2ce02951a1bd\") " pod="calico-system/goldmane-54d579b49d-vbl6h" Sep 4 15:45:50.468663 kubelet[2941]: I0904 15:45:50.465758 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdltq\" (UniqueName: \"kubernetes.io/projected/019b852b-6bcc-4d05-aa9a-2ce02951a1bd-kube-api-access-fdltq\") pod \"goldmane-54d579b49d-vbl6h\" (UID: \"019b852b-6bcc-4d05-aa9a-2ce02951a1bd\") " pod="calico-system/goldmane-54d579b49d-vbl6h" Sep 4 15:45:50.468663 kubelet[2941]: I0904 15:45:50.465767 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/4021cb54-f775-42c6-8c2d-347373397b4c-config-volume\") pod \"coredns-674b8bbfcf-rqkgl\" (UID: \"4021cb54-f775-42c6-8c2d-347373397b4c\") " pod="kube-system/coredns-674b8bbfcf-rqkgl" Sep 4 15:45:50.468663 kubelet[2941]: I0904 15:45:50.465778 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/e7c6b0b2-eb4f-4eb1-bf74-9724b7e76c0c-calico-apiserver-certs\") pod \"calico-apiserver-5696449574-m4mxw\" (UID: \"e7c6b0b2-eb4f-4eb1-bf74-9724b7e76c0c\") " pod="calico-apiserver/calico-apiserver-5696449574-m4mxw" Sep 4 15:45:50.470395 kubelet[2941]: I0904 15:45:50.465787 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/414a4df1-cdb6-4410-bde7-af50171f20b1-config-volume\") pod \"coredns-674b8bbfcf-wm96q\" (UID: \"414a4df1-cdb6-4410-bde7-af50171f20b1\") " pod="kube-system/coredns-674b8bbfcf-wm96q" Sep 4 15:45:50.470395 kubelet[2941]: I0904 15:45:50.465797 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc9999d0-33a9-4740-82f2-aed68797fc6a-whisker-ca-bundle\") pod \"whisker-6ff7c7f4f9-b67mx\" (UID: \"dc9999d0-33a9-4740-82f2-aed68797fc6a\") " pod="calico-system/whisker-6ff7c7f4f9-b67mx" Sep 4 15:45:50.470395 kubelet[2941]: I0904 15:45:50.465807 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/019b852b-6bcc-4d05-aa9a-2ce02951a1bd-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-vbl6h\" (UID: \"019b852b-6bcc-4d05-aa9a-2ce02951a1bd\") " pod="calico-system/goldmane-54d579b49d-vbl6h" Sep 4 15:45:50.470395 kubelet[2941]: I0904 15:45:50.465823 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f83fd5a3-b207-4864-a1a1-a47b4a06a29b-tigera-ca-bundle\") pod \"calico-kube-controllers-5bc74776b8-jcx9h\" (UID: \"f83fd5a3-b207-4864-a1a1-a47b4a06a29b\") " pod="calico-system/calico-kube-controllers-5bc74776b8-jcx9h" Sep 4 15:45:50.470395 kubelet[2941]: I0904 15:45:50.465840 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/dc9999d0-33a9-4740-82f2-aed68797fc6a-whisker-backend-key-pair\") pod \"whisker-6ff7c7f4f9-b67mx\" (UID: \"dc9999d0-33a9-4740-82f2-aed68797fc6a\") " pod="calico-system/whisker-6ff7c7f4f9-b67mx" Sep 4 15:45:50.471623 kubelet[2941]: I0904 15:45:50.465852 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb6gx\" (UniqueName: \"kubernetes.io/projected/e7c6b0b2-eb4f-4eb1-bf74-9724b7e76c0c-kube-api-access-lb6gx\") pod \"calico-apiserver-5696449574-m4mxw\" (UID: \"e7c6b0b2-eb4f-4eb1-bf74-9724b7e76c0c\") " pod="calico-apiserver/calico-apiserver-5696449574-m4mxw" Sep 4 15:45:50.471623 kubelet[2941]: I0904 15:45:50.465861 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/10bcd7e2-385d-4877-9718-a3ecf232751c-calico-apiserver-certs\") pod \"calico-apiserver-5696449574-g89ks\" (UID: \"10bcd7e2-385d-4877-9718-a3ecf232751c\") " 
pod="calico-apiserver/calico-apiserver-5696449574-g89ks" Sep 4 15:45:50.648437 containerd[1628]: time="2025-09-04T15:45:50.648312536Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6ff7c7f4f9-b67mx,Uid:dc9999d0-33a9-4740-82f2-aed68797fc6a,Namespace:calico-system,Attempt:0,}" Sep 4 15:45:50.652645 containerd[1628]: time="2025-09-04T15:45:50.652360493Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-rqkgl,Uid:4021cb54-f775-42c6-8c2d-347373397b4c,Namespace:kube-system,Attempt:0,}" Sep 4 15:45:50.663199 containerd[1628]: time="2025-09-04T15:45:50.663174367Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5696449574-m4mxw,Uid:e7c6b0b2-eb4f-4eb1-bf74-9724b7e76c0c,Namespace:calico-apiserver,Attempt:0,}" Sep 4 15:45:50.670258 containerd[1628]: time="2025-09-04T15:45:50.670023620Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-wm96q,Uid:414a4df1-cdb6-4410-bde7-af50171f20b1,Namespace:kube-system,Attempt:0,}" Sep 4 15:45:50.670720 containerd[1628]: time="2025-09-04T15:45:50.670702248Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5bc74776b8-jcx9h,Uid:f83fd5a3-b207-4864-a1a1-a47b4a06a29b,Namespace:calico-system,Attempt:0,}" Sep 4 15:45:50.690649 containerd[1628]: time="2025-09-04T15:45:50.690625029Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5696449574-g89ks,Uid:10bcd7e2-385d-4877-9718-a3ecf232751c,Namespace:calico-apiserver,Attempt:0,}" Sep 4 15:45:50.694991 containerd[1628]: time="2025-09-04T15:45:50.694836312Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-vbl6h,Uid:019b852b-6bcc-4d05-aa9a-2ce02951a1bd,Namespace:calico-system,Attempt:0,}" Sep 4 15:45:51.056891 containerd[1628]: time="2025-09-04T15:45:51.056843620Z" level=error msg="Failed to destroy network for sandbox \"87060f3d92012d46870e9e225f4d348a3a52f2c33f096ea64e8b68370e561c19\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:45:51.057640 containerd[1628]: time="2025-09-04T15:45:51.057610177Z" level=error msg="Failed to destroy network for sandbox \"bf0752c6bb3a800bc232cec4496ce0a4c55e0f8277b8c0b1c3bdbe70c060fb3e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:45:51.057995 containerd[1628]: time="2025-09-04T15:45:51.057981677Z" level=error msg="Failed to destroy network for sandbox \"10edd1a4311c56512988ecab7852a6e579f37546d93b45d9fd36c43987885f5a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:45:51.058501 containerd[1628]: time="2025-09-04T15:45:51.058047826Z" level=error msg="Failed to destroy network for sandbox \"3b0bc4e772a3eab49e0e74f06ced1bca6319f6cb86f58fe5228a36446eaf1df0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:45:51.058642 containerd[1628]: time="2025-09-04T15:45:51.058069246Z" level=error msg="Failed to destroy network for sandbox \"9d2f5e2ec5812b191c7c9dec96bdd9a310a726d022b8464987981e2905deb496\"" error="plugin 
type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:45:51.058843 containerd[1628]: time="2025-09-04T15:45:51.057812453Z" level=error msg="Failed to destroy network for sandbox \"c9697f7ec0bb4a7d181cfa13ed8f040e3f5d2991e23443b7e485ab6875f46dad\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:45:51.058970 containerd[1628]: time="2025-09-04T15:45:51.058941268Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6ff7c7f4f9-b67mx,Uid:dc9999d0-33a9-4740-82f2-aed68797fc6a,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"bf0752c6bb3a800bc232cec4496ce0a4c55e0f8277b8c0b1c3bdbe70c060fb3e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:45:51.059381 containerd[1628]: time="2025-09-04T15:45:51.058123992Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-wm96q,Uid:414a4df1-cdb6-4410-bde7-af50171f20b1,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"87060f3d92012d46870e9e225f4d348a3a52f2c33f096ea64e8b68370e561c19\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:45:51.073353 containerd[1628]: time="2025-09-04T15:45:51.073254282Z" level=error msg="Failed to destroy network for sandbox \"098befa5df2065739e4eaef2840848569721d43f1722673d7795a9c57c12aa5b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:45:51.073464 kubelet[2941]: E0904 15:45:51.073401 2941 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"87060f3d92012d46870e9e225f4d348a3a52f2c33f096ea64e8b68370e561c19\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:45:51.073935 kubelet[2941]: E0904 15:45:51.073539 2941 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bf0752c6bb3a800bc232cec4496ce0a4c55e0f8277b8c0b1c3bdbe70c060fb3e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:45:51.073935 kubelet[2941]: E0904 15:45:51.073565 2941 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bf0752c6bb3a800bc232cec4496ce0a4c55e0f8277b8c0b1c3bdbe70c060fb3e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6ff7c7f4f9-b67mx" Sep 4 15:45:51.073935 kubelet[2941]: E0904 15:45:51.073580 2941 
kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bf0752c6bb3a800bc232cec4496ce0a4c55e0f8277b8c0b1c3bdbe70c060fb3e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6ff7c7f4f9-b67mx" Sep 4 15:45:51.074028 containerd[1628]: time="2025-09-04T15:45:51.073572502Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-rqkgl,Uid:4021cb54-f775-42c6-8c2d-347373397b4c,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9d2f5e2ec5812b191c7c9dec96bdd9a310a726d022b8464987981e2905deb496\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:45:51.074028 containerd[1628]: time="2025-09-04T15:45:51.073632511Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5696449574-m4mxw,Uid:e7c6b0b2-eb4f-4eb1-bf74-9724b7e76c0c,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c9697f7ec0bb4a7d181cfa13ed8f040e3f5d2991e23443b7e485ab6875f46dad\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:45:51.074028 containerd[1628]: time="2025-09-04T15:45:51.073661265Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5bc74776b8-jcx9h,Uid:f83fd5a3-b207-4864-a1a1-a47b4a06a29b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"10edd1a4311c56512988ecab7852a6e579f37546d93b45d9fd36c43987885f5a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:45:51.074155 kubelet[2941]: E0904 15:45:51.073614 2941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-6ff7c7f4f9-b67mx_calico-system(dc9999d0-33a9-4740-82f2-aed68797fc6a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-6ff7c7f4f9-b67mx_calico-system(dc9999d0-33a9-4740-82f2-aed68797fc6a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bf0752c6bb3a800bc232cec4496ce0a4c55e0f8277b8c0b1c3bdbe70c060fb3e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-6ff7c7f4f9-b67mx" podUID="dc9999d0-33a9-4740-82f2-aed68797fc6a" Sep 4 15:45:51.074155 kubelet[2941]: E0904 15:45:51.073522 2941 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"87060f3d92012d46870e9e225f4d348a3a52f2c33f096ea64e8b68370e561c19\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-wm96q" Sep 4 15:45:51.074155 kubelet[2941]: E0904 15:45:51.073731 2941 
kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"87060f3d92012d46870e9e225f4d348a3a52f2c33f096ea64e8b68370e561c19\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-wm96q" Sep 4 15:45:51.074275 containerd[1628]: time="2025-09-04T15:45:51.073680795Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5696449574-g89ks,Uid:10bcd7e2-385d-4877-9718-a3ecf232751c,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3b0bc4e772a3eab49e0e74f06ced1bca6319f6cb86f58fe5228a36446eaf1df0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:45:51.074316 kubelet[2941]: E0904 15:45:51.073754 2941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-wm96q_kube-system(414a4df1-cdb6-4410-bde7-af50171f20b1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-wm96q_kube-system(414a4df1-cdb6-4410-bde7-af50171f20b1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"87060f3d92012d46870e9e225f4d348a3a52f2c33f096ea64e8b68370e561c19\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-wm96q" podUID="414a4df1-cdb6-4410-bde7-af50171f20b1" Sep 4 15:45:51.074316 kubelet[2941]: E0904 15:45:51.073922 2941 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3b0bc4e772a3eab49e0e74f06ced1bca6319f6cb86f58fe5228a36446eaf1df0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:45:51.074375 kubelet[2941]: E0904 15:45:51.074331 2941 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9d2f5e2ec5812b191c7c9dec96bdd9a310a726d022b8464987981e2905deb496\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:45:51.074375 kubelet[2941]: E0904 15:45:51.074354 2941 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9d2f5e2ec5812b191c7c9dec96bdd9a310a726d022b8464987981e2905deb496\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-rqkgl" Sep 4 15:45:51.074375 kubelet[2941]: E0904 15:45:51.074369 2941 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9d2f5e2ec5812b191c7c9dec96bdd9a310a726d022b8464987981e2905deb496\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check 
that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-rqkgl" Sep 4 15:45:51.075790 kubelet[2941]: E0904 15:45:51.074392 2941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-rqkgl_kube-system(4021cb54-f775-42c6-8c2d-347373397b4c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-rqkgl_kube-system(4021cb54-f775-42c6-8c2d-347373397b4c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9d2f5e2ec5812b191c7c9dec96bdd9a310a726d022b8464987981e2905deb496\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-rqkgl" podUID="4021cb54-f775-42c6-8c2d-347373397b4c" Sep 4 15:45:51.075790 kubelet[2941]: E0904 15:45:51.074415 2941 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c9697f7ec0bb4a7d181cfa13ed8f040e3f5d2991e23443b7e485ab6875f46dad\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:45:51.075790 kubelet[2941]: E0904 15:45:51.074425 2941 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c9697f7ec0bb4a7d181cfa13ed8f040e3f5d2991e23443b7e485ab6875f46dad\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5696449574-m4mxw" Sep 4 15:45:51.075922 kubelet[2941]: E0904 15:45:51.074434 2941 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c9697f7ec0bb4a7d181cfa13ed8f040e3f5d2991e23443b7e485ab6875f46dad\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5696449574-m4mxw" Sep 4 15:45:51.075922 kubelet[2941]: E0904 15:45:51.074449 2941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5696449574-m4mxw_calico-apiserver(e7c6b0b2-eb4f-4eb1-bf74-9724b7e76c0c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5696449574-m4mxw_calico-apiserver(e7c6b0b2-eb4f-4eb1-bf74-9724b7e76c0c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c9697f7ec0bb4a7d181cfa13ed8f040e3f5d2991e23443b7e485ab6875f46dad\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5696449574-m4mxw" podUID="e7c6b0b2-eb4f-4eb1-bf74-9724b7e76c0c" Sep 4 15:45:51.075922 kubelet[2941]: E0904 15:45:51.074472 2941 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"10edd1a4311c56512988ecab7852a6e579f37546d93b45d9fd36c43987885f5a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:45:51.076323 kubelet[2941]: E0904 15:45:51.074482 2941 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"10edd1a4311c56512988ecab7852a6e579f37546d93b45d9fd36c43987885f5a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5bc74776b8-jcx9h" Sep 4 15:45:51.076323 kubelet[2941]: E0904 15:45:51.074489 2941 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"10edd1a4311c56512988ecab7852a6e579f37546d93b45d9fd36c43987885f5a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5bc74776b8-jcx9h" Sep 4 15:45:51.076323 kubelet[2941]: E0904 15:45:51.074503 2941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5bc74776b8-jcx9h_calico-system(f83fd5a3-b207-4864-a1a1-a47b4a06a29b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5bc74776b8-jcx9h_calico-system(f83fd5a3-b207-4864-a1a1-a47b4a06a29b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"10edd1a4311c56512988ecab7852a6e579f37546d93b45d9fd36c43987885f5a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5bc74776b8-jcx9h" podUID="f83fd5a3-b207-4864-a1a1-a47b4a06a29b" Sep 4 15:45:51.076405 containerd[1628]: time="2025-09-04T15:45:51.076107428Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-vbl6h,Uid:019b852b-6bcc-4d05-aa9a-2ce02951a1bd,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"098befa5df2065739e4eaef2840848569721d43f1722673d7795a9c57c12aa5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:45:51.076443 kubelet[2941]: E0904 15:45:51.073938 2941 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3b0bc4e772a3eab49e0e74f06ced1bca6319f6cb86f58fe5228a36446eaf1df0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5696449574-g89ks" Sep 4 15:45:51.076443 kubelet[2941]: E0904 15:45:51.074544 2941 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3b0bc4e772a3eab49e0e74f06ced1bca6319f6cb86f58fe5228a36446eaf1df0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5696449574-g89ks" Sep 4 
15:45:51.076443 kubelet[2941]: E0904 15:45:51.074568 2941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5696449574-g89ks_calico-apiserver(10bcd7e2-385d-4877-9718-a3ecf232751c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5696449574-g89ks_calico-apiserver(10bcd7e2-385d-4877-9718-a3ecf232751c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3b0bc4e772a3eab49e0e74f06ced1bca6319f6cb86f58fe5228a36446eaf1df0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5696449574-g89ks" podUID="10bcd7e2-385d-4877-9718-a3ecf232751c" Sep 4 15:45:51.076662 kubelet[2941]: E0904 15:45:51.076558 2941 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"098befa5df2065739e4eaef2840848569721d43f1722673d7795a9c57c12aa5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:45:51.076662 kubelet[2941]: E0904 15:45:51.076591 2941 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"098befa5df2065739e4eaef2840848569721d43f1722673d7795a9c57c12aa5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-vbl6h" Sep 4 15:45:51.076662 kubelet[2941]: E0904 15:45:51.076604 2941 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"098befa5df2065739e4eaef2840848569721d43f1722673d7795a9c57c12aa5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-vbl6h" Sep 4 15:45:51.076736 kubelet[2941]: E0904 15:45:51.076633 2941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-vbl6h_calico-system(019b852b-6bcc-4d05-aa9a-2ce02951a1bd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-vbl6h_calico-system(019b852b-6bcc-4d05-aa9a-2ce02951a1bd)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"098befa5df2065739e4eaef2840848569721d43f1722673d7795a9c57c12aa5b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-vbl6h" podUID="019b852b-6bcc-4d05-aa9a-2ce02951a1bd" Sep 4 15:45:52.156141 systemd[1]: Created slice kubepods-besteffort-pod6fe34a63_17a4_4039_80b1_075eaa32bbb7.slice - libcontainer container kubepods-besteffort-pod6fe34a63_17a4_4039_80b1_075eaa32bbb7.slice. 
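All of the sandbox failures above share the single root cause named in the plugin's own message: the Calico CNI plugin stats /var/lib/calico/nodename before it will network a pod, and that file only exists once the calico/node container is running and has mounted /var/lib/calico/ from the host. Until then every CNI ADD fails and kubelet keeps retrying each pending pod. A minimal standalone sketch of the same check (a hypothetical helper, not part of Calico; the path and error text are copied from the log entries above):

// nodenamecheck.go - reproduces the stat the Calico CNI plugin performs
// before networking a pod (path and advisory text taken from the log above).
package main

import (
	"fmt"
	"os"
)

func main() {
	const nodenameFile = "/var/lib/calico/nodename"

	data, err := os.ReadFile(nodenameFile)
	if os.IsNotExist(err) {
		// Same condition the plugin reports: calico/node has not yet
		// started and mounted /var/lib/calico/ on this host.
		fmt.Printf("stat %s: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\n", nodenameFile)
		os.Exit(1)
	} else if err != nil {
		fmt.Printf("reading %s: %v\n", nodenameFile, err)
		os.Exit(1)
	}
	fmt.Printf("calico nodename: %s\n", string(data))
}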
Sep 4 15:45:52.161018 containerd[1628]: time="2025-09-04T15:45:52.160922678Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-48dj6,Uid:6fe34a63-17a4-4039-80b1-075eaa32bbb7,Namespace:calico-system,Attempt:0,}" Sep 4 15:45:52.205699 containerd[1628]: time="2025-09-04T15:45:52.205667530Z" level=error msg="Failed to destroy network for sandbox \"f1d3b5c531e092c827c19c16277f4c8d02501728d0c059e4313882a5b0e992e8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:45:52.207071 systemd[1]: run-netns-cni\x2d7f9a3208\x2ddc65\x2db494\x2d8c1c\x2d5b2f38e242a4.mount: Deactivated successfully. Sep 4 15:45:52.210387 containerd[1628]: time="2025-09-04T15:45:52.210347165Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-48dj6,Uid:6fe34a63-17a4-4039-80b1-075eaa32bbb7,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f1d3b5c531e092c827c19c16277f4c8d02501728d0c059e4313882a5b0e992e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:45:52.211332 kubelet[2941]: E0904 15:45:52.211307 2941 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f1d3b5c531e092c827c19c16277f4c8d02501728d0c059e4313882a5b0e992e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:45:52.211512 kubelet[2941]: E0904 15:45:52.211346 2941 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f1d3b5c531e092c827c19c16277f4c8d02501728d0c059e4313882a5b0e992e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-48dj6" Sep 4 15:45:52.211512 kubelet[2941]: E0904 15:45:52.211360 2941 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f1d3b5c531e092c827c19c16277f4c8d02501728d0c059e4313882a5b0e992e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-48dj6" Sep 4 15:45:52.211512 kubelet[2941]: E0904 15:45:52.211395 2941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-48dj6_calico-system(6fe34a63-17a4-4039-80b1-075eaa32bbb7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-48dj6_calico-system(6fe34a63-17a4-4039-80b1-075eaa32bbb7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f1d3b5c531e092c827c19c16277f4c8d02501728d0c059e4313882a5b0e992e8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-48dj6" podUID="6fe34a63-17a4-4039-80b1-075eaa32bbb7" Sep 4 
15:45:56.572768 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2264619065.mount: Deactivated successfully. Sep 4 15:45:56.683254 containerd[1628]: time="2025-09-04T15:45:56.682548514Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 4 15:45:56.688496 containerd[1628]: time="2025-09-04T15:45:56.675255502Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:45:56.700997 containerd[1628]: time="2025-09-04T15:45:56.700739009Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:45:56.703804 containerd[1628]: time="2025-09-04T15:45:56.703789985Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:45:56.705840 containerd[1628]: time="2025-09-04T15:45:56.705819119Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 6.374280968s" Sep 4 15:45:56.705878 containerd[1628]: time="2025-09-04T15:45:56.705843256Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 4 15:45:56.750753 containerd[1628]: time="2025-09-04T15:45:56.750629615Z" level=info msg="CreateContainer within sandbox \"cd5a22eb8970b2ef459cfd1e1ec1f8a090e43dfaa27b16ed0565c1ec5348a08f\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 4 15:45:56.852889 containerd[1628]: time="2025-09-04T15:45:56.852718561Z" level=info msg="Container 7c4c03986e7fa0e5c4db45fb08894988ab119e760c7350f55eaaa6a8a02cbfef: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:45:56.853567 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1325975347.mount: Deactivated successfully. Sep 4 15:45:56.932245 containerd[1628]: time="2025-09-04T15:45:56.932204793Z" level=info msg="CreateContainer within sandbox \"cd5a22eb8970b2ef459cfd1e1ec1f8a090e43dfaa27b16ed0565c1ec5348a08f\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"7c4c03986e7fa0e5c4db45fb08894988ab119e760c7350f55eaaa6a8a02cbfef\"" Sep 4 15:45:56.933173 containerd[1628]: time="2025-09-04T15:45:56.933101994Z" level=info msg="StartContainer for \"7c4c03986e7fa0e5c4db45fb08894988ab119e760c7350f55eaaa6a8a02cbfef\"" Sep 4 15:45:56.949680 containerd[1628]: time="2025-09-04T15:45:56.949659200Z" level=info msg="connecting to shim 7c4c03986e7fa0e5c4db45fb08894988ab119e760c7350f55eaaa6a8a02cbfef" address="unix:///run/containerd/s/fec43038c72f9d26840a5188372090ae1ff44ffe0750d581493fd4320da66e76" protocol=ttrpc version=3 Sep 4 15:45:57.029381 systemd[1]: Started cri-containerd-7c4c03986e7fa0e5c4db45fb08894988ab119e760c7350f55eaaa6a8a02cbfef.scope - libcontainer container 7c4c03986e7fa0e5c4db45fb08894988ab119e760c7350f55eaaa6a8a02cbfef. 
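The pull record above gives enough to sanity-check the transfer: containerd reports the calico/node:v3.30.3 digest at 157,078,201 bytes pulled in 6.374280968s, roughly 24.6 MB/s. A small illustrative calculation with those two values copied from the log (nothing here queries containerd):

// pullrate.go - back-of-the-envelope throughput for the calico/node pull
// reported above (size and duration copied verbatim from the containerd entry).
package main

import (
	"fmt"
	"time"
)

func main() {
	const sizeBytes = 157078201 // repo digest size reported by containerd
	dur, err := time.ParseDuration("6.374280968s")
	if err != nil {
		panic(err)
	}
	mbPerSec := float64(sizeBytes) / 1e6 / dur.Seconds()
	fmt.Printf("pulled %.1f MB in %s (~%.1f MB/s)\n", float64(sizeBytes)/1e6, dur, mbPerSec)
	// Prints roughly: pulled 157.1 MB in 6.374280968s (~24.6 MB/s)
}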
Sep 4 15:45:57.063380 containerd[1628]: time="2025-09-04T15:45:57.063349199Z" level=info msg="StartContainer for \"7c4c03986e7fa0e5c4db45fb08894988ab119e760c7350f55eaaa6a8a02cbfef\" returns successfully" Sep 4 15:45:57.176709 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 4 15:45:57.177988 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Sep 4 15:45:57.359044 kubelet[2941]: I0904 15:45:57.352420 2941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-6mb89" podStartSLOduration=1.798770158 podStartE2EDuration="34.352407643s" podCreationTimestamp="2025-09-04 15:45:23 +0000 UTC" firstStartedPulling="2025-09-04 15:45:24.152948931 +0000 UTC m=+17.182640649" lastFinishedPulling="2025-09-04 15:45:56.706586415 +0000 UTC m=+49.736278134" observedRunningTime="2025-09-04 15:45:57.351353742 +0000 UTC m=+50.381045458" watchObservedRunningTime="2025-09-04 15:45:57.352407643 +0000 UTC m=+50.382099371" Sep 4 15:45:57.678592 containerd[1628]: time="2025-09-04T15:45:57.678403856Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7c4c03986e7fa0e5c4db45fb08894988ab119e760c7350f55eaaa6a8a02cbfef\" id:\"a0395b9de9f80d9e640a97f05495e6563f00cd4c6ae4b686a6b1b369887a334a\" pid:4056 exit_status:1 exited_at:{seconds:1757000757 nanos:666321313}" Sep 4 15:45:57.932016 kubelet[2941]: I0904 15:45:57.931846 2941 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2vc4\" (UniqueName: \"kubernetes.io/projected/dc9999d0-33a9-4740-82f2-aed68797fc6a-kube-api-access-k2vc4\") pod \"dc9999d0-33a9-4740-82f2-aed68797fc6a\" (UID: \"dc9999d0-33a9-4740-82f2-aed68797fc6a\") " Sep 4 15:45:57.932016 kubelet[2941]: I0904 15:45:57.931926 2941 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/dc9999d0-33a9-4740-82f2-aed68797fc6a-whisker-backend-key-pair\") pod \"dc9999d0-33a9-4740-82f2-aed68797fc6a\" (UID: \"dc9999d0-33a9-4740-82f2-aed68797fc6a\") " Sep 4 15:45:57.932016 kubelet[2941]: I0904 15:45:57.931946 2941 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc9999d0-33a9-4740-82f2-aed68797fc6a-whisker-ca-bundle\") pod \"dc9999d0-33a9-4740-82f2-aed68797fc6a\" (UID: \"dc9999d0-33a9-4740-82f2-aed68797fc6a\") " Sep 4 15:45:58.012993 systemd[1]: var-lib-kubelet-pods-dc9999d0\x2d33a9\x2d4740\x2d82f2\x2daed68797fc6a-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dk2vc4.mount: Deactivated successfully. Sep 4 15:45:58.013076 systemd[1]: var-lib-kubelet-pods-dc9999d0\x2d33a9\x2d4740\x2d82f2\x2daed68797fc6a-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 4 15:45:58.015398 kubelet[2941]: I0904 15:45:58.015292 2941 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc9999d0-33a9-4740-82f2-aed68797fc6a-kube-api-access-k2vc4" (OuterVolumeSpecName: "kube-api-access-k2vc4") pod "dc9999d0-33a9-4740-82f2-aed68797fc6a" (UID: "dc9999d0-33a9-4740-82f2-aed68797fc6a"). InnerVolumeSpecName "kube-api-access-k2vc4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 4 15:45:58.015765 kubelet[2941]: I0904 15:45:58.015657 2941 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc9999d0-33a9-4740-82f2-aed68797fc6a-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "dc9999d0-33a9-4740-82f2-aed68797fc6a" (UID: "dc9999d0-33a9-4740-82f2-aed68797fc6a"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 4 15:45:58.015928 kubelet[2941]: I0904 15:45:58.015872 2941 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc9999d0-33a9-4740-82f2-aed68797fc6a-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "dc9999d0-33a9-4740-82f2-aed68797fc6a" (UID: "dc9999d0-33a9-4740-82f2-aed68797fc6a"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 4 15:45:58.032455 kubelet[2941]: I0904 15:45:58.032410 2941 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/dc9999d0-33a9-4740-82f2-aed68797fc6a-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Sep 4 15:45:58.032455 kubelet[2941]: I0904 15:45:58.032431 2941 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc9999d0-33a9-4740-82f2-aed68797fc6a-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Sep 4 15:45:58.032455 kubelet[2941]: I0904 15:45:58.032439 2941 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k2vc4\" (UniqueName: \"kubernetes.io/projected/dc9999d0-33a9-4740-82f2-aed68797fc6a-kube-api-access-k2vc4\") on node \"localhost\" DevicePath \"\"" Sep 4 15:45:58.333325 systemd[1]: Removed slice kubepods-besteffort-poddc9999d0_33a9_4740_82f2_aed68797fc6a.slice - libcontainer container kubepods-besteffort-poddc9999d0_33a9_4740_82f2_aed68797fc6a.slice. Sep 4 15:45:58.412006 containerd[1628]: time="2025-09-04T15:45:58.411940693Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7c4c03986e7fa0e5c4db45fb08894988ab119e760c7350f55eaaa6a8a02cbfef\" id:\"57b48accf6f092ded5c3eb2c1d223f5c29395759bd9cc7b1705e9585eb000ae2\" pid:4093 exit_status:1 exited_at:{seconds:1757000758 nanos:411357620}" Sep 4 15:45:58.731833 systemd[1]: Created slice kubepods-besteffort-podad296408_0f78_406f_9176_2f8f73c0731a.slice - libcontainer container kubepods-besteffort-podad296408_0f78_406f_9176_2f8f73c0731a.slice. 
Sep 4 15:45:58.838209 kubelet[2941]: I0904 15:45:58.838177 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad296408-0f78-406f-9176-2f8f73c0731a-whisker-ca-bundle\") pod \"whisker-64dbf6f99d-5tw6p\" (UID: \"ad296408-0f78-406f-9176-2f8f73c0731a\") " pod="calico-system/whisker-64dbf6f99d-5tw6p" Sep 4 15:45:58.838209 kubelet[2941]: I0904 15:45:58.838260 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ad296408-0f78-406f-9176-2f8f73c0731a-whisker-backend-key-pair\") pod \"whisker-64dbf6f99d-5tw6p\" (UID: \"ad296408-0f78-406f-9176-2f8f73c0731a\") " pod="calico-system/whisker-64dbf6f99d-5tw6p" Sep 4 15:45:58.838209 kubelet[2941]: I0904 15:45:58.838280 2941 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56mql\" (UniqueName: \"kubernetes.io/projected/ad296408-0f78-406f-9176-2f8f73c0731a-kube-api-access-56mql\") pod \"whisker-64dbf6f99d-5tw6p\" (UID: \"ad296408-0f78-406f-9176-2f8f73c0731a\") " pod="calico-system/whisker-64dbf6f99d-5tw6p" Sep 4 15:45:59.035690 containerd[1628]: time="2025-09-04T15:45:59.035452683Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-64dbf6f99d-5tw6p,Uid:ad296408-0f78-406f-9176-2f8f73c0731a,Namespace:calico-system,Attempt:0,}" Sep 4 15:45:59.154576 kubelet[2941]: I0904 15:45:59.154551 2941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc9999d0-33a9-4740-82f2-aed68797fc6a" path="/var/lib/kubelet/pods/dc9999d0-33a9-4740-82f2-aed68797fc6a/volumes" Sep 4 15:45:59.541465 containerd[1628]: time="2025-09-04T15:45:59.541362394Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7c4c03986e7fa0e5c4db45fb08894988ab119e760c7350f55eaaa6a8a02cbfef\" id:\"281106bec70071751ad3e94277e65373c13f510aa09638b8c863cb7da5b834e9\" pid:4194 exit_status:1 exited_at:{seconds:1757000759 nanos:541096546}" Sep 4 15:45:59.915158 systemd-networkd[1518]: vxlan.calico: Link UP Sep 4 15:45:59.915163 systemd-networkd[1518]: vxlan.calico: Gained carrier Sep 4 15:46:00.123668 systemd-networkd[1518]: cali0c30eb96c79: Link UP Sep 4 15:46:00.124978 systemd-networkd[1518]: cali0c30eb96c79: Gained carrier Sep 4 15:46:00.143449 containerd[1628]: 2025-09-04 15:45:59.082 [INFO][4107] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 4 15:46:00.143449 containerd[1628]: 2025-09-04 15:45:59.174 [INFO][4107] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--64dbf6f99d--5tw6p-eth0 whisker-64dbf6f99d- calico-system ad296408-0f78-406f-9176-2f8f73c0731a 922 0 2025-09-04 15:45:58 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:64dbf6f99d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-64dbf6f99d-5tw6p eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali0c30eb96c79 [] [] }} ContainerID="5723ad158e83b9a82ab9af0fa4c031256111656fba9dbc92caf9232aa7545fba" Namespace="calico-system" Pod="whisker-64dbf6f99d-5tw6p" WorkloadEndpoint="localhost-k8s-whisker--64dbf6f99d--5tw6p-" Sep 4 15:46:00.143449 containerd[1628]: 2025-09-04 15:45:59.174 [INFO][4107] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="5723ad158e83b9a82ab9af0fa4c031256111656fba9dbc92caf9232aa7545fba" Namespace="calico-system" Pod="whisker-64dbf6f99d-5tw6p" WorkloadEndpoint="localhost-k8s-whisker--64dbf6f99d--5tw6p-eth0" Sep 4 15:46:00.143449 containerd[1628]: 2025-09-04 15:45:59.917 [INFO][4118] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5723ad158e83b9a82ab9af0fa4c031256111656fba9dbc92caf9232aa7545fba" HandleID="k8s-pod-network.5723ad158e83b9a82ab9af0fa4c031256111656fba9dbc92caf9232aa7545fba" Workload="localhost-k8s-whisker--64dbf6f99d--5tw6p-eth0" Sep 4 15:46:00.143655 containerd[1628]: 2025-09-04 15:45:59.926 [INFO][4118] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5723ad158e83b9a82ab9af0fa4c031256111656fba9dbc92caf9232aa7545fba" HandleID="k8s-pod-network.5723ad158e83b9a82ab9af0fa4c031256111656fba9dbc92caf9232aa7545fba" Workload="localhost-k8s-whisker--64dbf6f99d--5tw6p-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00037b7c0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-64dbf6f99d-5tw6p", "timestamp":"2025-09-04 15:45:59.917371003 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 15:46:00.143655 containerd[1628]: 2025-09-04 15:45:59.926 [INFO][4118] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 15:46:00.143655 containerd[1628]: 2025-09-04 15:45:59.926 [INFO][4118] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 4 15:46:00.143655 containerd[1628]: 2025-09-04 15:45:59.930 [INFO][4118] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 15:46:00.143655 containerd[1628]: 2025-09-04 15:46:00.020 [INFO][4118] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5723ad158e83b9a82ab9af0fa4c031256111656fba9dbc92caf9232aa7545fba" host="localhost" Sep 4 15:46:00.143655 containerd[1628]: 2025-09-04 15:46:00.055 [INFO][4118] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 15:46:00.143655 containerd[1628]: 2025-09-04 15:46:00.059 [INFO][4118] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 15:46:00.143655 containerd[1628]: 2025-09-04 15:46:00.061 [INFO][4118] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 15:46:00.143655 containerd[1628]: 2025-09-04 15:46:00.063 [INFO][4118] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 15:46:00.143655 containerd[1628]: 2025-09-04 15:46:00.063 [INFO][4118] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.5723ad158e83b9a82ab9af0fa4c031256111656fba9dbc92caf9232aa7545fba" host="localhost" Sep 4 15:46:00.143827 containerd[1628]: 2025-09-04 15:46:00.064 [INFO][4118] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5723ad158e83b9a82ab9af0fa4c031256111656fba9dbc92caf9232aa7545fba Sep 4 15:46:00.143827 containerd[1628]: 2025-09-04 15:46:00.071 [INFO][4118] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.5723ad158e83b9a82ab9af0fa4c031256111656fba9dbc92caf9232aa7545fba" host="localhost" Sep 4 15:46:00.143827 containerd[1628]: 2025-09-04 15:46:00.076 [INFO][4118] ipam/ipam.go 1256: Successfully claimed IPs: 
[192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.5723ad158e83b9a82ab9af0fa4c031256111656fba9dbc92caf9232aa7545fba" host="localhost" Sep 4 15:46:00.143827 containerd[1628]: 2025-09-04 15:46:00.076 [INFO][4118] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.5723ad158e83b9a82ab9af0fa4c031256111656fba9dbc92caf9232aa7545fba" host="localhost" Sep 4 15:46:00.143827 containerd[1628]: 2025-09-04 15:46:00.076 [INFO][4118] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 4 15:46:00.143827 containerd[1628]: 2025-09-04 15:46:00.076 [INFO][4118] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="5723ad158e83b9a82ab9af0fa4c031256111656fba9dbc92caf9232aa7545fba" HandleID="k8s-pod-network.5723ad158e83b9a82ab9af0fa4c031256111656fba9dbc92caf9232aa7545fba" Workload="localhost-k8s-whisker--64dbf6f99d--5tw6p-eth0" Sep 4 15:46:00.144822 containerd[1628]: 2025-09-04 15:46:00.081 [INFO][4107] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5723ad158e83b9a82ab9af0fa4c031256111656fba9dbc92caf9232aa7545fba" Namespace="calico-system" Pod="whisker-64dbf6f99d-5tw6p" WorkloadEndpoint="localhost-k8s-whisker--64dbf6f99d--5tw6p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--64dbf6f99d--5tw6p-eth0", GenerateName:"whisker-64dbf6f99d-", Namespace:"calico-system", SelfLink:"", UID:"ad296408-0f78-406f-9176-2f8f73c0731a", ResourceVersion:"922", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 45, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"64dbf6f99d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-64dbf6f99d-5tw6p", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali0c30eb96c79", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:46:00.144822 containerd[1628]: 2025-09-04 15:46:00.082 [INFO][4107] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="5723ad158e83b9a82ab9af0fa4c031256111656fba9dbc92caf9232aa7545fba" Namespace="calico-system" Pod="whisker-64dbf6f99d-5tw6p" WorkloadEndpoint="localhost-k8s-whisker--64dbf6f99d--5tw6p-eth0" Sep 4 15:46:00.144897 containerd[1628]: 2025-09-04 15:46:00.082 [INFO][4107] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0c30eb96c79 ContainerID="5723ad158e83b9a82ab9af0fa4c031256111656fba9dbc92caf9232aa7545fba" Namespace="calico-system" Pod="whisker-64dbf6f99d-5tw6p" WorkloadEndpoint="localhost-k8s-whisker--64dbf6f99d--5tw6p-eth0" Sep 4 15:46:00.144897 containerd[1628]: 2025-09-04 15:46:00.126 [INFO][4107] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="5723ad158e83b9a82ab9af0fa4c031256111656fba9dbc92caf9232aa7545fba" Namespace="calico-system" Pod="whisker-64dbf6f99d-5tw6p" WorkloadEndpoint="localhost-k8s-whisker--64dbf6f99d--5tw6p-eth0" Sep 4 15:46:00.144931 containerd[1628]: 2025-09-04 15:46:00.126 [INFO][4107] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5723ad158e83b9a82ab9af0fa4c031256111656fba9dbc92caf9232aa7545fba" Namespace="calico-system" Pod="whisker-64dbf6f99d-5tw6p" WorkloadEndpoint="localhost-k8s-whisker--64dbf6f99d--5tw6p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--64dbf6f99d--5tw6p-eth0", GenerateName:"whisker-64dbf6f99d-", Namespace:"calico-system", SelfLink:"", UID:"ad296408-0f78-406f-9176-2f8f73c0731a", ResourceVersion:"922", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 45, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"64dbf6f99d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"5723ad158e83b9a82ab9af0fa4c031256111656fba9dbc92caf9232aa7545fba", Pod:"whisker-64dbf6f99d-5tw6p", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali0c30eb96c79", MAC:"e6:e0:b6:18:59:a5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:46:00.145336 containerd[1628]: 2025-09-04 15:46:00.138 [INFO][4107] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5723ad158e83b9a82ab9af0fa4c031256111656fba9dbc92caf9232aa7545fba" Namespace="calico-system" Pod="whisker-64dbf6f99d-5tw6p" WorkloadEndpoint="localhost-k8s-whisker--64dbf6f99d--5tw6p-eth0" Sep 4 15:46:00.277724 containerd[1628]: time="2025-09-04T15:46:00.277649827Z" level=info msg="connecting to shim 5723ad158e83b9a82ab9af0fa4c031256111656fba9dbc92caf9232aa7545fba" address="unix:///run/containerd/s/45a0127192281221f4cb99275a2fe6a1af30cd183050c721a04f9ac8d37a3805" namespace=k8s.io protocol=ttrpc version=3 Sep 4 15:46:00.307351 systemd[1]: Started cri-containerd-5723ad158e83b9a82ab9af0fa4c031256111656fba9dbc92caf9232aa7545fba.scope - libcontainer container 5723ad158e83b9a82ab9af0fa4c031256111656fba9dbc92caf9232aa7545fba. 
Sep 4 15:46:00.368198 systemd-resolved[1519]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 15:46:00.443628 containerd[1628]: time="2025-09-04T15:46:00.443600089Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-64dbf6f99d-5tw6p,Uid:ad296408-0f78-406f-9176-2f8f73c0731a,Namespace:calico-system,Attempt:0,} returns sandbox id \"5723ad158e83b9a82ab9af0fa4c031256111656fba9dbc92caf9232aa7545fba\"" Sep 4 15:46:00.519058 containerd[1628]: time="2025-09-04T15:46:00.519026304Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 4 15:46:01.188346 systemd-networkd[1518]: vxlan.calico: Gained IPv6LL Sep 4 15:46:01.764365 systemd-networkd[1518]: cali0c30eb96c79: Gained IPv6LL Sep 4 15:46:03.152237 containerd[1628]: time="2025-09-04T15:46:03.151945658Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-vbl6h,Uid:019b852b-6bcc-4d05-aa9a-2ce02951a1bd,Namespace:calico-system,Attempt:0,}" Sep 4 15:46:03.152237 containerd[1628]: time="2025-09-04T15:46:03.152136832Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-48dj6,Uid:6fe34a63-17a4-4039-80b1-075eaa32bbb7,Namespace:calico-system,Attempt:0,}" Sep 4 15:46:03.152674 containerd[1628]: time="2025-09-04T15:46:03.152586350Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5696449574-m4mxw,Uid:e7c6b0b2-eb4f-4eb1-bf74-9724b7e76c0c,Namespace:calico-apiserver,Attempt:0,}" Sep 4 15:46:03.274523 systemd-networkd[1518]: calie2633513c49: Link UP Sep 4 15:46:03.276526 systemd-networkd[1518]: calie2633513c49: Gained carrier Sep 4 15:46:03.294036 containerd[1628]: 2025-09-04 15:46:03.191 [INFO][4400] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--54d579b49d--vbl6h-eth0 goldmane-54d579b49d- calico-system 019b852b-6bcc-4d05-aa9a-2ce02951a1bd 854 0 2025-09-04 15:45:23 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-54d579b49d-vbl6h eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calie2633513c49 [] [] }} ContainerID="4fac510611fedcc732de96c753797546e36799bc80d490a1b93e22800a83c9b9" Namespace="calico-system" Pod="goldmane-54d579b49d-vbl6h" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--vbl6h-" Sep 4 15:46:03.294036 containerd[1628]: 2025-09-04 15:46:03.192 [INFO][4400] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4fac510611fedcc732de96c753797546e36799bc80d490a1b93e22800a83c9b9" Namespace="calico-system" Pod="goldmane-54d579b49d-vbl6h" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--vbl6h-eth0" Sep 4 15:46:03.294036 containerd[1628]: 2025-09-04 15:46:03.240 [INFO][4436] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4fac510611fedcc732de96c753797546e36799bc80d490a1b93e22800a83c9b9" HandleID="k8s-pod-network.4fac510611fedcc732de96c753797546e36799bc80d490a1b93e22800a83c9b9" Workload="localhost-k8s-goldmane--54d579b49d--vbl6h-eth0" Sep 4 15:46:03.294175 containerd[1628]: 2025-09-04 15:46:03.240 [INFO][4436] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4fac510611fedcc732de96c753797546e36799bc80d490a1b93e22800a83c9b9" HandleID="k8s-pod-network.4fac510611fedcc732de96c753797546e36799bc80d490a1b93e22800a83c9b9" 
Workload="localhost-k8s-goldmane--54d579b49d--vbl6h-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ac880), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-54d579b49d-vbl6h", "timestamp":"2025-09-04 15:46:03.24041501 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 15:46:03.294175 containerd[1628]: 2025-09-04 15:46:03.240 [INFO][4436] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 15:46:03.294175 containerd[1628]: 2025-09-04 15:46:03.240 [INFO][4436] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 4 15:46:03.294175 containerd[1628]: 2025-09-04 15:46:03.240 [INFO][4436] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 15:46:03.294175 containerd[1628]: 2025-09-04 15:46:03.247 [INFO][4436] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4fac510611fedcc732de96c753797546e36799bc80d490a1b93e22800a83c9b9" host="localhost" Sep 4 15:46:03.294175 containerd[1628]: 2025-09-04 15:46:03.253 [INFO][4436] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 15:46:03.294175 containerd[1628]: 2025-09-04 15:46:03.257 [INFO][4436] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 15:46:03.294175 containerd[1628]: 2025-09-04 15:46:03.258 [INFO][4436] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 15:46:03.294175 containerd[1628]: 2025-09-04 15:46:03.259 [INFO][4436] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 15:46:03.294175 containerd[1628]: 2025-09-04 15:46:03.260 [INFO][4436] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.4fac510611fedcc732de96c753797546e36799bc80d490a1b93e22800a83c9b9" host="localhost" Sep 4 15:46:03.294500 containerd[1628]: 2025-09-04 15:46:03.261 [INFO][4436] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.4fac510611fedcc732de96c753797546e36799bc80d490a1b93e22800a83c9b9 Sep 4 15:46:03.294500 containerd[1628]: 2025-09-04 15:46:03.263 [INFO][4436] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.4fac510611fedcc732de96c753797546e36799bc80d490a1b93e22800a83c9b9" host="localhost" Sep 4 15:46:03.294500 containerd[1628]: 2025-09-04 15:46:03.267 [INFO][4436] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.4fac510611fedcc732de96c753797546e36799bc80d490a1b93e22800a83c9b9" host="localhost" Sep 4 15:46:03.294500 containerd[1628]: 2025-09-04 15:46:03.267 [INFO][4436] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.4fac510611fedcc732de96c753797546e36799bc80d490a1b93e22800a83c9b9" host="localhost" Sep 4 15:46:03.294500 containerd[1628]: 2025-09-04 15:46:03.267 [INFO][4436] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 4 15:46:03.294500 containerd[1628]: 2025-09-04 15:46:03.267 [INFO][4436] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="4fac510611fedcc732de96c753797546e36799bc80d490a1b93e22800a83c9b9" HandleID="k8s-pod-network.4fac510611fedcc732de96c753797546e36799bc80d490a1b93e22800a83c9b9" Workload="localhost-k8s-goldmane--54d579b49d--vbl6h-eth0" Sep 4 15:46:03.294985 containerd[1628]: 2025-09-04 15:46:03.272 [INFO][4400] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4fac510611fedcc732de96c753797546e36799bc80d490a1b93e22800a83c9b9" Namespace="calico-system" Pod="goldmane-54d579b49d-vbl6h" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--vbl6h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--vbl6h-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"019b852b-6bcc-4d05-aa9a-2ce02951a1bd", ResourceVersion:"854", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 45, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-54d579b49d-vbl6h", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calie2633513c49", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:46:03.294985 containerd[1628]: 2025-09-04 15:46:03.272 [INFO][4400] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="4fac510611fedcc732de96c753797546e36799bc80d490a1b93e22800a83c9b9" Namespace="calico-system" Pod="goldmane-54d579b49d-vbl6h" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--vbl6h-eth0" Sep 4 15:46:03.295052 containerd[1628]: 2025-09-04 15:46:03.272 [INFO][4400] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie2633513c49 ContainerID="4fac510611fedcc732de96c753797546e36799bc80d490a1b93e22800a83c9b9" Namespace="calico-system" Pod="goldmane-54d579b49d-vbl6h" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--vbl6h-eth0" Sep 4 15:46:03.295052 containerd[1628]: 2025-09-04 15:46:03.277 [INFO][4400] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4fac510611fedcc732de96c753797546e36799bc80d490a1b93e22800a83c9b9" Namespace="calico-system" Pod="goldmane-54d579b49d-vbl6h" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--vbl6h-eth0" Sep 4 15:46:03.296046 containerd[1628]: 2025-09-04 15:46:03.278 [INFO][4400] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4fac510611fedcc732de96c753797546e36799bc80d490a1b93e22800a83c9b9" Namespace="calico-system" Pod="goldmane-54d579b49d-vbl6h" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--vbl6h-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--vbl6h-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"019b852b-6bcc-4d05-aa9a-2ce02951a1bd", ResourceVersion:"854", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 45, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"4fac510611fedcc732de96c753797546e36799bc80d490a1b93e22800a83c9b9", Pod:"goldmane-54d579b49d-vbl6h", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calie2633513c49", MAC:"96:7f:50:a4:b1:de", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:46:03.296109 containerd[1628]: 2025-09-04 15:46:03.287 [INFO][4400] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4fac510611fedcc732de96c753797546e36799bc80d490a1b93e22800a83c9b9" Namespace="calico-system" Pod="goldmane-54d579b49d-vbl6h" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--vbl6h-eth0" Sep 4 15:46:03.313649 containerd[1628]: time="2025-09-04T15:46:03.313623557Z" level=info msg="connecting to shim 4fac510611fedcc732de96c753797546e36799bc80d490a1b93e22800a83c9b9" address="unix:///run/containerd/s/187dd0cf640c40ce71bece50e81b4eece8ee49add80da0fd2c7c386ef204a44b" namespace=k8s.io protocol=ttrpc version=3 Sep 4 15:46:03.333326 systemd[1]: Started cri-containerd-4fac510611fedcc732de96c753797546e36799bc80d490a1b93e22800a83c9b9.scope - libcontainer container 4fac510611fedcc732de96c753797546e36799bc80d490a1b93e22800a83c9b9. 
Sep 4 15:46:03.343352 systemd-resolved[1519]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 15:46:03.373482 containerd[1628]: time="2025-09-04T15:46:03.373454966Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-vbl6h,Uid:019b852b-6bcc-4d05-aa9a-2ce02951a1bd,Namespace:calico-system,Attempt:0,} returns sandbox id \"4fac510611fedcc732de96c753797546e36799bc80d490a1b93e22800a83c9b9\"" Sep 4 15:46:03.395768 systemd-networkd[1518]: cali76f33f23d43: Link UP Sep 4 15:46:03.397756 systemd-networkd[1518]: cali76f33f23d43: Gained carrier Sep 4 15:46:03.416401 containerd[1628]: 2025-09-04 15:46:03.211 [INFO][4420] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--5696449574--m4mxw-eth0 calico-apiserver-5696449574- calico-apiserver e7c6b0b2-eb4f-4eb1-bf74-9724b7e76c0c 853 0 2025-09-04 15:45:21 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5696449574 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-5696449574-m4mxw eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali76f33f23d43 [] [] }} ContainerID="bd8b46ad645e02e70cb3b41ea0fe233c956aff2bd52d190427d29cc59d9c329d" Namespace="calico-apiserver" Pod="calico-apiserver-5696449574-m4mxw" WorkloadEndpoint="localhost-k8s-calico--apiserver--5696449574--m4mxw-" Sep 4 15:46:03.416401 containerd[1628]: 2025-09-04 15:46:03.211 [INFO][4420] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bd8b46ad645e02e70cb3b41ea0fe233c956aff2bd52d190427d29cc59d9c329d" Namespace="calico-apiserver" Pod="calico-apiserver-5696449574-m4mxw" WorkloadEndpoint="localhost-k8s-calico--apiserver--5696449574--m4mxw-eth0" Sep 4 15:46:03.416401 containerd[1628]: 2025-09-04 15:46:03.247 [INFO][4442] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bd8b46ad645e02e70cb3b41ea0fe233c956aff2bd52d190427d29cc59d9c329d" HandleID="k8s-pod-network.bd8b46ad645e02e70cb3b41ea0fe233c956aff2bd52d190427d29cc59d9c329d" Workload="localhost-k8s-calico--apiserver--5696449574--m4mxw-eth0" Sep 4 15:46:03.416538 containerd[1628]: 2025-09-04 15:46:03.248 [INFO][4442] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="bd8b46ad645e02e70cb3b41ea0fe233c956aff2bd52d190427d29cc59d9c329d" HandleID="k8s-pod-network.bd8b46ad645e02e70cb3b41ea0fe233c956aff2bd52d190427d29cc59d9c329d" Workload="localhost-k8s-calico--apiserver--5696449574--m4mxw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f0a0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-5696449574-m4mxw", "timestamp":"2025-09-04 15:46:03.247927899 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 15:46:03.416538 containerd[1628]: 2025-09-04 15:46:03.248 [INFO][4442] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 15:46:03.416538 containerd[1628]: 2025-09-04 15:46:03.267 [INFO][4442] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 4 15:46:03.416538 containerd[1628]: 2025-09-04 15:46:03.267 [INFO][4442] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 15:46:03.416538 containerd[1628]: 2025-09-04 15:46:03.348 [INFO][4442] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.bd8b46ad645e02e70cb3b41ea0fe233c956aff2bd52d190427d29cc59d9c329d" host="localhost" Sep 4 15:46:03.416538 containerd[1628]: 2025-09-04 15:46:03.370 [INFO][4442] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 15:46:03.416538 containerd[1628]: 2025-09-04 15:46:03.373 [INFO][4442] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 15:46:03.416538 containerd[1628]: 2025-09-04 15:46:03.374 [INFO][4442] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 15:46:03.416538 containerd[1628]: 2025-09-04 15:46:03.376 [INFO][4442] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 15:46:03.416538 containerd[1628]: 2025-09-04 15:46:03.377 [INFO][4442] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.bd8b46ad645e02e70cb3b41ea0fe233c956aff2bd52d190427d29cc59d9c329d" host="localhost" Sep 4 15:46:03.416738 containerd[1628]: 2025-09-04 15:46:03.377 [INFO][4442] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.bd8b46ad645e02e70cb3b41ea0fe233c956aff2bd52d190427d29cc59d9c329d Sep 4 15:46:03.416738 containerd[1628]: 2025-09-04 15:46:03.382 [INFO][4442] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.bd8b46ad645e02e70cb3b41ea0fe233c956aff2bd52d190427d29cc59d9c329d" host="localhost" Sep 4 15:46:03.416738 containerd[1628]: 2025-09-04 15:46:03.389 [INFO][4442] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.bd8b46ad645e02e70cb3b41ea0fe233c956aff2bd52d190427d29cc59d9c329d" host="localhost" Sep 4 15:46:03.416738 containerd[1628]: 2025-09-04 15:46:03.389 [INFO][4442] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.bd8b46ad645e02e70cb3b41ea0fe233c956aff2bd52d190427d29cc59d9c329d" host="localhost" Sep 4 15:46:03.416738 containerd[1628]: 2025-09-04 15:46:03.389 [INFO][4442] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
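The three sandbox requests issued together at 15:46:03 (goldmane, calico-apiserver-5696449574-m4mxw, csi-node-driver-48dj6) serialize on the host-wide IPAM lock visible in these traces: the goldmane handler releases the lock at .267 and the apiserver handler acquires it in that same instant, then releases at .389 just as the csi handler acquires it. A toy sketch of that pattern only (a mutex guarding a shared block; not Calico's IPAM implementation, and the goroutine ordering here depends on scheduling):

// ipamlock.go - toy illustration of the host-wide lock pattern seen above:
// concurrent CNI ADDs serialize, so each gets the next free address in the
// shared block. Not Calico code; just the concurrency shape.
package main

import (
	"fmt"
	"net/netip"
	"sync"
)

type blockIPAM struct {
	mu   sync.Mutex // plays the role of the "host-wide IPAM lock"
	next netip.Addr // next unassigned address in the block
}

func (b *blockIPAM) assign(pod string) netip.Addr {
	b.mu.Lock() // "About to acquire host-wide IPAM lock."
	defer b.mu.Unlock()
	addr := b.next
	b.next = b.next.Next()
	fmt.Printf("assigned %s to %s\n", addr, pod)
	return addr // lock released on return, as in "Released host-wide IPAM lock."
}

func main() {
	ipam := &blockIPAM{next: netip.MustParseAddr("192.168.88.130")}
	pods := []string{
		"goldmane-54d579b49d-vbl6h",
		"calico-apiserver-5696449574-m4mxw",
		"csi-node-driver-48dj6",
	}
	var wg sync.WaitGroup
	for _, p := range pods {
		wg.Add(1)
		go func(p string) { defer wg.Done(); ipam.assign(p) }(p)
	}
	wg.Wait()
}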
Sep 4 15:46:03.416738 containerd[1628]: 2025-09-04 15:46:03.389 [INFO][4442] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="bd8b46ad645e02e70cb3b41ea0fe233c956aff2bd52d190427d29cc59d9c329d" HandleID="k8s-pod-network.bd8b46ad645e02e70cb3b41ea0fe233c956aff2bd52d190427d29cc59d9c329d" Workload="localhost-k8s-calico--apiserver--5696449574--m4mxw-eth0" Sep 4 15:46:03.416894 containerd[1628]: 2025-09-04 15:46:03.391 [INFO][4420] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bd8b46ad645e02e70cb3b41ea0fe233c956aff2bd52d190427d29cc59d9c329d" Namespace="calico-apiserver" Pod="calico-apiserver-5696449574-m4mxw" WorkloadEndpoint="localhost-k8s-calico--apiserver--5696449574--m4mxw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5696449574--m4mxw-eth0", GenerateName:"calico-apiserver-5696449574-", Namespace:"calico-apiserver", SelfLink:"", UID:"e7c6b0b2-eb4f-4eb1-bf74-9724b7e76c0c", ResourceVersion:"853", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 45, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5696449574", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-5696449574-m4mxw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali76f33f23d43", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:46:03.417956 containerd[1628]: 2025-09-04 15:46:03.392 [INFO][4420] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="bd8b46ad645e02e70cb3b41ea0fe233c956aff2bd52d190427d29cc59d9c329d" Namespace="calico-apiserver" Pod="calico-apiserver-5696449574-m4mxw" WorkloadEndpoint="localhost-k8s-calico--apiserver--5696449574--m4mxw-eth0" Sep 4 15:46:03.417956 containerd[1628]: 2025-09-04 15:46:03.392 [INFO][4420] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali76f33f23d43 ContainerID="bd8b46ad645e02e70cb3b41ea0fe233c956aff2bd52d190427d29cc59d9c329d" Namespace="calico-apiserver" Pod="calico-apiserver-5696449574-m4mxw" WorkloadEndpoint="localhost-k8s-calico--apiserver--5696449574--m4mxw-eth0" Sep 4 15:46:03.417956 containerd[1628]: 2025-09-04 15:46:03.400 [INFO][4420] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bd8b46ad645e02e70cb3b41ea0fe233c956aff2bd52d190427d29cc59d9c329d" Namespace="calico-apiserver" Pod="calico-apiserver-5696449574-m4mxw" WorkloadEndpoint="localhost-k8s-calico--apiserver--5696449574--m4mxw-eth0" Sep 4 15:46:03.418014 containerd[1628]: 2025-09-04 15:46:03.403 [INFO][4420] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="bd8b46ad645e02e70cb3b41ea0fe233c956aff2bd52d190427d29cc59d9c329d" Namespace="calico-apiserver" Pod="calico-apiserver-5696449574-m4mxw" WorkloadEndpoint="localhost-k8s-calico--apiserver--5696449574--m4mxw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5696449574--m4mxw-eth0", GenerateName:"calico-apiserver-5696449574-", Namespace:"calico-apiserver", SelfLink:"", UID:"e7c6b0b2-eb4f-4eb1-bf74-9724b7e76c0c", ResourceVersion:"853", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 45, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5696449574", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"bd8b46ad645e02e70cb3b41ea0fe233c956aff2bd52d190427d29cc59d9c329d", Pod:"calico-apiserver-5696449574-m4mxw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali76f33f23d43", MAC:"b2:c5:07:bf:b9:ea", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:46:03.418058 containerd[1628]: 2025-09-04 15:46:03.414 [INFO][4420] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="bd8b46ad645e02e70cb3b41ea0fe233c956aff2bd52d190427d29cc59d9c329d" Namespace="calico-apiserver" Pod="calico-apiserver-5696449574-m4mxw" WorkloadEndpoint="localhost-k8s-calico--apiserver--5696449574--m4mxw-eth0" Sep 4 15:46:03.448492 containerd[1628]: time="2025-09-04T15:46:03.448453519Z" level=info msg="connecting to shim bd8b46ad645e02e70cb3b41ea0fe233c956aff2bd52d190427d29cc59d9c329d" address="unix:///run/containerd/s/954d5e4560a559b9542031427ba98633fe1aa4f62be3212e2f509753697677eb" namespace=k8s.io protocol=ttrpc version=3 Sep 4 15:46:03.472420 systemd[1]: Started cri-containerd-bd8b46ad645e02e70cb3b41ea0fe233c956aff2bd52d190427d29cc59d9c329d.scope - libcontainer container bd8b46ad645e02e70cb3b41ea0fe233c956aff2bd52d190427d29cc59d9c329d. 
Sep 4 15:46:03.492184 systemd-resolved[1519]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 15:46:03.493519 systemd-networkd[1518]: calib85370cb540: Link UP Sep 4 15:46:03.493687 systemd-networkd[1518]: calib85370cb540: Gained carrier Sep 4 15:46:03.505918 containerd[1628]: 2025-09-04 15:46:03.231 [INFO][4408] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--48dj6-eth0 csi-node-driver- calico-system 6fe34a63-17a4-4039-80b1-075eaa32bbb7 689 0 2025-09-04 15:45:23 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-48dj6 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calib85370cb540 [] [] }} ContainerID="a05618bb495a0ece0a4f864e0d7c318400eb3a55ace8c3c668ff19b3c81bb39f" Namespace="calico-system" Pod="csi-node-driver-48dj6" WorkloadEndpoint="localhost-k8s-csi--node--driver--48dj6-" Sep 4 15:46:03.505918 containerd[1628]: 2025-09-04 15:46:03.231 [INFO][4408] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a05618bb495a0ece0a4f864e0d7c318400eb3a55ace8c3c668ff19b3c81bb39f" Namespace="calico-system" Pod="csi-node-driver-48dj6" WorkloadEndpoint="localhost-k8s-csi--node--driver--48dj6-eth0" Sep 4 15:46:03.505918 containerd[1628]: 2025-09-04 15:46:03.265 [INFO][4448] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a05618bb495a0ece0a4f864e0d7c318400eb3a55ace8c3c668ff19b3c81bb39f" HandleID="k8s-pod-network.a05618bb495a0ece0a4f864e0d7c318400eb3a55ace8c3c668ff19b3c81bb39f" Workload="localhost-k8s-csi--node--driver--48dj6-eth0" Sep 4 15:46:03.506064 containerd[1628]: 2025-09-04 15:46:03.265 [INFO][4448] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a05618bb495a0ece0a4f864e0d7c318400eb3a55ace8c3c668ff19b3c81bb39f" HandleID="k8s-pod-network.a05618bb495a0ece0a4f864e0d7c318400eb3a55ace8c3c668ff19b3c81bb39f" Workload="localhost-k8s-csi--node--driver--48dj6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5710), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-48dj6", "timestamp":"2025-09-04 15:46:03.265413573 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 15:46:03.506064 containerd[1628]: 2025-09-04 15:46:03.265 [INFO][4448] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 15:46:03.506064 containerd[1628]: 2025-09-04 15:46:03.389 [INFO][4448] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 4 15:46:03.506064 containerd[1628]: 2025-09-04 15:46:03.389 [INFO][4448] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 15:46:03.506064 containerd[1628]: 2025-09-04 15:46:03.449 [INFO][4448] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a05618bb495a0ece0a4f864e0d7c318400eb3a55ace8c3c668ff19b3c81bb39f" host="localhost" Sep 4 15:46:03.506064 containerd[1628]: 2025-09-04 15:46:03.472 [INFO][4448] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 15:46:03.506064 containerd[1628]: 2025-09-04 15:46:03.475 [INFO][4448] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 15:46:03.506064 containerd[1628]: 2025-09-04 15:46:03.478 [INFO][4448] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 15:46:03.506064 containerd[1628]: 2025-09-04 15:46:03.480 [INFO][4448] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 15:46:03.506064 containerd[1628]: 2025-09-04 15:46:03.480 [INFO][4448] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.a05618bb495a0ece0a4f864e0d7c318400eb3a55ace8c3c668ff19b3c81bb39f" host="localhost" Sep 4 15:46:03.507484 containerd[1628]: 2025-09-04 15:46:03.481 [INFO][4448] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a05618bb495a0ece0a4f864e0d7c318400eb3a55ace8c3c668ff19b3c81bb39f Sep 4 15:46:03.507484 containerd[1628]: 2025-09-04 15:46:03.484 [INFO][4448] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.a05618bb495a0ece0a4f864e0d7c318400eb3a55ace8c3c668ff19b3c81bb39f" host="localhost" Sep 4 15:46:03.507484 containerd[1628]: 2025-09-04 15:46:03.488 [INFO][4448] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.a05618bb495a0ece0a4f864e0d7c318400eb3a55ace8c3c668ff19b3c81bb39f" host="localhost" Sep 4 15:46:03.507484 containerd[1628]: 2025-09-04 15:46:03.488 [INFO][4448] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.a05618bb495a0ece0a4f864e0d7c318400eb3a55ace8c3c668ff19b3c81bb39f" host="localhost" Sep 4 15:46:03.507484 containerd[1628]: 2025-09-04 15:46:03.488 [INFO][4448] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
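Illustrative aside (not part of the log): the IPAM walk above, repeated almost verbatim for every pod in this section, always confirms affinity for the same block, 192.168.88.128/26, and claims the next free address from it. The Go sketch below shows only the block arithmetic using the standard library; it is unrelated to Calico's own allocator code.

// Block arithmetic for the /26 that Calico keeps assigning from in this log.
// Standard-library sketch only; Calico's real allocator lives in its ipam package.
package main

import (
	"fmt"
	"net/netip"
)

func main() {
	block := netip.MustParsePrefix("192.168.88.128/26")

	// Addresses the log shows being claimed from this block.
	for _, s := range []string{"192.168.88.131", "192.168.88.132", "192.168.88.133", "192.168.88.134"} {
		ip := netip.MustParseAddr(s)
		fmt.Printf("%s inside %s: %v\n", ip, block, block.Contains(ip))
	}

	// A /26 leaves 32-26 = 6 host bits, i.e. 64 addresses per block.
	fmt.Printf("addresses per block: %d\n", 1<<(32-block.Bits()))
}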
Sep 4 15:46:03.507484 containerd[1628]: 2025-09-04 15:46:03.488 [INFO][4448] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="a05618bb495a0ece0a4f864e0d7c318400eb3a55ace8c3c668ff19b3c81bb39f" HandleID="k8s-pod-network.a05618bb495a0ece0a4f864e0d7c318400eb3a55ace8c3c668ff19b3c81bb39f" Workload="localhost-k8s-csi--node--driver--48dj6-eth0" Sep 4 15:46:03.507585 containerd[1628]: 2025-09-04 15:46:03.490 [INFO][4408] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a05618bb495a0ece0a4f864e0d7c318400eb3a55ace8c3c668ff19b3c81bb39f" Namespace="calico-system" Pod="csi-node-driver-48dj6" WorkloadEndpoint="localhost-k8s-csi--node--driver--48dj6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--48dj6-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"6fe34a63-17a4-4039-80b1-075eaa32bbb7", ResourceVersion:"689", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 45, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-48dj6", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calib85370cb540", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:46:03.507884 containerd[1628]: 2025-09-04 15:46:03.490 [INFO][4408] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="a05618bb495a0ece0a4f864e0d7c318400eb3a55ace8c3c668ff19b3c81bb39f" Namespace="calico-system" Pod="csi-node-driver-48dj6" WorkloadEndpoint="localhost-k8s-csi--node--driver--48dj6-eth0" Sep 4 15:46:03.507884 containerd[1628]: 2025-09-04 15:46:03.490 [INFO][4408] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib85370cb540 ContainerID="a05618bb495a0ece0a4f864e0d7c318400eb3a55ace8c3c668ff19b3c81bb39f" Namespace="calico-system" Pod="csi-node-driver-48dj6" WorkloadEndpoint="localhost-k8s-csi--node--driver--48dj6-eth0" Sep 4 15:46:03.507884 containerd[1628]: 2025-09-04 15:46:03.494 [INFO][4408] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a05618bb495a0ece0a4f864e0d7c318400eb3a55ace8c3c668ff19b3c81bb39f" Namespace="calico-system" Pod="csi-node-driver-48dj6" WorkloadEndpoint="localhost-k8s-csi--node--driver--48dj6-eth0" Sep 4 15:46:03.508006 containerd[1628]: 2025-09-04 15:46:03.495 [INFO][4408] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a05618bb495a0ece0a4f864e0d7c318400eb3a55ace8c3c668ff19b3c81bb39f" Namespace="calico-system" Pod="csi-node-driver-48dj6" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--48dj6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--48dj6-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"6fe34a63-17a4-4039-80b1-075eaa32bbb7", ResourceVersion:"689", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 45, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"a05618bb495a0ece0a4f864e0d7c318400eb3a55ace8c3c668ff19b3c81bb39f", Pod:"csi-node-driver-48dj6", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calib85370cb540", MAC:"ea:ca:0b:8f:60:65", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:46:03.508053 containerd[1628]: 2025-09-04 15:46:03.502 [INFO][4408] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a05618bb495a0ece0a4f864e0d7c318400eb3a55ace8c3c668ff19b3c81bb39f" Namespace="calico-system" Pod="csi-node-driver-48dj6" WorkloadEndpoint="localhost-k8s-csi--node--driver--48dj6-eth0" Sep 4 15:46:03.524550 containerd[1628]: time="2025-09-04T15:46:03.524321357Z" level=info msg="connecting to shim a05618bb495a0ece0a4f864e0d7c318400eb3a55ace8c3c668ff19b3c81bb39f" address="unix:///run/containerd/s/6a3c62dba4fccb4f65950122ad166644f890f13585bb59078f7e4ed8e94ff152" namespace=k8s.io protocol=ttrpc version=3 Sep 4 15:46:03.548171 containerd[1628]: time="2025-09-04T15:46:03.548129898Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5696449574-m4mxw,Uid:e7c6b0b2-eb4f-4eb1-bf74-9724b7e76c0c,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"bd8b46ad645e02e70cb3b41ea0fe233c956aff2bd52d190427d29cc59d9c329d\"" Sep 4 15:46:03.548327 systemd[1]: Started cri-containerd-a05618bb495a0ece0a4f864e0d7c318400eb3a55ace8c3c668ff19b3c81bb39f.scope - libcontainer container a05618bb495a0ece0a4f864e0d7c318400eb3a55ace8c3c668ff19b3c81bb39f. 
Sep 4 15:46:03.556832 systemd-resolved[1519]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 15:46:03.570570 containerd[1628]: time="2025-09-04T15:46:03.570540141Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-48dj6,Uid:6fe34a63-17a4-4039-80b1-075eaa32bbb7,Namespace:calico-system,Attempt:0,} returns sandbox id \"a05618bb495a0ece0a4f864e0d7c318400eb3a55ace8c3c668ff19b3c81bb39f\"" Sep 4 15:46:04.151483 containerd[1628]: time="2025-09-04T15:46:04.151451037Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-wm96q,Uid:414a4df1-cdb6-4410-bde7-af50171f20b1,Namespace:kube-system,Attempt:0,}" Sep 4 15:46:04.237675 systemd-networkd[1518]: cali68f48d3b4e3: Link UP Sep 4 15:46:04.238448 systemd-networkd[1518]: cali68f48d3b4e3: Gained carrier Sep 4 15:46:04.253850 containerd[1628]: 2025-09-04 15:46:04.191 [INFO][4623] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--674b8bbfcf--wm96q-eth0 coredns-674b8bbfcf- kube-system 414a4df1-cdb6-4410-bde7-af50171f20b1 852 0 2025-09-04 15:45:14 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-674b8bbfcf-wm96q eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali68f48d3b4e3 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="26debd057babeedbc75a87d4fae83d7e6d007b86800402fb1fd8e7c49a85fcd7" Namespace="kube-system" Pod="coredns-674b8bbfcf-wm96q" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--wm96q-" Sep 4 15:46:04.253850 containerd[1628]: 2025-09-04 15:46:04.191 [INFO][4623] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="26debd057babeedbc75a87d4fae83d7e6d007b86800402fb1fd8e7c49a85fcd7" Namespace="kube-system" Pod="coredns-674b8bbfcf-wm96q" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--wm96q-eth0" Sep 4 15:46:04.253850 containerd[1628]: 2025-09-04 15:46:04.207 [INFO][4635] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="26debd057babeedbc75a87d4fae83d7e6d007b86800402fb1fd8e7c49a85fcd7" HandleID="k8s-pod-network.26debd057babeedbc75a87d4fae83d7e6d007b86800402fb1fd8e7c49a85fcd7" Workload="localhost-k8s-coredns--674b8bbfcf--wm96q-eth0" Sep 4 15:46:04.257725 containerd[1628]: 2025-09-04 15:46:04.207 [INFO][4635] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="26debd057babeedbc75a87d4fae83d7e6d007b86800402fb1fd8e7c49a85fcd7" HandleID="k8s-pod-network.26debd057babeedbc75a87d4fae83d7e6d007b86800402fb1fd8e7c49a85fcd7" Workload="localhost-k8s-coredns--674b8bbfcf--wm96q-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f610), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-674b8bbfcf-wm96q", "timestamp":"2025-09-04 15:46:04.20715832 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 15:46:04.257725 containerd[1628]: 2025-09-04 15:46:04.207 [INFO][4635] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 15:46:04.257725 containerd[1628]: 2025-09-04 15:46:04.207 [INFO][4635] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 4 15:46:04.257725 containerd[1628]: 2025-09-04 15:46:04.207 [INFO][4635] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 15:46:04.257725 containerd[1628]: 2025-09-04 15:46:04.211 [INFO][4635] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.26debd057babeedbc75a87d4fae83d7e6d007b86800402fb1fd8e7c49a85fcd7" host="localhost" Sep 4 15:46:04.257725 containerd[1628]: 2025-09-04 15:46:04.213 [INFO][4635] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 15:46:04.257725 containerd[1628]: 2025-09-04 15:46:04.216 [INFO][4635] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 15:46:04.257725 containerd[1628]: 2025-09-04 15:46:04.217 [INFO][4635] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 15:46:04.257725 containerd[1628]: 2025-09-04 15:46:04.218 [INFO][4635] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 15:46:04.257725 containerd[1628]: 2025-09-04 15:46:04.218 [INFO][4635] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.26debd057babeedbc75a87d4fae83d7e6d007b86800402fb1fd8e7c49a85fcd7" host="localhost" Sep 4 15:46:04.258171 containerd[1628]: 2025-09-04 15:46:04.219 [INFO][4635] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.26debd057babeedbc75a87d4fae83d7e6d007b86800402fb1fd8e7c49a85fcd7 Sep 4 15:46:04.258171 containerd[1628]: 2025-09-04 15:46:04.223 [INFO][4635] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.26debd057babeedbc75a87d4fae83d7e6d007b86800402fb1fd8e7c49a85fcd7" host="localhost" Sep 4 15:46:04.258171 containerd[1628]: 2025-09-04 15:46:04.233 [INFO][4635] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.26debd057babeedbc75a87d4fae83d7e6d007b86800402fb1fd8e7c49a85fcd7" host="localhost" Sep 4 15:46:04.258171 containerd[1628]: 2025-09-04 15:46:04.233 [INFO][4635] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.26debd057babeedbc75a87d4fae83d7e6d007b86800402fb1fd8e7c49a85fcd7" host="localhost" Sep 4 15:46:04.258171 containerd[1628]: 2025-09-04 15:46:04.233 [INFO][4635] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 4 15:46:04.258171 containerd[1628]: 2025-09-04 15:46:04.233 [INFO][4635] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="26debd057babeedbc75a87d4fae83d7e6d007b86800402fb1fd8e7c49a85fcd7" HandleID="k8s-pod-network.26debd057babeedbc75a87d4fae83d7e6d007b86800402fb1fd8e7c49a85fcd7" Workload="localhost-k8s-coredns--674b8bbfcf--wm96q-eth0" Sep 4 15:46:04.258292 containerd[1628]: 2025-09-04 15:46:04.235 [INFO][4623] cni-plugin/k8s.go 418: Populated endpoint ContainerID="26debd057babeedbc75a87d4fae83d7e6d007b86800402fb1fd8e7c49a85fcd7" Namespace="kube-system" Pod="coredns-674b8bbfcf-wm96q" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--wm96q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--wm96q-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"414a4df1-cdb6-4410-bde7-af50171f20b1", ResourceVersion:"852", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 45, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-674b8bbfcf-wm96q", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali68f48d3b4e3", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:46:04.258344 containerd[1628]: 2025-09-04 15:46:04.235 [INFO][4623] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="26debd057babeedbc75a87d4fae83d7e6d007b86800402fb1fd8e7c49a85fcd7" Namespace="kube-system" Pod="coredns-674b8bbfcf-wm96q" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--wm96q-eth0" Sep 4 15:46:04.258344 containerd[1628]: 2025-09-04 15:46:04.235 [INFO][4623] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali68f48d3b4e3 ContainerID="26debd057babeedbc75a87d4fae83d7e6d007b86800402fb1fd8e7c49a85fcd7" Namespace="kube-system" Pod="coredns-674b8bbfcf-wm96q" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--wm96q-eth0" Sep 4 15:46:04.258344 containerd[1628]: 2025-09-04 15:46:04.238 [INFO][4623] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="26debd057babeedbc75a87d4fae83d7e6d007b86800402fb1fd8e7c49a85fcd7" Namespace="kube-system" Pod="coredns-674b8bbfcf-wm96q" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--wm96q-eth0" Sep 4 15:46:04.258396 
containerd[1628]: 2025-09-04 15:46:04.239 [INFO][4623] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="26debd057babeedbc75a87d4fae83d7e6d007b86800402fb1fd8e7c49a85fcd7" Namespace="kube-system" Pod="coredns-674b8bbfcf-wm96q" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--wm96q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--wm96q-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"414a4df1-cdb6-4410-bde7-af50171f20b1", ResourceVersion:"852", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 45, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"26debd057babeedbc75a87d4fae83d7e6d007b86800402fb1fd8e7c49a85fcd7", Pod:"coredns-674b8bbfcf-wm96q", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali68f48d3b4e3", MAC:"16:3a:af:61:89:e4", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:46:04.258396 containerd[1628]: 2025-09-04 15:46:04.252 [INFO][4623] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="26debd057babeedbc75a87d4fae83d7e6d007b86800402fb1fd8e7c49a85fcd7" Namespace="kube-system" Pod="coredns-674b8bbfcf-wm96q" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--wm96q-eth0" Sep 4 15:46:04.298340 containerd[1628]: time="2025-09-04T15:46:04.298271984Z" level=info msg="connecting to shim 26debd057babeedbc75a87d4fae83d7e6d007b86800402fb1fd8e7c49a85fcd7" address="unix:///run/containerd/s/11a9aace9d91795e4be740404d7185d9896e19abc9ea509851d4d95eb796d195" namespace=k8s.io protocol=ttrpc version=3 Sep 4 15:46:04.317327 systemd[1]: Started cri-containerd-26debd057babeedbc75a87d4fae83d7e6d007b86800402fb1fd8e7c49a85fcd7.scope - libcontainer container 26debd057babeedbc75a87d4fae83d7e6d007b86800402fb1fd8e7c49a85fcd7. 
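Illustrative aside (not part of the log): the coredns endpoint dumps above list their ports as Go hex literals, so Port 0x35 is 53 (the dns and dns-tcp ports) and 0x23c1 is 9153 (the coredns metrics port). A trivial Go check of that reading:

// Confirms the hex-to-decimal reading of Port:0x35 and Port:0x23c1
// in the coredns WorkloadEndpoint dumps above.
package main

import "fmt"

func main() {
	fmt.Println(0x35, 0x23c1) // prints: 53 9153
}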
Sep 4 15:46:04.326950 systemd-resolved[1519]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 15:46:04.356844 containerd[1628]: time="2025-09-04T15:46:04.356753690Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-wm96q,Uid:414a4df1-cdb6-4410-bde7-af50171f20b1,Namespace:kube-system,Attempt:0,} returns sandbox id \"26debd057babeedbc75a87d4fae83d7e6d007b86800402fb1fd8e7c49a85fcd7\"" Sep 4 15:46:04.452367 systemd-networkd[1518]: cali76f33f23d43: Gained IPv6LL Sep 4 15:46:04.489693 containerd[1628]: time="2025-09-04T15:46:04.489664154Z" level=info msg="CreateContainer within sandbox \"26debd057babeedbc75a87d4fae83d7e6d007b86800402fb1fd8e7c49a85fcd7\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 4 15:46:04.646557 containerd[1628]: time="2025-09-04T15:46:04.646507626Z" level=info msg="Container df5573536ab6834c5ddc3d35a27da538d91beb53dc307573f95f6685d6cfc1f2: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:46:04.679247 containerd[1628]: time="2025-09-04T15:46:04.679181279Z" level=info msg="CreateContainer within sandbox \"26debd057babeedbc75a87d4fae83d7e6d007b86800402fb1fd8e7c49a85fcd7\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"df5573536ab6834c5ddc3d35a27da538d91beb53dc307573f95f6685d6cfc1f2\"" Sep 4 15:46:04.680727 containerd[1628]: time="2025-09-04T15:46:04.679721095Z" level=info msg="StartContainer for \"df5573536ab6834c5ddc3d35a27da538d91beb53dc307573f95f6685d6cfc1f2\"" Sep 4 15:46:04.681561 containerd[1628]: time="2025-09-04T15:46:04.681502009Z" level=info msg="connecting to shim df5573536ab6834c5ddc3d35a27da538d91beb53dc307573f95f6685d6cfc1f2" address="unix:///run/containerd/s/11a9aace9d91795e4be740404d7185d9896e19abc9ea509851d4d95eb796d195" protocol=ttrpc version=3 Sep 4 15:46:04.698343 systemd[1]: Started cri-containerd-df5573536ab6834c5ddc3d35a27da538d91beb53dc307573f95f6685d6cfc1f2.scope - libcontainer container df5573536ab6834c5ddc3d35a27da538d91beb53dc307573f95f6685d6cfc1f2. Sep 4 15:46:04.772496 systemd-networkd[1518]: calie2633513c49: Gained IPv6LL Sep 4 15:46:04.830698 containerd[1628]: time="2025-09-04T15:46:04.830665232Z" level=info msg="StartContainer for \"df5573536ab6834c5ddc3d35a27da538d91beb53dc307573f95f6685d6cfc1f2\" returns successfully" Sep 4 15:46:04.901340 systemd-networkd[1518]: calib85370cb540: Gained IPv6LL Sep 4 15:46:05.152204 containerd[1628]: time="2025-09-04T15:46:05.151935012Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5696449574-g89ks,Uid:10bcd7e2-385d-4877-9718-a3ecf232751c,Namespace:calico-apiserver,Attempt:0,}" Sep 4 15:46:05.156365 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2115981004.mount: Deactivated successfully. 
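Illustrative aside (not part of the log): every sandbox and container started in this section also shows up as a transient systemd unit named cri-containerd-<64-hex-digit container ID>.scope, for example the coredns container started just above. The Go sketch below only reproduces that naming convention as observed in the log; it queries nothing on the host.

// Pairs a 64-hex-digit container ID with the "cri-containerd-<id>.scope"
// unit name seen in the "Started cri-containerd-....scope" entries above.
package main

import (
	"fmt"
	"regexp"
)

var idRE = regexp.MustCompile(`^[0-9a-f]{64}$`)

func scopeUnit(containerID string) (string, error) {
	if !idRE.MatchString(containerID) {
		return "", fmt.Errorf("not a 64-char hex container ID: %q", containerID)
	}
	return "cri-containerd-" + containerID + ".scope", nil
}

func main() {
	// ID taken from the coredns StartContainer entries above.
	unit, err := scopeUnit("df5573536ab6834c5ddc3d35a27da538d91beb53dc307573f95f6685d6cfc1f2")
	if err != nil {
		panic(err)
	}
	fmt.Println(unit)
}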
Sep 4 15:46:05.243309 systemd-networkd[1518]: calif04fe80336b: Link UP Sep 4 15:46:05.243707 systemd-networkd[1518]: calif04fe80336b: Gained carrier Sep 4 15:46:05.254897 containerd[1628]: 2025-09-04 15:46:05.198 [INFO][4725] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--5696449574--g89ks-eth0 calico-apiserver-5696449574- calico-apiserver 10bcd7e2-385d-4877-9718-a3ecf232751c 855 0 2025-09-04 15:45:21 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5696449574 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-5696449574-g89ks eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calif04fe80336b [] [] }} ContainerID="515523c132f5fbf91dcd82a4f5160f0d3c948bc61d57fac8dc0d16464b5ee0d5" Namespace="calico-apiserver" Pod="calico-apiserver-5696449574-g89ks" WorkloadEndpoint="localhost-k8s-calico--apiserver--5696449574--g89ks-" Sep 4 15:46:05.254897 containerd[1628]: 2025-09-04 15:46:05.198 [INFO][4725] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="515523c132f5fbf91dcd82a4f5160f0d3c948bc61d57fac8dc0d16464b5ee0d5" Namespace="calico-apiserver" Pod="calico-apiserver-5696449574-g89ks" WorkloadEndpoint="localhost-k8s-calico--apiserver--5696449574--g89ks-eth0" Sep 4 15:46:05.254897 containerd[1628]: 2025-09-04 15:46:05.214 [INFO][4736] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="515523c132f5fbf91dcd82a4f5160f0d3c948bc61d57fac8dc0d16464b5ee0d5" HandleID="k8s-pod-network.515523c132f5fbf91dcd82a4f5160f0d3c948bc61d57fac8dc0d16464b5ee0d5" Workload="localhost-k8s-calico--apiserver--5696449574--g89ks-eth0" Sep 4 15:46:05.254897 containerd[1628]: 2025-09-04 15:46:05.214 [INFO][4736] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="515523c132f5fbf91dcd82a4f5160f0d3c948bc61d57fac8dc0d16464b5ee0d5" HandleID="k8s-pod-network.515523c132f5fbf91dcd82a4f5160f0d3c948bc61d57fac8dc0d16464b5ee0d5" Workload="localhost-k8s-calico--apiserver--5696449574--g89ks-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f230), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-5696449574-g89ks", "timestamp":"2025-09-04 15:46:05.214861065 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 15:46:05.254897 containerd[1628]: 2025-09-04 15:46:05.214 [INFO][4736] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 15:46:05.254897 containerd[1628]: 2025-09-04 15:46:05.215 [INFO][4736] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 4 15:46:05.254897 containerd[1628]: 2025-09-04 15:46:05.215 [INFO][4736] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 15:46:05.254897 containerd[1628]: 2025-09-04 15:46:05.221 [INFO][4736] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.515523c132f5fbf91dcd82a4f5160f0d3c948bc61d57fac8dc0d16464b5ee0d5" host="localhost" Sep 4 15:46:05.254897 containerd[1628]: 2025-09-04 15:46:05.225 [INFO][4736] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 15:46:05.254897 containerd[1628]: 2025-09-04 15:46:05.228 [INFO][4736] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 15:46:05.254897 containerd[1628]: 2025-09-04 15:46:05.230 [INFO][4736] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 15:46:05.254897 containerd[1628]: 2025-09-04 15:46:05.231 [INFO][4736] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 15:46:05.254897 containerd[1628]: 2025-09-04 15:46:05.231 [INFO][4736] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.515523c132f5fbf91dcd82a4f5160f0d3c948bc61d57fac8dc0d16464b5ee0d5" host="localhost" Sep 4 15:46:05.254897 containerd[1628]: 2025-09-04 15:46:05.232 [INFO][4736] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.515523c132f5fbf91dcd82a4f5160f0d3c948bc61d57fac8dc0d16464b5ee0d5 Sep 4 15:46:05.254897 containerd[1628]: 2025-09-04 15:46:05.235 [INFO][4736] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.515523c132f5fbf91dcd82a4f5160f0d3c948bc61d57fac8dc0d16464b5ee0d5" host="localhost" Sep 4 15:46:05.254897 containerd[1628]: 2025-09-04 15:46:05.238 [INFO][4736] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.515523c132f5fbf91dcd82a4f5160f0d3c948bc61d57fac8dc0d16464b5ee0d5" host="localhost" Sep 4 15:46:05.254897 containerd[1628]: 2025-09-04 15:46:05.238 [INFO][4736] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.515523c132f5fbf91dcd82a4f5160f0d3c948bc61d57fac8dc0d16464b5ee0d5" host="localhost" Sep 4 15:46:05.254897 containerd[1628]: 2025-09-04 15:46:05.238 [INFO][4736] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 4 15:46:05.254897 containerd[1628]: 2025-09-04 15:46:05.238 [INFO][4736] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="515523c132f5fbf91dcd82a4f5160f0d3c948bc61d57fac8dc0d16464b5ee0d5" HandleID="k8s-pod-network.515523c132f5fbf91dcd82a4f5160f0d3c948bc61d57fac8dc0d16464b5ee0d5" Workload="localhost-k8s-calico--apiserver--5696449574--g89ks-eth0" Sep 4 15:46:05.257679 containerd[1628]: 2025-09-04 15:46:05.240 [INFO][4725] cni-plugin/k8s.go 418: Populated endpoint ContainerID="515523c132f5fbf91dcd82a4f5160f0d3c948bc61d57fac8dc0d16464b5ee0d5" Namespace="calico-apiserver" Pod="calico-apiserver-5696449574-g89ks" WorkloadEndpoint="localhost-k8s-calico--apiserver--5696449574--g89ks-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5696449574--g89ks-eth0", GenerateName:"calico-apiserver-5696449574-", Namespace:"calico-apiserver", SelfLink:"", UID:"10bcd7e2-385d-4877-9718-a3ecf232751c", ResourceVersion:"855", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 45, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5696449574", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-5696449574-g89ks", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif04fe80336b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:46:05.257679 containerd[1628]: 2025-09-04 15:46:05.240 [INFO][4725] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="515523c132f5fbf91dcd82a4f5160f0d3c948bc61d57fac8dc0d16464b5ee0d5" Namespace="calico-apiserver" Pod="calico-apiserver-5696449574-g89ks" WorkloadEndpoint="localhost-k8s-calico--apiserver--5696449574--g89ks-eth0" Sep 4 15:46:05.257679 containerd[1628]: 2025-09-04 15:46:05.241 [INFO][4725] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif04fe80336b ContainerID="515523c132f5fbf91dcd82a4f5160f0d3c948bc61d57fac8dc0d16464b5ee0d5" Namespace="calico-apiserver" Pod="calico-apiserver-5696449574-g89ks" WorkloadEndpoint="localhost-k8s-calico--apiserver--5696449574--g89ks-eth0" Sep 4 15:46:05.257679 containerd[1628]: 2025-09-04 15:46:05.243 [INFO][4725] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="515523c132f5fbf91dcd82a4f5160f0d3c948bc61d57fac8dc0d16464b5ee0d5" Namespace="calico-apiserver" Pod="calico-apiserver-5696449574-g89ks" WorkloadEndpoint="localhost-k8s-calico--apiserver--5696449574--g89ks-eth0" Sep 4 15:46:05.257679 containerd[1628]: 2025-09-04 15:46:05.244 [INFO][4725] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="515523c132f5fbf91dcd82a4f5160f0d3c948bc61d57fac8dc0d16464b5ee0d5" Namespace="calico-apiserver" Pod="calico-apiserver-5696449574-g89ks" WorkloadEndpoint="localhost-k8s-calico--apiserver--5696449574--g89ks-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5696449574--g89ks-eth0", GenerateName:"calico-apiserver-5696449574-", Namespace:"calico-apiserver", SelfLink:"", UID:"10bcd7e2-385d-4877-9718-a3ecf232751c", ResourceVersion:"855", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 45, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5696449574", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"515523c132f5fbf91dcd82a4f5160f0d3c948bc61d57fac8dc0d16464b5ee0d5", Pod:"calico-apiserver-5696449574-g89ks", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif04fe80336b", MAC:"f6:b7:c7:65:b6:0a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:46:05.257679 containerd[1628]: 2025-09-04 15:46:05.250 [INFO][4725] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="515523c132f5fbf91dcd82a4f5160f0d3c948bc61d57fac8dc0d16464b5ee0d5" Namespace="calico-apiserver" Pod="calico-apiserver-5696449574-g89ks" WorkloadEndpoint="localhost-k8s-calico--apiserver--5696449574--g89ks-eth0" Sep 4 15:46:05.272358 containerd[1628]: time="2025-09-04T15:46:05.272333971Z" level=info msg="connecting to shim 515523c132f5fbf91dcd82a4f5160f0d3c948bc61d57fac8dc0d16464b5ee0d5" address="unix:///run/containerd/s/7e59ff705fa130e5b9b69e55c6d9c7b7ec49e867bfd2a60b8e96f55e33ea2c25" namespace=k8s.io protocol=ttrpc version=3 Sep 4 15:46:05.284376 systemd-networkd[1518]: cali68f48d3b4e3: Gained IPv6LL Sep 4 15:46:05.297397 systemd[1]: Started cri-containerd-515523c132f5fbf91dcd82a4f5160f0d3c948bc61d57fac8dc0d16464b5ee0d5.scope - libcontainer container 515523c132f5fbf91dcd82a4f5160f0d3c948bc61d57fac8dc0d16464b5ee0d5. 
Sep 4 15:46:05.307113 systemd-resolved[1519]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 15:46:05.335574 containerd[1628]: time="2025-09-04T15:46:05.335551260Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5696449574-g89ks,Uid:10bcd7e2-385d-4877-9718-a3ecf232751c,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"515523c132f5fbf91dcd82a4f5160f0d3c948bc61d57fac8dc0d16464b5ee0d5\"" Sep 4 15:46:05.471834 kubelet[2941]: I0904 15:46:05.457002 2941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-wm96q" podStartSLOduration=51.422833197 podStartE2EDuration="51.422833197s" podCreationTimestamp="2025-09-04 15:45:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 15:46:05.422610088 +0000 UTC m=+58.452301816" watchObservedRunningTime="2025-09-04 15:46:05.422833197 +0000 UTC m=+58.452524925" Sep 4 15:46:06.151594 containerd[1628]: time="2025-09-04T15:46:06.151529362Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-rqkgl,Uid:4021cb54-f775-42c6-8c2d-347373397b4c,Namespace:kube-system,Attempt:0,}" Sep 4 15:46:06.151889 containerd[1628]: time="2025-09-04T15:46:06.151755503Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5bc74776b8-jcx9h,Uid:f83fd5a3-b207-4864-a1a1-a47b4a06a29b,Namespace:calico-system,Attempt:0,}" Sep 4 15:46:06.240783 systemd-networkd[1518]: cali349f858f93b: Link UP Sep 4 15:46:06.241533 systemd-networkd[1518]: cali349f858f93b: Gained carrier Sep 4 15:46:06.255029 containerd[1628]: 2025-09-04 15:46:06.190 [INFO][4812] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--5bc74776b8--jcx9h-eth0 calico-kube-controllers-5bc74776b8- calico-system f83fd5a3-b207-4864-a1a1-a47b4a06a29b 857 0 2025-09-04 15:45:24 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5bc74776b8 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-5bc74776b8-jcx9h eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali349f858f93b [] [] }} ContainerID="30db92d8bb15f23e879f02f848cb516606b512f844e7a49a15cc336f63e10d73" Namespace="calico-system" Pod="calico-kube-controllers-5bc74776b8-jcx9h" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5bc74776b8--jcx9h-" Sep 4 15:46:06.255029 containerd[1628]: 2025-09-04 15:46:06.191 [INFO][4812] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="30db92d8bb15f23e879f02f848cb516606b512f844e7a49a15cc336f63e10d73" Namespace="calico-system" Pod="calico-kube-controllers-5bc74776b8-jcx9h" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5bc74776b8--jcx9h-eth0" Sep 4 15:46:06.255029 containerd[1628]: 2025-09-04 15:46:06.214 [INFO][4836] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="30db92d8bb15f23e879f02f848cb516606b512f844e7a49a15cc336f63e10d73" HandleID="k8s-pod-network.30db92d8bb15f23e879f02f848cb516606b512f844e7a49a15cc336f63e10d73" Workload="localhost-k8s-calico--kube--controllers--5bc74776b8--jcx9h-eth0" Sep 4 15:46:06.255029 containerd[1628]: 2025-09-04 15:46:06.214 
[INFO][4836] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="30db92d8bb15f23e879f02f848cb516606b512f844e7a49a15cc336f63e10d73" HandleID="k8s-pod-network.30db92d8bb15f23e879f02f848cb516606b512f844e7a49a15cc336f63e10d73" Workload="localhost-k8s-calico--kube--controllers--5bc74776b8--jcx9h-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f730), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-5bc74776b8-jcx9h", "timestamp":"2025-09-04 15:46:06.214089167 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 15:46:06.255029 containerd[1628]: 2025-09-04 15:46:06.214 [INFO][4836] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 15:46:06.255029 containerd[1628]: 2025-09-04 15:46:06.214 [INFO][4836] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 4 15:46:06.255029 containerd[1628]: 2025-09-04 15:46:06.214 [INFO][4836] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 15:46:06.255029 containerd[1628]: 2025-09-04 15:46:06.219 [INFO][4836] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.30db92d8bb15f23e879f02f848cb516606b512f844e7a49a15cc336f63e10d73" host="localhost" Sep 4 15:46:06.255029 containerd[1628]: 2025-09-04 15:46:06.222 [INFO][4836] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 15:46:06.255029 containerd[1628]: 2025-09-04 15:46:06.225 [INFO][4836] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 15:46:06.255029 containerd[1628]: 2025-09-04 15:46:06.225 [INFO][4836] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 15:46:06.255029 containerd[1628]: 2025-09-04 15:46:06.227 [INFO][4836] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 15:46:06.255029 containerd[1628]: 2025-09-04 15:46:06.227 [INFO][4836] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.30db92d8bb15f23e879f02f848cb516606b512f844e7a49a15cc336f63e10d73" host="localhost" Sep 4 15:46:06.255029 containerd[1628]: 2025-09-04 15:46:06.228 [INFO][4836] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.30db92d8bb15f23e879f02f848cb516606b512f844e7a49a15cc336f63e10d73 Sep 4 15:46:06.255029 containerd[1628]: 2025-09-04 15:46:06.229 [INFO][4836] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.30db92d8bb15f23e879f02f848cb516606b512f844e7a49a15cc336f63e10d73" host="localhost" Sep 4 15:46:06.255029 containerd[1628]: 2025-09-04 15:46:06.233 [INFO][4836] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.30db92d8bb15f23e879f02f848cb516606b512f844e7a49a15cc336f63e10d73" host="localhost" Sep 4 15:46:06.255029 containerd[1628]: 2025-09-04 15:46:06.233 [INFO][4836] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.30db92d8bb15f23e879f02f848cb516606b512f844e7a49a15cc336f63e10d73" host="localhost" Sep 4 15:46:06.255029 containerd[1628]: 2025-09-04 15:46:06.233 [INFO][4836] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 4 15:46:06.255029 containerd[1628]: 2025-09-04 15:46:06.233 [INFO][4836] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="30db92d8bb15f23e879f02f848cb516606b512f844e7a49a15cc336f63e10d73" HandleID="k8s-pod-network.30db92d8bb15f23e879f02f848cb516606b512f844e7a49a15cc336f63e10d73" Workload="localhost-k8s-calico--kube--controllers--5bc74776b8--jcx9h-eth0" Sep 4 15:46:06.256138 containerd[1628]: 2025-09-04 15:46:06.236 [INFO][4812] cni-plugin/k8s.go 418: Populated endpoint ContainerID="30db92d8bb15f23e879f02f848cb516606b512f844e7a49a15cc336f63e10d73" Namespace="calico-system" Pod="calico-kube-controllers-5bc74776b8-jcx9h" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5bc74776b8--jcx9h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--5bc74776b8--jcx9h-eth0", GenerateName:"calico-kube-controllers-5bc74776b8-", Namespace:"calico-system", SelfLink:"", UID:"f83fd5a3-b207-4864-a1a1-a47b4a06a29b", ResourceVersion:"857", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 45, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5bc74776b8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-5bc74776b8-jcx9h", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali349f858f93b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:46:06.256138 containerd[1628]: 2025-09-04 15:46:06.237 [INFO][4812] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="30db92d8bb15f23e879f02f848cb516606b512f844e7a49a15cc336f63e10d73" Namespace="calico-system" Pod="calico-kube-controllers-5bc74776b8-jcx9h" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5bc74776b8--jcx9h-eth0" Sep 4 15:46:06.256138 containerd[1628]: 2025-09-04 15:46:06.237 [INFO][4812] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali349f858f93b ContainerID="30db92d8bb15f23e879f02f848cb516606b512f844e7a49a15cc336f63e10d73" Namespace="calico-system" Pod="calico-kube-controllers-5bc74776b8-jcx9h" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5bc74776b8--jcx9h-eth0" Sep 4 15:46:06.256138 containerd[1628]: 2025-09-04 15:46:06.241 [INFO][4812] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="30db92d8bb15f23e879f02f848cb516606b512f844e7a49a15cc336f63e10d73" Namespace="calico-system" Pod="calico-kube-controllers-5bc74776b8-jcx9h" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5bc74776b8--jcx9h-eth0" Sep 4 15:46:06.256138 containerd[1628]: 2025-09-04 15:46:06.241 [INFO][4812] cni-plugin/k8s.go 446: Added Mac, 
interface name, and active container ID to endpoint ContainerID="30db92d8bb15f23e879f02f848cb516606b512f844e7a49a15cc336f63e10d73" Namespace="calico-system" Pod="calico-kube-controllers-5bc74776b8-jcx9h" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5bc74776b8--jcx9h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--5bc74776b8--jcx9h-eth0", GenerateName:"calico-kube-controllers-5bc74776b8-", Namespace:"calico-system", SelfLink:"", UID:"f83fd5a3-b207-4864-a1a1-a47b4a06a29b", ResourceVersion:"857", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 45, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5bc74776b8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"30db92d8bb15f23e879f02f848cb516606b512f844e7a49a15cc336f63e10d73", Pod:"calico-kube-controllers-5bc74776b8-jcx9h", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali349f858f93b", MAC:"c2:13:9f:16:d6:97", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:46:06.256138 containerd[1628]: 2025-09-04 15:46:06.252 [INFO][4812] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="30db92d8bb15f23e879f02f848cb516606b512f844e7a49a15cc336f63e10d73" Namespace="calico-system" Pod="calico-kube-controllers-5bc74776b8-jcx9h" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5bc74776b8--jcx9h-eth0" Sep 4 15:46:06.272865 containerd[1628]: time="2025-09-04T15:46:06.272827074Z" level=info msg="connecting to shim 30db92d8bb15f23e879f02f848cb516606b512f844e7a49a15cc336f63e10d73" address="unix:///run/containerd/s/3afa6fcea3dd375cbd401f1866576371ccd97c8cf9774b6abae537109984d6f1" namespace=k8s.io protocol=ttrpc version=3 Sep 4 15:46:06.293329 systemd[1]: Started cri-containerd-30db92d8bb15f23e879f02f848cb516606b512f844e7a49a15cc336f63e10d73.scope - libcontainer container 30db92d8bb15f23e879f02f848cb516606b512f844e7a49a15cc336f63e10d73. 
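Illustrative aside (not part of the log): the "connecting to shim ... namespace=k8s.io protocol=ttrpc" and "Started cri-containerd-....scope" entries describe containers managed through containerd's CRI plugin in the k8s.io namespace. Below is a hedged Go-client sketch that loads one of the logged containers and prints its task status; the import paths assume the containerd 1.x client module (they differ in containerd 2.x), and the container ID is simply copied from the log.

// Minimal containerd Go-client sketch (containerd 1.x import paths assumed).
// It loads one of the CRI-managed containers whose ID appears in the log
// and prints its task status.
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// CRI-managed containers live in the "k8s.io" namespace, as the
	// "connecting to shim ... namespace=k8s.io" entries show.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	// Sandbox ID copied from the calico-kube-controllers entries above.
	const id = "30db92d8bb15f23e879f02f848cb516606b512f844e7a49a15cc336f63e10d73"

	c, err := client.LoadContainer(ctx, id)
	if err != nil {
		log.Fatal(err)
	}
	task, err := c.Task(ctx, nil)
	if err != nil {
		log.Fatal(err)
	}
	status, err := task.Status(ctx)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("%s: %s\n", id[:12], status.Status)
}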
Sep 4 15:46:06.305674 systemd-resolved[1519]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 15:46:06.348261 containerd[1628]: time="2025-09-04T15:46:06.348071546Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5bc74776b8-jcx9h,Uid:f83fd5a3-b207-4864-a1a1-a47b4a06a29b,Namespace:calico-system,Attempt:0,} returns sandbox id \"30db92d8bb15f23e879f02f848cb516606b512f844e7a49a15cc336f63e10d73\"" Sep 4 15:46:06.349384 systemd-networkd[1518]: cali8868c1a965f: Link UP Sep 4 15:46:06.349953 systemd-networkd[1518]: cali8868c1a965f: Gained carrier Sep 4 15:46:06.360836 containerd[1628]: 2025-09-04 15:46:06.190 [INFO][4808] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--674b8bbfcf--rqkgl-eth0 coredns-674b8bbfcf- kube-system 4021cb54-f775-42c6-8c2d-347373397b4c 851 0 2025-09-04 15:45:14 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-674b8bbfcf-rqkgl eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali8868c1a965f [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="92acfb0b9bf846313081c37b1649430aaf402db6427968d2b0a777fd4c484a5e" Namespace="kube-system" Pod="coredns-674b8bbfcf-rqkgl" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--rqkgl-" Sep 4 15:46:06.360836 containerd[1628]: 2025-09-04 15:46:06.190 [INFO][4808] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="92acfb0b9bf846313081c37b1649430aaf402db6427968d2b0a777fd4c484a5e" Namespace="kube-system" Pod="coredns-674b8bbfcf-rqkgl" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--rqkgl-eth0" Sep 4 15:46:06.360836 containerd[1628]: 2025-09-04 15:46:06.220 [INFO][4834] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="92acfb0b9bf846313081c37b1649430aaf402db6427968d2b0a777fd4c484a5e" HandleID="k8s-pod-network.92acfb0b9bf846313081c37b1649430aaf402db6427968d2b0a777fd4c484a5e" Workload="localhost-k8s-coredns--674b8bbfcf--rqkgl-eth0" Sep 4 15:46:06.360836 containerd[1628]: 2025-09-04 15:46:06.221 [INFO][4834] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="92acfb0b9bf846313081c37b1649430aaf402db6427968d2b0a777fd4c484a5e" HandleID="k8s-pod-network.92acfb0b9bf846313081c37b1649430aaf402db6427968d2b0a777fd4c484a5e" Workload="localhost-k8s-coredns--674b8bbfcf--rqkgl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d4ff0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-674b8bbfcf-rqkgl", "timestamp":"2025-09-04 15:46:06.220432897 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 15:46:06.360836 containerd[1628]: 2025-09-04 15:46:06.221 [INFO][4834] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 15:46:06.360836 containerd[1628]: 2025-09-04 15:46:06.233 [INFO][4834] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 4 15:46:06.360836 containerd[1628]: 2025-09-04 15:46:06.233 [INFO][4834] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 15:46:06.360836 containerd[1628]: 2025-09-04 15:46:06.319 [INFO][4834] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.92acfb0b9bf846313081c37b1649430aaf402db6427968d2b0a777fd4c484a5e" host="localhost" Sep 4 15:46:06.360836 containerd[1628]: 2025-09-04 15:46:06.323 [INFO][4834] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 15:46:06.360836 containerd[1628]: 2025-09-04 15:46:06.327 [INFO][4834] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 15:46:06.360836 containerd[1628]: 2025-09-04 15:46:06.330 [INFO][4834] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 15:46:06.360836 containerd[1628]: 2025-09-04 15:46:06.333 [INFO][4834] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 15:46:06.360836 containerd[1628]: 2025-09-04 15:46:06.333 [INFO][4834] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.92acfb0b9bf846313081c37b1649430aaf402db6427968d2b0a777fd4c484a5e" host="localhost" Sep 4 15:46:06.360836 containerd[1628]: 2025-09-04 15:46:06.334 [INFO][4834] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.92acfb0b9bf846313081c37b1649430aaf402db6427968d2b0a777fd4c484a5e Sep 4 15:46:06.360836 containerd[1628]: 2025-09-04 15:46:06.337 [INFO][4834] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.92acfb0b9bf846313081c37b1649430aaf402db6427968d2b0a777fd4c484a5e" host="localhost" Sep 4 15:46:06.360836 containerd[1628]: 2025-09-04 15:46:06.341 [INFO][4834] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 handle="k8s-pod-network.92acfb0b9bf846313081c37b1649430aaf402db6427968d2b0a777fd4c484a5e" host="localhost" Sep 4 15:46:06.360836 containerd[1628]: 2025-09-04 15:46:06.342 [INFO][4834] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.92acfb0b9bf846313081c37b1649430aaf402db6427968d2b0a777fd4c484a5e" host="localhost" Sep 4 15:46:06.360836 containerd[1628]: 2025-09-04 15:46:06.342 [INFO][4834] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 4 15:46:06.360836 containerd[1628]: 2025-09-04 15:46:06.342 [INFO][4834] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="92acfb0b9bf846313081c37b1649430aaf402db6427968d2b0a777fd4c484a5e" HandleID="k8s-pod-network.92acfb0b9bf846313081c37b1649430aaf402db6427968d2b0a777fd4c484a5e" Workload="localhost-k8s-coredns--674b8bbfcf--rqkgl-eth0" Sep 4 15:46:06.363703 containerd[1628]: 2025-09-04 15:46:06.346 [INFO][4808] cni-plugin/k8s.go 418: Populated endpoint ContainerID="92acfb0b9bf846313081c37b1649430aaf402db6427968d2b0a777fd4c484a5e" Namespace="kube-system" Pod="coredns-674b8bbfcf-rqkgl" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--rqkgl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--rqkgl-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"4021cb54-f775-42c6-8c2d-347373397b4c", ResourceVersion:"851", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 45, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-674b8bbfcf-rqkgl", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8868c1a965f", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:46:06.363703 containerd[1628]: 2025-09-04 15:46:06.346 [INFO][4808] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="92acfb0b9bf846313081c37b1649430aaf402db6427968d2b0a777fd4c484a5e" Namespace="kube-system" Pod="coredns-674b8bbfcf-rqkgl" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--rqkgl-eth0" Sep 4 15:46:06.363703 containerd[1628]: 2025-09-04 15:46:06.346 [INFO][4808] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8868c1a965f ContainerID="92acfb0b9bf846313081c37b1649430aaf402db6427968d2b0a777fd4c484a5e" Namespace="kube-system" Pod="coredns-674b8bbfcf-rqkgl" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--rqkgl-eth0" Sep 4 15:46:06.363703 containerd[1628]: 2025-09-04 15:46:06.350 [INFO][4808] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="92acfb0b9bf846313081c37b1649430aaf402db6427968d2b0a777fd4c484a5e" Namespace="kube-system" Pod="coredns-674b8bbfcf-rqkgl" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--rqkgl-eth0" Sep 4 15:46:06.363703 
containerd[1628]: 2025-09-04 15:46:06.350 [INFO][4808] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="92acfb0b9bf846313081c37b1649430aaf402db6427968d2b0a777fd4c484a5e" Namespace="kube-system" Pod="coredns-674b8bbfcf-rqkgl" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--rqkgl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--rqkgl-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"4021cb54-f775-42c6-8c2d-347373397b4c", ResourceVersion:"851", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 45, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"92acfb0b9bf846313081c37b1649430aaf402db6427968d2b0a777fd4c484a5e", Pod:"coredns-674b8bbfcf-rqkgl", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8868c1a965f", MAC:"b6:5d:d7:4e:bd:b4", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:46:06.363703 containerd[1628]: 2025-09-04 15:46:06.359 [INFO][4808] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="92acfb0b9bf846313081c37b1649430aaf402db6427968d2b0a777fd4c484a5e" Namespace="kube-system" Pod="coredns-674b8bbfcf-rqkgl" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--rqkgl-eth0" Sep 4 15:46:06.373629 systemd-networkd[1518]: calif04fe80336b: Gained IPv6LL Sep 4 15:46:06.380397 containerd[1628]: time="2025-09-04T15:46:06.380282366Z" level=info msg="connecting to shim 92acfb0b9bf846313081c37b1649430aaf402db6427968d2b0a777fd4c484a5e" address="unix:///run/containerd/s/1994e1c1017a81c3e762fab774bfa8e3102362d265d7acd326cdc9cf17ade1d5" namespace=k8s.io protocol=ttrpc version=3 Sep 4 15:46:06.406490 systemd[1]: Started cri-containerd-92acfb0b9bf846313081c37b1649430aaf402db6427968d2b0a777fd4c484a5e.scope - libcontainer container 92acfb0b9bf846313081c37b1649430aaf402db6427968d2b0a777fd4c484a5e. 
Sep 4 15:46:06.418660 systemd-resolved[1519]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 15:46:06.445474 containerd[1628]: time="2025-09-04T15:46:06.445441291Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-rqkgl,Uid:4021cb54-f775-42c6-8c2d-347373397b4c,Namespace:kube-system,Attempt:0,} returns sandbox id \"92acfb0b9bf846313081c37b1649430aaf402db6427968d2b0a777fd4c484a5e\"" Sep 4 15:46:06.448775 containerd[1628]: time="2025-09-04T15:46:06.448755506Z" level=info msg="CreateContainer within sandbox \"92acfb0b9bf846313081c37b1649430aaf402db6427968d2b0a777fd4c484a5e\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 4 15:46:06.517490 containerd[1628]: time="2025-09-04T15:46:06.517457403Z" level=info msg="Container 4b953270a5b4b01bd21e7a0519e1c77a594f9908b5796321ed5892eb37deed1c: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:46:06.520360 containerd[1628]: time="2025-09-04T15:46:06.520334056Z" level=info msg="CreateContainer within sandbox \"92acfb0b9bf846313081c37b1649430aaf402db6427968d2b0a777fd4c484a5e\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"4b953270a5b4b01bd21e7a0519e1c77a594f9908b5796321ed5892eb37deed1c\"" Sep 4 15:46:06.521025 containerd[1628]: time="2025-09-04T15:46:06.521008332Z" level=info msg="StartContainer for \"4b953270a5b4b01bd21e7a0519e1c77a594f9908b5796321ed5892eb37deed1c\"" Sep 4 15:46:06.521794 containerd[1628]: time="2025-09-04T15:46:06.521765012Z" level=info msg="connecting to shim 4b953270a5b4b01bd21e7a0519e1c77a594f9908b5796321ed5892eb37deed1c" address="unix:///run/containerd/s/1994e1c1017a81c3e762fab774bfa8e3102362d265d7acd326cdc9cf17ade1d5" protocol=ttrpc version=3 Sep 4 15:46:06.538318 systemd[1]: Started cri-containerd-4b953270a5b4b01bd21e7a0519e1c77a594f9908b5796321ed5892eb37deed1c.scope - libcontainer container 4b953270a5b4b01bd21e7a0519e1c77a594f9908b5796321ed5892eb37deed1c. 
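Each pod in this section goes through the same containerd sequence: RunPodSandbox returns a sandbox id, CreateContainer inside that sandbox returns a container id, and StartContainer then confirms it. If the sandbox-to-container mapping has to be recovered after the fact, a small parser over exactly these messages is enough. The regular expressions below are written against the message format shown in this journal and nothing else; the embedded sample truncates the 64-character ids purely for readability.

```go
package main

import (
	"bufio"
	"fmt"
	"regexp"
	"strings"
)

// Patterns taken from the containerd messages above: the sandbox line names the
// pod, the container line names the sandbox it was created in.
var (
	sandboxRe   = regexp.MustCompile(`RunPodSandbox for &PodSandboxMetadata\{Name:([^,]+),.*returns sandbox id \\"([0-9a-f]+)\\"`)
	containerRe = regexp.MustCompile(`CreateContainer within sandbox \\"([0-9a-f]+)\\" for &ContainerMetadata\{Name:([^,]+),.*returns container id \\"([0-9a-f]+)\\"`)
)

func main() {
	// Sample copied from the journal above, ids truncated for readability.
	log := `msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-rqkgl,Uid:4021cb54,Namespace:kube-system,Attempt:0,} returns sandbox id \"92acfb0b9bf8\""
msg="CreateContainer within sandbox \"92acfb0b9bf8\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"4b953270a5b4\""`

	podBySandbox := map[string]string{}
	sc := bufio.NewScanner(strings.NewReader(log))
	for sc.Scan() {
		line := sc.Text()
		if m := sandboxRe.FindStringSubmatch(line); m != nil {
			podBySandbox[m[2]] = m[1] // sandbox id -> pod name
		}
		if m := containerRe.FindStringSubmatch(line); m != nil {
			fmt.Printf("pod %s: container %q (%s) in sandbox %s\n",
				podBySandbox[m[1]], m[2], m[3], m[1])
		}
	}
}
```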
Sep 4 15:46:06.557552 containerd[1628]: time="2025-09-04T15:46:06.557531545Z" level=info msg="StartContainer for \"4b953270a5b4b01bd21e7a0519e1c77a594f9908b5796321ed5892eb37deed1c\" returns successfully" Sep 4 15:46:07.372426 kubelet[2941]: I0904 15:46:07.372353 2941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-rqkgl" podStartSLOduration=53.372337108 podStartE2EDuration="53.372337108s" podCreationTimestamp="2025-09-04 15:45:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 15:46:07.372110066 +0000 UTC m=+60.401801789" watchObservedRunningTime="2025-09-04 15:46:07.372337108 +0000 UTC m=+60.402028836" Sep 4 15:46:07.780305 systemd-networkd[1518]: cali349f858f93b: Gained IPv6LL Sep 4 15:46:08.356311 systemd-networkd[1518]: cali8868c1a965f: Gained IPv6LL Sep 4 15:46:10.756562 containerd[1628]: time="2025-09-04T15:46:10.756464582Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:46:10.756982 containerd[1628]: time="2025-09-04T15:46:10.756881891Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 4 15:46:10.758040 containerd[1628]: time="2025-09-04T15:46:10.758027786Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:46:10.758623 containerd[1628]: time="2025-09-04T15:46:10.758602740Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:46:10.759347 containerd[1628]: time="2025-09-04T15:46:10.759284310Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 10.240229901s" Sep 4 15:46:10.759347 containerd[1628]: time="2025-09-04T15:46:10.759299728Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 4 15:46:10.759912 containerd[1628]: time="2025-09-04T15:46:10.759902633Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 4 15:46:10.764250 containerd[1628]: time="2025-09-04T15:46:10.764094345Z" level=info msg="CreateContainer within sandbox \"5723ad158e83b9a82ab9af0fa4c031256111656fba9dbc92caf9232aa7545fba\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 4 15:46:10.770079 containerd[1628]: time="2025-09-04T15:46:10.770065400Z" level=info msg="Container 340e13f56f9fa811d3dfffc5b4d1d07074c8af9edf46a2529e48578c4b972e11: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:46:10.774718 containerd[1628]: time="2025-09-04T15:46:10.773653140Z" level=info msg="CreateContainer within sandbox \"5723ad158e83b9a82ab9af0fa4c031256111656fba9dbc92caf9232aa7545fba\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"340e13f56f9fa811d3dfffc5b4d1d07074c8af9edf46a2529e48578c4b972e11\"" Sep 4 15:46:10.775733 systemd[1]: 
var-lib-containerd-tmpmounts-containerd\x2dmount3733322872.mount: Deactivated successfully. Sep 4 15:46:10.780226 containerd[1628]: time="2025-09-04T15:46:10.777141680Z" level=info msg="StartContainer for \"340e13f56f9fa811d3dfffc5b4d1d07074c8af9edf46a2529e48578c4b972e11\"" Sep 4 15:46:10.780226 containerd[1628]: time="2025-09-04T15:46:10.777769075Z" level=info msg="connecting to shim 340e13f56f9fa811d3dfffc5b4d1d07074c8af9edf46a2529e48578c4b972e11" address="unix:///run/containerd/s/45a0127192281221f4cb99275a2fe6a1af30cd183050c721a04f9ac8d37a3805" protocol=ttrpc version=3 Sep 4 15:46:10.797449 systemd[1]: Started cri-containerd-340e13f56f9fa811d3dfffc5b4d1d07074c8af9edf46a2529e48578c4b972e11.scope - libcontainer container 340e13f56f9fa811d3dfffc5b4d1d07074c8af9edf46a2529e48578c4b972e11. Sep 4 15:46:10.833648 containerd[1628]: time="2025-09-04T15:46:10.833628204Z" level=info msg="StartContainer for \"340e13f56f9fa811d3dfffc5b4d1d07074c8af9edf46a2529e48578c4b972e11\" returns successfully" Sep 4 15:46:17.944747 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3060335270.mount: Deactivated successfully. Sep 4 15:46:19.211450 containerd[1628]: time="2025-09-04T15:46:19.211407971Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:46:19.212584 containerd[1628]: time="2025-09-04T15:46:19.212561699Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 4 15:46:19.213080 containerd[1628]: time="2025-09-04T15:46:19.213061161Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:46:19.245163 containerd[1628]: time="2025-09-04T15:46:19.244526422Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:46:19.245163 containerd[1628]: time="2025-09-04T15:46:19.244897804Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 8.484927417s" Sep 4 15:46:19.245163 containerd[1628]: time="2025-09-04T15:46:19.244916015Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 4 15:46:19.246473 containerd[1628]: time="2025-09-04T15:46:19.245963393Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 4 15:46:19.253083 containerd[1628]: time="2025-09-04T15:46:19.253046644Z" level=info msg="CreateContainer within sandbox \"4fac510611fedcc732de96c753797546e36799bc80d490a1b93e22800a83c9b9\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 4 15:46:19.259893 containerd[1628]: time="2025-09-04T15:46:19.259854877Z" level=info msg="Container aba84d60aaf74a3bcd8fbccc7fa72570a016d00a138d73aada6d083e4ab8c718: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:46:19.289740 containerd[1628]: time="2025-09-04T15:46:19.289710307Z" level=info msg="CreateContainer within sandbox 
\"4fac510611fedcc732de96c753797546e36799bc80d490a1b93e22800a83c9b9\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"aba84d60aaf74a3bcd8fbccc7fa72570a016d00a138d73aada6d083e4ab8c718\"" Sep 4 15:46:19.290440 containerd[1628]: time="2025-09-04T15:46:19.290311728Z" level=info msg="StartContainer for \"aba84d60aaf74a3bcd8fbccc7fa72570a016d00a138d73aada6d083e4ab8c718\"" Sep 4 15:46:19.290996 containerd[1628]: time="2025-09-04T15:46:19.290978986Z" level=info msg="connecting to shim aba84d60aaf74a3bcd8fbccc7fa72570a016d00a138d73aada6d083e4ab8c718" address="unix:///run/containerd/s/187dd0cf640c40ce71bece50e81b4eece8ee49add80da0fd2c7c386ef204a44b" protocol=ttrpc version=3 Sep 4 15:46:19.361302 systemd[1]: Started cri-containerd-aba84d60aaf74a3bcd8fbccc7fa72570a016d00a138d73aada6d083e4ab8c718.scope - libcontainer container aba84d60aaf74a3bcd8fbccc7fa72570a016d00a138d73aada6d083e4ab8c718. Sep 4 15:46:19.416800 containerd[1628]: time="2025-09-04T15:46:19.416762323Z" level=info msg="StartContainer for \"aba84d60aaf74a3bcd8fbccc7fa72570a016d00a138d73aada6d083e4ab8c718\" returns successfully" Sep 4 15:46:20.484681 kubelet[2941]: I0904 15:46:20.484631 2941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-vbl6h" podStartSLOduration=41.614163789 podStartE2EDuration="57.48461821s" podCreationTimestamp="2025-09-04 15:45:23 +0000 UTC" firstStartedPulling="2025-09-04 15:46:03.375443203 +0000 UTC m=+56.405134918" lastFinishedPulling="2025-09-04 15:46:19.245897622 +0000 UTC m=+72.275589339" observedRunningTime="2025-09-04 15:46:20.484342996 +0000 UTC m=+73.514034725" watchObservedRunningTime="2025-09-04 15:46:20.48461821 +0000 UTC m=+73.514309935" Sep 4 15:46:20.552764 containerd[1628]: time="2025-09-04T15:46:20.552727216Z" level=info msg="TaskExit event in podsandbox handler container_id:\"aba84d60aaf74a3bcd8fbccc7fa72570a016d00a138d73aada6d083e4ab8c718\" id:\"38e944b187c7e6118fb054a3a9ad3e2fadf874911e5a01e70f35f8fad228a1b0\" pid:5118 exit_status:1 exited_at:{seconds:1757000780 nanos:543546601}" Sep 4 15:46:21.500568 containerd[1628]: time="2025-09-04T15:46:21.500535784Z" level=info msg="TaskExit event in podsandbox handler container_id:\"aba84d60aaf74a3bcd8fbccc7fa72570a016d00a138d73aada6d083e4ab8c718\" id:\"24b5766bb34d18642dc8848ffcb538a4e00579ca91f4e47d4322e91172e5ba99\" pid:5141 exit_status:1 exited_at:{seconds:1757000781 nanos:500202716}" Sep 4 15:46:24.355280 systemd[1]: Started sshd@7-139.178.70.109:22-139.178.89.65:37762.service - OpenSSH per-connection server daemon (139.178.89.65:37762). Sep 4 15:46:24.572154 sshd[5155]: Accepted publickey for core from 139.178.89.65 port 37762 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs Sep 4 15:46:24.575904 sshd-session[5155]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:46:24.584254 systemd-logind[1608]: New session 10 of user core. Sep 4 15:46:24.591441 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 4 15:46:25.606577 sshd[5158]: Connection closed by 139.178.89.65 port 37762 Sep 4 15:46:25.607040 sshd-session[5155]: pam_unix(sshd:session): session closed for user core Sep 4 15:46:25.619272 systemd[1]: sshd@7-139.178.70.109:22-139.178.89.65:37762.service: Deactivated successfully. Sep 4 15:46:25.621095 systemd[1]: session-10.scope: Deactivated successfully. Sep 4 15:46:25.622381 systemd-logind[1608]: Session 10 logged out. Waiting for processes to exit. 
Sep 4 15:46:25.623626 systemd-logind[1608]: Removed session 10. Sep 4 15:46:27.792089 containerd[1628]: time="2025-09-04T15:46:27.792052324Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:46:27.793631 containerd[1628]: time="2025-09-04T15:46:27.792852072Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 4 15:46:27.793776 containerd[1628]: time="2025-09-04T15:46:27.793756270Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:46:27.794992 containerd[1628]: time="2025-09-04T15:46:27.794956366Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:46:27.795584 containerd[1628]: time="2025-09-04T15:46:27.795503508Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 8.549507148s" Sep 4 15:46:27.795584 containerd[1628]: time="2025-09-04T15:46:27.795526316Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 4 15:46:27.803785 containerd[1628]: time="2025-09-04T15:46:27.803761172Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 4 15:46:27.828724 containerd[1628]: time="2025-09-04T15:46:27.828660377Z" level=info msg="CreateContainer within sandbox \"bd8b46ad645e02e70cb3b41ea0fe233c956aff2bd52d190427d29cc59d9c329d\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 4 15:46:27.845588 containerd[1628]: time="2025-09-04T15:46:27.845536625Z" level=info msg="Container c58491ae834852fa56fa7d4374ece3eabd8fa8a2ceed976308db2c3d5ce915ac: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:46:27.866825 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3967402312.mount: Deactivated successfully. Sep 4 15:46:27.877573 containerd[1628]: time="2025-09-04T15:46:27.877547292Z" level=info msg="CreateContainer within sandbox \"bd8b46ad645e02e70cb3b41ea0fe233c956aff2bd52d190427d29cc59d9c329d\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"c58491ae834852fa56fa7d4374ece3eabd8fa8a2ceed976308db2c3d5ce915ac\"" Sep 4 15:46:27.877990 containerd[1628]: time="2025-09-04T15:46:27.877977584Z" level=info msg="StartContainer for \"c58491ae834852fa56fa7d4374ece3eabd8fa8a2ceed976308db2c3d5ce915ac\"" Sep 4 15:46:27.880589 containerd[1628]: time="2025-09-04T15:46:27.880532534Z" level=info msg="connecting to shim c58491ae834852fa56fa7d4374ece3eabd8fa8a2ceed976308db2c3d5ce915ac" address="unix:///run/containerd/s/954d5e4560a559b9542031427ba98633fe1aa4f62be3212e2f509753697677eb" protocol=ttrpc version=3 Sep 4 15:46:27.992340 systemd[1]: Started cri-containerd-c58491ae834852fa56fa7d4374ece3eabd8fa8a2ceed976308db2c3d5ce915ac.scope - libcontainer container c58491ae834852fa56fa7d4374ece3eabd8fa8a2ceed976308db2c3d5ce915ac. 
Sep 4 15:46:28.107906 containerd[1628]: time="2025-09-04T15:46:28.107622581Z" level=info msg="StartContainer for \"c58491ae834852fa56fa7d4374ece3eabd8fa8a2ceed976308db2c3d5ce915ac\" returns successfully" Sep 4 15:46:28.552054 kubelet[2941]: I0904 15:46:28.551268 2941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5696449574-m4mxw" podStartSLOduration=43.284111958 podStartE2EDuration="1m7.538018532s" podCreationTimestamp="2025-09-04 15:45:21 +0000 UTC" firstStartedPulling="2025-09-04 15:46:03.549585745 +0000 UTC m=+56.579277467" lastFinishedPulling="2025-09-04 15:46:27.803492321 +0000 UTC m=+80.833184041" observedRunningTime="2025-09-04 15:46:28.500631266 +0000 UTC m=+81.530322985" watchObservedRunningTime="2025-09-04 15:46:28.538018532 +0000 UTC m=+81.567710255" Sep 4 15:46:29.597737 containerd[1628]: time="2025-09-04T15:46:29.597117653Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7c4c03986e7fa0e5c4db45fb08894988ab119e760c7350f55eaaa6a8a02cbfef\" id:\"b22b2b4ac2c38d6db447a9c392b0ccafe9e89fa7073bd00f91816c5c33b0ef9c\" pid:5239 exited_at:{seconds:1757000789 nanos:590995605}" Sep 4 15:46:30.619600 systemd[1]: Started sshd@8-139.178.70.109:22-139.178.89.65:60088.service - OpenSSH per-connection server daemon (139.178.89.65:60088). Sep 4 15:46:31.074030 sshd[5252]: Accepted publickey for core from 139.178.89.65 port 60088 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs Sep 4 15:46:31.076337 sshd-session[5252]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:46:31.081298 systemd-logind[1608]: New session 11 of user core. Sep 4 15:46:31.091472 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 4 15:46:32.975496 sshd[5255]: Connection closed by 139.178.89.65 port 60088 Sep 4 15:46:32.976178 sshd-session[5252]: pam_unix(sshd:session): session closed for user core Sep 4 15:46:32.978800 systemd-logind[1608]: Session 11 logged out. Waiting for processes to exit. Sep 4 15:46:32.978942 systemd[1]: sshd@8-139.178.70.109:22-139.178.89.65:60088.service: Deactivated successfully. Sep 4 15:46:32.980934 systemd[1]: session-11.scope: Deactivated successfully. Sep 4 15:46:32.983079 systemd-logind[1608]: Removed session 11. 
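The SSH traffic in this journal is bracketed symmetrically: every session opens with an sshd "Accepted publickey ... port N" entry and ends with a "Connection closed ... port N" entry from the per-connection process. Pairing the two by source port gives per-session durations. The helper below is purely illustrative; the sample lines are copied (and truncated) from sessions 10 and 11 above, and the year appended during parsing is an assumption taken from the rest of the log.

```go
package main

import (
	"bufio"
	"fmt"
	"regexp"
	"strings"
	"time"
)

var (
	openRe  = regexp.MustCompile(`^(\S+ \d+ [\d:.]+) sshd\[\d+\]: Accepted publickey .* port (\d+)`)
	closeRe = regexp.MustCompile(`^(\S+ \d+ [\d:.]+) sshd\[\d+\]: Connection closed by \S+ port (\d+)`)
)

func parseStamp(s string) time.Time {
	// Syslog-style stamp without a year; assume 2025 like the rest of this log.
	t, err := time.Parse("Jan 2 15:04:05.000000 2006", s+" 2025")
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	journal := `Sep 4 15:46:24.572154 sshd[5155]: Accepted publickey for core from 139.178.89.65 port 37762 ssh2
Sep 4 15:46:25.606577 sshd[5158]: Connection closed by 139.178.89.65 port 37762
Sep 4 15:46:31.074030 sshd[5252]: Accepted publickey for core from 139.178.89.65 port 60088 ssh2
Sep 4 15:46:32.975496 sshd[5255]: Connection closed by 139.178.89.65 port 60088`

	opened := map[string]time.Time{}
	sc := bufio.NewScanner(strings.NewReader(journal))
	for sc.Scan() {
		line := sc.Text()
		if m := openRe.FindStringSubmatch(line); m != nil {
			opened[m[2]] = parseStamp(m[1])
		} else if m := closeRe.FindStringSubmatch(line); m != nil {
			fmt.Printf("port %s: session lasted %v\n", m[2], parseStamp(m[1]).Sub(opened[m[2]]))
		}
	}
}
```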
Sep 4 15:46:37.544925 containerd[1628]: time="2025-09-04T15:46:37.544868796Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:46:37.550850 containerd[1628]: time="2025-09-04T15:46:37.550830674Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 4 15:46:37.560741 containerd[1628]: time="2025-09-04T15:46:37.560721897Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:46:37.570618 containerd[1628]: time="2025-09-04T15:46:37.570601238Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:46:37.570992 containerd[1628]: time="2025-09-04T15:46:37.570975282Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 9.767191714s" Sep 4 15:46:37.578886 containerd[1628]: time="2025-09-04T15:46:37.570993712Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 4 15:46:37.578886 containerd[1628]: time="2025-09-04T15:46:37.571613803Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 4 15:46:37.664547 containerd[1628]: time="2025-09-04T15:46:37.664519346Z" level=info msg="CreateContainer within sandbox \"a05618bb495a0ece0a4f864e0d7c318400eb3a55ace8c3c668ff19b3c81bb39f\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 4 15:46:37.747426 containerd[1628]: time="2025-09-04T15:46:37.747392099Z" level=info msg="Container a040be4824299a20bf7bdd0d2be24bffc5748e30cc134cf251d3111a22e6223b: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:46:37.750440 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3085194050.mount: Deactivated successfully. Sep 4 15:46:37.800963 containerd[1628]: time="2025-09-04T15:46:37.800867453Z" level=info msg="CreateContainer within sandbox \"a05618bb495a0ece0a4f864e0d7c318400eb3a55ace8c3c668ff19b3c81bb39f\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"a040be4824299a20bf7bdd0d2be24bffc5748e30cc134cf251d3111a22e6223b\"" Sep 4 15:46:37.839760 containerd[1628]: time="2025-09-04T15:46:37.839730433Z" level=info msg="StartContainer for \"a040be4824299a20bf7bdd0d2be24bffc5748e30cc134cf251d3111a22e6223b\"" Sep 4 15:46:37.848001 containerd[1628]: time="2025-09-04T15:46:37.847965355Z" level=info msg="connecting to shim a040be4824299a20bf7bdd0d2be24bffc5748e30cc134cf251d3111a22e6223b" address="unix:///run/containerd/s/6a3c62dba4fccb4f65950122ad166644f890f13585bb59078f7e4ed8e94ff152" protocol=ttrpc version=3 Sep 4 15:46:37.868726 systemd[1]: Started cri-containerd-a040be4824299a20bf7bdd0d2be24bffc5748e30cc134cf251d3111a22e6223b.scope - libcontainer container a040be4824299a20bf7bdd0d2be24bffc5748e30cc134cf251d3111a22e6223b. 
Sep 4 15:46:37.909138 containerd[1628]: time="2025-09-04T15:46:37.908888432Z" level=info msg="StartContainer for \"a040be4824299a20bf7bdd0d2be24bffc5748e30cc134cf251d3111a22e6223b\" returns successfully" Sep 4 15:46:37.986009 systemd[1]: Started sshd@9-139.178.70.109:22-139.178.89.65:60102.service - OpenSSH per-connection server daemon (139.178.89.65:60102). Sep 4 15:46:38.152699 containerd[1628]: time="2025-09-04T15:46:38.152621597Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:46:38.158326 containerd[1628]: time="2025-09-04T15:46:38.158295442Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 4 15:46:38.159396 containerd[1628]: time="2025-09-04T15:46:38.159380017Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 587.701505ms" Sep 4 15:46:38.159445 containerd[1628]: time="2025-09-04T15:46:38.159401538Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 4 15:46:38.160112 containerd[1628]: time="2025-09-04T15:46:38.159969797Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 4 15:46:38.177924 containerd[1628]: time="2025-09-04T15:46:38.177882261Z" level=info msg="CreateContainer within sandbox \"515523c132f5fbf91dcd82a4f5160f0d3c948bc61d57fac8dc0d16464b5ee0d5\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 4 15:46:38.231753 containerd[1628]: time="2025-09-04T15:46:38.231645307Z" level=info msg="Container ab771e0a507f1cf0870f60afe64c6931f816c19aa8d2622634e655eeb6fb2322: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:46:38.234779 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2388389886.mount: Deactivated successfully. Sep 4 15:46:38.256175 containerd[1628]: time="2025-09-04T15:46:38.256142994Z" level=info msg="CreateContainer within sandbox \"515523c132f5fbf91dcd82a4f5160f0d3c948bc61d57fac8dc0d16464b5ee0d5\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"ab771e0a507f1cf0870f60afe64c6931f816c19aa8d2622634e655eeb6fb2322\"" Sep 4 15:46:38.257059 containerd[1628]: time="2025-09-04T15:46:38.256552313Z" level=info msg="StartContainer for \"ab771e0a507f1cf0870f60afe64c6931f816c19aa8d2622634e655eeb6fb2322\"" Sep 4 15:46:38.258282 containerd[1628]: time="2025-09-04T15:46:38.258257852Z" level=info msg="connecting to shim ab771e0a507f1cf0870f60afe64c6931f816c19aa8d2622634e655eeb6fb2322" address="unix:///run/containerd/s/7e59ff705fa130e5b9b69e55c6d9c7b7ec49e867bfd2a60b8e96f55e33ea2c25" protocol=ttrpc version=3 Sep 4 15:46:38.279373 systemd[1]: Started cri-containerd-ab771e0a507f1cf0870f60afe64c6931f816c19aa8d2622634e655eeb6fb2322.scope - libcontainer container ab771e0a507f1cf0870f60afe64c6931f816c19aa8d2622634e655eeb6fb2322. 
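The two calico/apiserver pulls make a useful contrast: the first (ImageCreate) transfers 47333864 bytes in 8.549507148s, while the repeat pull just above (ImageUpdate) reads only 77 bytes and finishes in 587.701505ms, which suggests the layers were already in the content store and only the tag and manifest had to be re-resolved; that last part is an inference, the log does not state it. The figures below are copied from those messages; the throughput and ratio are derived here, not logged.

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	firstBytes, firstDur := 47333864, 8*time.Second+549507148*time.Nanosecond // ImageCreate pull
	secondBytes, secondDur := 77, 587701505*time.Nanosecond                   // ImageUpdate repeat pull

	fmt.Printf("first pull:  %8d bytes in %v (~%.1f MiB/s)\n",
		firstBytes, firstDur, float64(firstBytes)/(1<<20)/firstDur.Seconds())
	fmt.Printf("second pull: %8d bytes in %v\n", secondBytes, secondDur)
	fmt.Printf("repeat pull ran %.1fx faster and moved %.5f%% of the data\n",
		firstDur.Seconds()/secondDur.Seconds(), 100*float64(secondBytes)/float64(firstBytes))
}
```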
Sep 4 15:46:38.301484 sshd[5305]: Accepted publickey for core from 139.178.89.65 port 60102 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs Sep 4 15:46:38.304370 sshd-session[5305]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:46:38.307908 systemd-logind[1608]: New session 12 of user core. Sep 4 15:46:38.313565 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 4 15:46:38.333734 containerd[1628]: time="2025-09-04T15:46:38.333701058Z" level=info msg="StartContainer for \"ab771e0a507f1cf0870f60afe64c6931f816c19aa8d2622634e655eeb6fb2322\" returns successfully" Sep 4 15:46:38.670976 kubelet[2941]: I0904 15:46:38.624199 2941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5696449574-g89ks" podStartSLOduration=44.764218502 podStartE2EDuration="1m17.58779468s" podCreationTimestamp="2025-09-04 15:45:21 +0000 UTC" firstStartedPulling="2025-09-04 15:46:05.336342641 +0000 UTC m=+58.366034357" lastFinishedPulling="2025-09-04 15:46:38.159918818 +0000 UTC m=+91.189610535" observedRunningTime="2025-09-04 15:46:38.578605249 +0000 UTC m=+91.608296984" watchObservedRunningTime="2025-09-04 15:46:38.58779468 +0000 UTC m=+91.617486408" Sep 4 15:46:39.430229 sshd[5327]: Connection closed by 139.178.89.65 port 60102 Sep 4 15:46:39.430600 sshd-session[5305]: pam_unix(sshd:session): session closed for user core Sep 4 15:46:39.433784 systemd[1]: sshd@9-139.178.70.109:22-139.178.89.65:60102.service: Deactivated successfully. Sep 4 15:46:39.435742 systemd[1]: session-12.scope: Deactivated successfully. Sep 4 15:46:39.443078 systemd-logind[1608]: Session 12 logged out. Waiting for processes to exit. Sep 4 15:46:39.444032 systemd-logind[1608]: Removed session 12. Sep 4 15:46:44.444102 systemd[1]: Started sshd@10-139.178.70.109:22-139.178.89.65:36808.service - OpenSSH per-connection server daemon (139.178.89.65:36808). Sep 4 15:46:45.689039 sshd[5369]: Accepted publickey for core from 139.178.89.65 port 36808 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs Sep 4 15:46:45.727709 sshd-session[5369]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:46:45.741062 systemd-logind[1608]: New session 13 of user core. Sep 4 15:46:45.744409 systemd[1]: Started session-13.scope - Session 13 of User core. 
Sep 4 15:46:46.620503 containerd[1628]: time="2025-09-04T15:46:46.620315461Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:46:46.672669 containerd[1628]: time="2025-09-04T15:46:46.672561797Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 4 15:46:46.707472 containerd[1628]: time="2025-09-04T15:46:46.707421598Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:46:46.733860 containerd[1628]: time="2025-09-04T15:46:46.733804984Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:46:46.741697 containerd[1628]: time="2025-09-04T15:46:46.741666254Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 8.574713612s" Sep 4 15:46:46.741697 containerd[1628]: time="2025-09-04T15:46:46.741697889Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 4 15:46:46.791782 containerd[1628]: time="2025-09-04T15:46:46.791753645Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 4 15:46:47.318131 containerd[1628]: time="2025-09-04T15:46:47.318100637Z" level=info msg="CreateContainer within sandbox \"30db92d8bb15f23e879f02f848cb516606b512f844e7a49a15cc336f63e10d73\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 4 15:46:47.443628 containerd[1628]: time="2025-09-04T15:46:47.443481552Z" level=info msg="Container 6e6b30542af47fd473788b40328795aabe78598470a776ca86c8c002221cadc7: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:46:47.448946 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1441767624.mount: Deactivated successfully. 
Sep 4 15:46:47.544377 containerd[1628]: time="2025-09-04T15:46:47.544347465Z" level=info msg="CreateContainer within sandbox \"30db92d8bb15f23e879f02f848cb516606b512f844e7a49a15cc336f63e10d73\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"6e6b30542af47fd473788b40328795aabe78598470a776ca86c8c002221cadc7\"" Sep 4 15:46:47.548244 containerd[1628]: time="2025-09-04T15:46:47.548194988Z" level=info msg="StartContainer for \"6e6b30542af47fd473788b40328795aabe78598470a776ca86c8c002221cadc7\"" Sep 4 15:46:47.567523 containerd[1628]: time="2025-09-04T15:46:47.567394420Z" level=info msg="connecting to shim 6e6b30542af47fd473788b40328795aabe78598470a776ca86c8c002221cadc7" address="unix:///run/containerd/s/3afa6fcea3dd375cbd401f1866576371ccd97c8cf9774b6abae537109984d6f1" protocol=ttrpc version=3 Sep 4 15:46:47.584921 sshd[5379]: Connection closed by 139.178.89.65 port 36808 Sep 4 15:46:47.583974 sshd-session[5369]: pam_unix(sshd:session): session closed for user core Sep 4 15:46:47.593909 systemd[1]: sshd@10-139.178.70.109:22-139.178.89.65:36808.service: Deactivated successfully. Sep 4 15:46:47.595253 systemd[1]: session-13.scope: Deactivated successfully. Sep 4 15:46:47.595787 systemd-logind[1608]: Session 13 logged out. Waiting for processes to exit. Sep 4 15:46:47.598965 systemd[1]: Started sshd@11-139.178.70.109:22-139.178.89.65:36816.service - OpenSSH per-connection server daemon (139.178.89.65:36816). Sep 4 15:46:47.599827 systemd-logind[1608]: Removed session 13. Sep 4 15:46:47.662025 sshd[5397]: Accepted publickey for core from 139.178.89.65 port 36816 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs Sep 4 15:46:47.662752 sshd-session[5397]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:46:47.665445 systemd-logind[1608]: New session 14 of user core. Sep 4 15:46:47.671304 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 4 15:46:47.722331 systemd[1]: Started cri-containerd-6e6b30542af47fd473788b40328795aabe78598470a776ca86c8c002221cadc7.scope - libcontainer container 6e6b30542af47fd473788b40328795aabe78598470a776ca86c8c002221cadc7. Sep 4 15:46:47.829031 containerd[1628]: time="2025-09-04T15:46:47.828935482Z" level=info msg="StartContainer for \"6e6b30542af47fd473788b40328795aabe78598470a776ca86c8c002221cadc7\" returns successfully" Sep 4 15:46:48.690680 sshd[5400]: Connection closed by 139.178.89.65 port 36816 Sep 4 15:46:48.701773 systemd[1]: Started sshd@12-139.178.70.109:22-139.178.89.65:36818.service - OpenSSH per-connection server daemon (139.178.89.65:36818). Sep 4 15:46:48.692031 sshd-session[5397]: pam_unix(sshd:session): session closed for user core Sep 4 15:46:48.702997 systemd[1]: sshd@11-139.178.70.109:22-139.178.89.65:36816.service: Deactivated successfully. Sep 4 15:46:48.706190 systemd[1]: session-14.scope: Deactivated successfully. Sep 4 15:46:48.713082 systemd-logind[1608]: Session 14 logged out. Waiting for processes to exit. Sep 4 15:46:48.714670 systemd-logind[1608]: Removed session 14. Sep 4 15:46:48.800109 sshd[5445]: Accepted publickey for core from 139.178.89.65 port 36818 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs Sep 4 15:46:48.800648 sshd-session[5445]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:46:48.809072 systemd-logind[1608]: New session 15 of user core. Sep 4 15:46:48.812438 systemd[1]: Started session-15.scope - Session 15 of User core. 
Sep 4 15:46:49.033088 kubelet[2941]: I0904 15:46:49.030868 2941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-5bc74776b8-jcx9h" podStartSLOduration=44.54682138 podStartE2EDuration="1m24.979374213s" podCreationTimestamp="2025-09-04 15:45:24 +0000 UTC" firstStartedPulling="2025-09-04 15:46:06.349166876 +0000 UTC m=+59.378858592" lastFinishedPulling="2025-09-04 15:46:46.781719704 +0000 UTC m=+99.811411425" observedRunningTime="2025-09-04 15:46:48.941939836 +0000 UTC m=+101.971631559" watchObservedRunningTime="2025-09-04 15:46:48.979374213 +0000 UTC m=+102.009065935" Sep 4 15:46:49.102339 sshd[5451]: Connection closed by 139.178.89.65 port 36818 Sep 4 15:46:49.102738 sshd-session[5445]: pam_unix(sshd:session): session closed for user core Sep 4 15:46:49.107312 systemd[1]: sshd@12-139.178.70.109:22-139.178.89.65:36818.service: Deactivated successfully. Sep 4 15:46:49.108503 systemd[1]: session-15.scope: Deactivated successfully. Sep 4 15:46:49.109090 systemd-logind[1608]: Session 15 logged out. Waiting for processes to exit. Sep 4 15:46:49.110529 systemd-logind[1608]: Removed session 15. Sep 4 15:46:49.775694 containerd[1628]: time="2025-09-04T15:46:49.775660612Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6e6b30542af47fd473788b40328795aabe78598470a776ca86c8c002221cadc7\" id:\"f9b50f1e5e45c485e75a8a0bc855a36882304de8cc0d76a990a69b52bb168322\" pid:5476 exited_at:{seconds:1757000809 nanos:760668233}" Sep 4 15:46:51.754414 containerd[1628]: time="2025-09-04T15:46:51.754384917Z" level=info msg="TaskExit event in podsandbox handler container_id:\"aba84d60aaf74a3bcd8fbccc7fa72570a016d00a138d73aada6d083e4ab8c718\" id:\"ff585bdd7e6ce0056e3b1971716d44b7ad23120729abd4840dfe3e1f0777f000\" pid:5497 exited_at:{seconds:1757000811 nanos:753996088}" Sep 4 15:46:54.112379 systemd[1]: Started sshd@13-139.178.70.109:22-139.178.89.65:47978.service - OpenSSH per-connection server daemon (139.178.89.65:47978). Sep 4 15:46:54.258577 sshd[5509]: Accepted publickey for core from 139.178.89.65 port 47978 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs Sep 4 15:46:54.280614 sshd-session[5509]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:46:54.287670 systemd-logind[1608]: New session 16 of user core. Sep 4 15:46:54.290307 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 4 15:46:54.717173 sshd[5512]: Connection closed by 139.178.89.65 port 47978 Sep 4 15:46:54.719709 systemd[1]: sshd@13-139.178.70.109:22-139.178.89.65:47978.service: Deactivated successfully. Sep 4 15:46:54.717497 sshd-session[5509]: pam_unix(sshd:session): session closed for user core Sep 4 15:46:54.721507 systemd[1]: session-16.scope: Deactivated successfully. Sep 4 15:46:54.722489 systemd-logind[1608]: Session 16 logged out. Waiting for processes to exit. Sep 4 15:46:54.724618 systemd-logind[1608]: Removed session 16. Sep 4 15:46:57.424138 containerd[1628]: time="2025-09-04T15:46:57.424104740Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6e6b30542af47fd473788b40328795aabe78598470a776ca86c8c002221cadc7\" id:\"d0bc2e4146e1d08c934f6a25364910285531fad829e2e58d7d3d623fa3e8b328\" pid:5535 exited_at:{seconds:1757000817 nanos:423807479}" Sep 4 15:46:58.157733 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3868388264.mount: Deactivated successfully. 
Sep 4 15:46:58.508373 containerd[1628]: time="2025-09-04T15:46:58.508154998Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:46:58.520109 containerd[1628]: time="2025-09-04T15:46:58.519979980Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 4 15:46:58.531798 containerd[1628]: time="2025-09-04T15:46:58.531599404Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:46:58.548114 containerd[1628]: time="2025-09-04T15:46:58.548086418Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:46:58.549191 containerd[1628]: time="2025-09-04T15:46:58.549109771Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 11.757330787s" Sep 4 15:46:58.549191 containerd[1628]: time="2025-09-04T15:46:58.549131847Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 4 15:46:58.775818 containerd[1628]: time="2025-09-04T15:46:58.775617708Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 4 15:46:59.159426 containerd[1628]: time="2025-09-04T15:46:59.159293147Z" level=info msg="CreateContainer within sandbox \"5723ad158e83b9a82ab9af0fa4c031256111656fba9dbc92caf9232aa7545fba\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 4 15:46:59.177546 containerd[1628]: time="2025-09-04T15:46:59.177384185Z" level=info msg="Container 5b42173acdb6ff197ebbc72adf96b009d6e472ed387e0086dba54f4a979693ea: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:46:59.294567 containerd[1628]: time="2025-09-04T15:46:59.294539989Z" level=info msg="CreateContainer within sandbox \"5723ad158e83b9a82ab9af0fa4c031256111656fba9dbc92caf9232aa7545fba\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"5b42173acdb6ff197ebbc72adf96b009d6e472ed387e0086dba54f4a979693ea\"" Sep 4 15:46:59.302003 containerd[1628]: time="2025-09-04T15:46:59.301974425Z" level=info msg="StartContainer for \"5b42173acdb6ff197ebbc72adf96b009d6e472ed387e0086dba54f4a979693ea\"" Sep 4 15:46:59.312970 containerd[1628]: time="2025-09-04T15:46:59.312920165Z" level=info msg="connecting to shim 5b42173acdb6ff197ebbc72adf96b009d6e472ed387e0086dba54f4a979693ea" address="unix:///run/containerd/s/45a0127192281221f4cb99275a2fe6a1af30cd183050c721a04f9ac8d37a3805" protocol=ttrpc version=3 Sep 4 15:46:59.396377 systemd[1]: Started cri-containerd-5b42173acdb6ff197ebbc72adf96b009d6e472ed387e0086dba54f4a979693ea.scope - libcontainer container 5b42173acdb6ff197ebbc72adf96b009d6e472ed387e0086dba54f4a979693ea. 
Sep 4 15:46:59.476435 containerd[1628]: time="2025-09-04T15:46:59.476406206Z" level=info msg="StartContainer for \"5b42173acdb6ff197ebbc72adf96b009d6e472ed387e0086dba54f4a979693ea\" returns successfully" Sep 4 15:46:59.749061 systemd[1]: Started sshd@14-139.178.70.109:22-139.178.89.65:47992.service - OpenSSH per-connection server daemon (139.178.89.65:47992). Sep 4 15:46:59.970810 sshd[5635]: Accepted publickey for core from 139.178.89.65 port 47992 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs Sep 4 15:46:59.977731 sshd-session[5635]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:46:59.990201 systemd-logind[1608]: New session 17 of user core. Sep 4 15:46:59.996679 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 4 15:47:00.780286 containerd[1628]: time="2025-09-04T15:47:00.780230769Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7c4c03986e7fa0e5c4db45fb08894988ab119e760c7350f55eaaa6a8a02cbfef\" id:\"04712b3fcf7c565f1432b4a3b49a9ac83aaa2d8a5bf6dcf30430a16a2ab560dd\" pid:5568 exit_status:1 exited_at:{seconds:1757000820 nanos:752112904}" Sep 4 15:47:00.802670 containerd[1628]: time="2025-09-04T15:47:00.802478366Z" level=info msg="TaskExit event in podsandbox handler container_id:\"aba84d60aaf74a3bcd8fbccc7fa72570a016d00a138d73aada6d083e4ab8c718\" id:\"37908cdd585f1cc925a83d4b358ffb143f9512f7a1a6471b743a77e4ac4ce5e5\" pid:5614 exited_at:{seconds:1757000820 nanos:801369119}" Sep 4 15:47:01.347086 kubelet[2941]: I0904 15:47:01.267841 2941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-64dbf6f99d-5tw6p" podStartSLOduration=4.683867922 podStartE2EDuration="1m2.94709898s" podCreationTimestamp="2025-09-04 15:45:58 +0000 UTC" firstStartedPulling="2025-09-04 15:46:00.467748909 +0000 UTC m=+53.497440630" lastFinishedPulling="2025-09-04 15:46:58.730979963 +0000 UTC m=+111.760671688" observedRunningTime="2025-09-04 15:47:00.627808303 +0000 UTC m=+113.657500032" watchObservedRunningTime="2025-09-04 15:47:00.94709898 +0000 UTC m=+113.976790709" Sep 4 15:47:02.039886 sshd[5638]: Connection closed by 139.178.89.65 port 47992 Sep 4 15:47:02.045762 sshd-session[5635]: pam_unix(sshd:session): session closed for user core Sep 4 15:47:02.056529 systemd[1]: sshd@14-139.178.70.109:22-139.178.89.65:47992.service: Deactivated successfully. Sep 4 15:47:02.059691 systemd[1]: session-17.scope: Deactivated successfully. Sep 4 15:47:02.062210 systemd-logind[1608]: Session 17 logged out. Waiting for processes to exit. Sep 4 15:47:02.063661 systemd-logind[1608]: Removed session 17. Sep 4 15:47:07.072239 systemd[1]: Started sshd@15-139.178.70.109:22-139.178.89.65:45550.service - OpenSSH per-connection server daemon (139.178.89.65:45550). Sep 4 15:47:07.273420 sshd[5658]: Accepted publickey for core from 139.178.89.65 port 45550 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs Sep 4 15:47:07.285089 sshd-session[5658]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:47:07.299867 systemd-logind[1608]: New session 18 of user core. Sep 4 15:47:07.305332 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 4 15:47:07.545594 sshd[5663]: Connection closed by 139.178.89.65 port 45550 Sep 4 15:47:07.546247 sshd-session[5658]: pam_unix(sshd:session): session closed for user core Sep 4 15:47:07.548615 systemd[1]: sshd@15-139.178.70.109:22-139.178.89.65:45550.service: Deactivated successfully. 
Sep 4 15:47:07.550025 systemd[1]: session-18.scope: Deactivated successfully. Sep 4 15:47:07.550707 systemd-logind[1608]: Session 18 logged out. Waiting for processes to exit. Sep 4 15:47:07.552125 systemd-logind[1608]: Removed session 18. Sep 4 15:47:12.561849 systemd[1]: Started sshd@16-139.178.70.109:22-139.178.89.65:60240.service - OpenSSH per-connection server daemon (139.178.89.65:60240). Sep 4 15:47:12.633592 sshd[5677]: Accepted publickey for core from 139.178.89.65 port 60240 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs Sep 4 15:47:12.634634 sshd-session[5677]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:47:12.638710 systemd-logind[1608]: New session 19 of user core. Sep 4 15:47:12.647421 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 4 15:47:12.868059 sshd[5680]: Connection closed by 139.178.89.65 port 60240 Sep 4 15:47:12.868582 sshd-session[5677]: pam_unix(sshd:session): session closed for user core Sep 4 15:47:12.872618 systemd[1]: sshd@16-139.178.70.109:22-139.178.89.65:60240.service: Deactivated successfully. Sep 4 15:47:12.874126 systemd[1]: session-19.scope: Deactivated successfully. Sep 4 15:47:12.875496 systemd-logind[1608]: Session 19 logged out. Waiting for processes to exit. Sep 4 15:47:12.877264 systemd-logind[1608]: Removed session 19. Sep 4 15:47:15.370803 containerd[1628]: time="2025-09-04T15:47:15.370580687Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 4 15:47:15.371991 containerd[1628]: time="2025-09-04T15:47:15.371806812Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:47:15.379906 containerd[1628]: time="2025-09-04T15:47:15.379878266Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:47:15.382842 containerd[1628]: time="2025-09-04T15:47:15.382819226Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:47:15.384263 containerd[1628]: time="2025-09-04T15:47:15.384243159Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 16.607023168s" Sep 4 15:47:15.384804 containerd[1628]: time="2025-09-04T15:47:15.384265364Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 4 15:47:15.482033 containerd[1628]: time="2025-09-04T15:47:15.481956772Z" level=info msg="CreateContainer within sandbox \"a05618bb495a0ece0a4f864e0d7c318400eb3a55ace8c3c668ff19b3c81bb39f\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 4 15:47:15.499842 containerd[1628]: time="2025-09-04T15:47:15.499718300Z" level=info msg="Container ad826d0fba9f82c253687dc10e4a5f42d306d10b375c400cf1d0546cea3286bf: CDI 
devices from CRI Config.CDIDevices: []" Sep 4 15:47:15.553262 containerd[1628]: time="2025-09-04T15:47:15.553233197Z" level=info msg="CreateContainer within sandbox \"a05618bb495a0ece0a4f864e0d7c318400eb3a55ace8c3c668ff19b3c81bb39f\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"ad826d0fba9f82c253687dc10e4a5f42d306d10b375c400cf1d0546cea3286bf\"" Sep 4 15:47:15.554573 containerd[1628]: time="2025-09-04T15:47:15.553577052Z" level=info msg="StartContainer for \"ad826d0fba9f82c253687dc10e4a5f42d306d10b375c400cf1d0546cea3286bf\"" Sep 4 15:47:15.561200 containerd[1628]: time="2025-09-04T15:47:15.561175724Z" level=info msg="connecting to shim ad826d0fba9f82c253687dc10e4a5f42d306d10b375c400cf1d0546cea3286bf" address="unix:///run/containerd/s/6a3c62dba4fccb4f65950122ad166644f890f13585bb59078f7e4ed8e94ff152" protocol=ttrpc version=3 Sep 4 15:47:15.678344 systemd[1]: Started cri-containerd-ad826d0fba9f82c253687dc10e4a5f42d306d10b375c400cf1d0546cea3286bf.scope - libcontainer container ad826d0fba9f82c253687dc10e4a5f42d306d10b375c400cf1d0546cea3286bf. Sep 4 15:47:15.757186 containerd[1628]: time="2025-09-04T15:47:15.757142724Z" level=info msg="StartContainer for \"ad826d0fba9f82c253687dc10e4a5f42d306d10b375c400cf1d0546cea3286bf\" returns successfully" Sep 4 15:47:16.476970 kubelet[2941]: I0904 15:47:16.474809 2941 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 4 15:47:16.476970 kubelet[2941]: I0904 15:47:16.476851 2941 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 4 15:47:17.878452 systemd[1]: Started sshd@17-139.178.70.109:22-139.178.89.65:60242.service - OpenSSH per-connection server daemon (139.178.89.65:60242). Sep 4 15:47:18.057331 sshd[5732]: Accepted publickey for core from 139.178.89.65 port 60242 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs Sep 4 15:47:18.059507 sshd-session[5732]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:47:18.065681 systemd-logind[1608]: New session 20 of user core. Sep 4 15:47:18.073311 systemd[1]: Started session-20.scope - Session 20 of User core. Sep 4 15:47:18.503690 sshd[5735]: Connection closed by 139.178.89.65 port 60242 Sep 4 15:47:18.503345 sshd-session[5732]: pam_unix(sshd:session): session closed for user core Sep 4 15:47:18.505859 systemd[1]: sshd@17-139.178.70.109:22-139.178.89.65:60242.service: Deactivated successfully. Sep 4 15:47:18.507322 systemd[1]: session-20.scope: Deactivated successfully. Sep 4 15:47:18.507907 systemd-logind[1608]: Session 20 logged out. Waiting for processes to exit. Sep 4 15:47:18.508687 systemd-logind[1608]: Removed session 20. 
Sep 4 15:47:19.750595 containerd[1628]: time="2025-09-04T15:47:19.750564688Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6e6b30542af47fd473788b40328795aabe78598470a776ca86c8c002221cadc7\" id:\"0f8483e4f08df26f1031d38a65a20414630cd4fc075ec3253f22ccbc7960b42c\" pid:5758 exited_at:{seconds:1757000839 nanos:746362124}" Sep 4 15:47:21.836640 containerd[1628]: time="2025-09-04T15:47:21.836607580Z" level=info msg="TaskExit event in podsandbox handler container_id:\"aba84d60aaf74a3bcd8fbccc7fa72570a016d00a138d73aada6d083e4ab8c718\" id:\"e1776d1ee6b0af87bbfd3161ee9868dee6d70cd11985ca441aa5fdeaa01d779b\" pid:5785 exited_at:{seconds:1757000841 nanos:836144601}" Sep 4 15:47:23.513486 systemd[1]: Started sshd@18-139.178.70.109:22-139.178.89.65:46256.service - OpenSSH per-connection server daemon (139.178.89.65:46256). Sep 4 15:47:23.573964 sshd[5796]: Accepted publickey for core from 139.178.89.65 port 46256 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs Sep 4 15:47:23.575063 sshd-session[5796]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:47:23.578690 systemd-logind[1608]: New session 21 of user core. Sep 4 15:47:23.588484 systemd[1]: Started session-21.scope - Session 21 of User core. Sep 4 15:47:23.882568 sshd[5799]: Connection closed by 139.178.89.65 port 46256 Sep 4 15:47:23.883069 sshd-session[5796]: pam_unix(sshd:session): session closed for user core Sep 4 15:47:23.885547 systemd[1]: sshd@18-139.178.70.109:22-139.178.89.65:46256.service: Deactivated successfully. Sep 4 15:47:23.886547 systemd[1]: session-21.scope: Deactivated successfully. Sep 4 15:47:23.887292 systemd-logind[1608]: Session 21 logged out. Waiting for processes to exit. Sep 4 15:47:23.888276 systemd-logind[1608]: Removed session 21. Sep 4 15:47:28.907746 systemd[1]: Started sshd@19-139.178.70.109:22-139.178.89.65:46270.service - OpenSSH per-connection server daemon (139.178.89.65:46270). Sep 4 15:47:29.000390 sshd[5819]: Accepted publickey for core from 139.178.89.65 port 46270 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs Sep 4 15:47:29.001336 sshd-session[5819]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:47:29.005528 systemd-logind[1608]: New session 22 of user core. Sep 4 15:47:29.009382 systemd[1]: Started session-22.scope - Session 22 of User core. Sep 4 15:47:29.792311 containerd[1628]: time="2025-09-04T15:47:29.792181302Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7c4c03986e7fa0e5c4db45fb08894988ab119e760c7350f55eaaa6a8a02cbfef\" id:\"5d9471f2d052bcc3dfbaf9ff3c1d43ad632a03cbb2d96e09c380365286d61555\" pid:5841 exited_at:{seconds:1757000849 nanos:791996947}" Sep 4 15:47:30.186898 sshd[5822]: Connection closed by 139.178.89.65 port 46270 Sep 4 15:47:30.189796 sshd-session[5819]: pam_unix(sshd:session): session closed for user core Sep 4 15:47:30.196705 systemd[1]: Started sshd@20-139.178.70.109:22-139.178.89.65:38990.service - OpenSSH per-connection server daemon (139.178.89.65:38990). Sep 4 15:47:30.200958 systemd[1]: sshd@19-139.178.70.109:22-139.178.89.65:46270.service: Deactivated successfully. Sep 4 15:47:30.202053 systemd[1]: session-22.scope: Deactivated successfully. Sep 4 15:47:30.202991 systemd-logind[1608]: Session 22 logged out. Waiting for processes to exit. Sep 4 15:47:30.204052 systemd-logind[1608]: Removed session 22. 
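The exited_at field in these TaskExit events is a protobuf-style timestamp, seconds plus nanoseconds since the Unix epoch, and an absent exit_status means the process exited 0. The event ids differ from the container_id, so these look like exec processes run inside the goldmane and kube-controllers containers, most plausibly periodic probes, though the log itself does not say. The values below are copied from entries in this section, with container ids truncated; converting them reproduces the journal timestamps.

```go
package main

import (
	"fmt"
	"time"
)

type taskExit struct {
	containerID string
	pid         int
	exitStatus  int
	seconds     int64
	nanos       int64
}

func main() {
	events := []taskExit{
		{"6e6b30542af4", 5758, 0, 1757000839, 746362124}, // kube-controllers exec
		{"aba84d60aaf7", 5785, 0, 1757000841, 836144601}, // goldmane exec
		{"aba84d60aaf7", 5118, 1, 1757000780, 543546601}, // goldmane exec with exit_status:1
	}
	for _, e := range events {
		when := time.Unix(e.seconds, e.nanos).UTC()
		note := "ok"
		if e.exitStatus != 0 {
			note = fmt.Sprintf("non-zero exit (%d)", e.exitStatus)
		}
		fmt.Printf("%s pid %-5d exited at %s: %s\n",
			e.containerID, e.pid, when.Format("2006-01-02 15:04:05.000"), note)
	}
}
```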
Sep 4 15:47:30.272639 sshd[5866]: Accepted publickey for core from 139.178.89.65 port 38990 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs
Sep 4 15:47:30.290499 sshd-session[5866]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:47:30.298065 systemd-logind[1608]: New session 23 of user core.
Sep 4 15:47:30.303326 systemd[1]: Started session-23.scope - Session 23 of User core.
Sep 4 15:47:30.741332 sshd[5872]: Connection closed by 139.178.89.65 port 38990
Sep 4 15:47:30.743364 sshd-session[5866]: pam_unix(sshd:session): session closed for user core
Sep 4 15:47:30.748847 systemd[1]: sshd@20-139.178.70.109:22-139.178.89.65:38990.service: Deactivated successfully.
Sep 4 15:47:30.750657 systemd[1]: session-23.scope: Deactivated successfully.
Sep 4 15:47:30.752970 systemd-logind[1608]: Session 23 logged out. Waiting for processes to exit.
Sep 4 15:47:30.754745 systemd[1]: Started sshd@21-139.178.70.109:22-139.178.89.65:39002.service - OpenSSH per-connection server daemon (139.178.89.65:39002).
Sep 4 15:47:30.755258 systemd-logind[1608]: Removed session 23.
Sep 4 15:47:30.819907 sshd[5890]: Accepted publickey for core from 139.178.89.65 port 39002 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs
Sep 4 15:47:30.820636 sshd-session[5890]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:47:30.823166 systemd-logind[1608]: New session 24 of user core.
Sep 4 15:47:30.826298 systemd[1]: Started session-24.scope - Session 24 of User core.
Sep 4 15:47:31.841731 sshd[5893]: Connection closed by 139.178.89.65 port 39002
Sep 4 15:47:31.842311 sshd-session[5890]: pam_unix(sshd:session): session closed for user core
Sep 4 15:47:31.849698 systemd[1]: sshd@21-139.178.70.109:22-139.178.89.65:39002.service: Deactivated successfully.
Sep 4 15:47:31.851396 systemd[1]: session-24.scope: Deactivated successfully.
Sep 4 15:47:31.852995 systemd-logind[1608]: Session 24 logged out. Waiting for processes to exit.
Sep 4 15:47:31.854596 systemd[1]: Started sshd@22-139.178.70.109:22-139.178.89.65:39004.service - OpenSSH per-connection server daemon (139.178.89.65:39004).
Sep 4 15:47:31.857598 systemd-logind[1608]: Removed session 24.
Sep 4 15:47:31.963809 sshd[5908]: Accepted publickey for core from 139.178.89.65 port 39004 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs
Sep 4 15:47:31.964909 sshd-session[5908]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:47:31.970365 systemd-logind[1608]: New session 25 of user core.
Sep 4 15:47:31.973316 systemd[1]: Started session-25.scope - Session 25 of User core.
Sep 4 15:47:33.133100 sshd[5915]: Connection closed by 139.178.89.65 port 39004
Sep 4 15:47:33.145307 systemd[1]: Started sshd@23-139.178.70.109:22-139.178.89.65:39010.service - OpenSSH per-connection server daemon (139.178.89.65:39010).
Sep 4 15:47:33.142177 sshd-session[5908]: pam_unix(sshd:session): session closed for user core
Sep 4 15:47:33.145590 systemd[1]: sshd@22-139.178.70.109:22-139.178.89.65:39004.service: Deactivated successfully.
Sep 4 15:47:33.146631 systemd[1]: session-25.scope: Deactivated successfully.
Sep 4 15:47:33.147577 systemd-logind[1608]: Session 25 logged out. Waiting for processes to exit.
Sep 4 15:47:33.148715 systemd-logind[1608]: Removed session 25.
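Each "Accepted publickey" line above identifies the client key by its SHA256 fingerprint. A minimal sketch, assuming a hypothetical authorized_keys path for the core user (or any public-key file passed as an argument), of reproducing that key-type plus fingerprint pair with golang.org/x/crypto/ssh:

```go
// fingerprint.go - illustrative sketch: print the key type and SHA256
// fingerprint for a public key, the same fields sshd records in its
// "Accepted publickey" entries above.
package main

import (
	"fmt"
	"log"
	"os"

	"golang.org/x/crypto/ssh"
)

func main() {
	// Assumed default path for the core user; override on the command line.
	path := "/home/core/.ssh/authorized_keys"
	if len(os.Args) > 1 {
		path = os.Args[1]
	}

	data, err := os.ReadFile(path)
	if err != nil {
		log.Fatalf("read %s: %v", path, err)
	}

	// Parse the first key in the file and print its type, SHA256 fingerprint,
	// and comment.
	pub, comment, _, _, err := ssh.ParseAuthorizedKey(data)
	if err != nil {
		log.Fatalf("parse key: %v", err)
	}
	fmt.Printf("%s %s (%s)\n", pub.Type(), ssh.FingerprintSHA256(pub), comment)
}
```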
Sep 4 15:47:33.488681 sshd[5924]: Accepted publickey for core from 139.178.89.65 port 39010 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs
Sep 4 15:47:33.490365 sshd-session[5924]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:47:33.494105 systemd-logind[1608]: New session 26 of user core.
Sep 4 15:47:33.499402 systemd[1]: Started session-26.scope - Session 26 of User core.
Sep 4 15:47:33.957134 sshd[5931]: Connection closed by 139.178.89.65 port 39010
Sep 4 15:47:33.959987 systemd[1]: sshd@23-139.178.70.109:22-139.178.89.65:39010.service: Deactivated successfully.
Sep 4 15:47:33.957362 sshd-session[5924]: pam_unix(sshd:session): session closed for user core
Sep 4 15:47:33.961164 systemd[1]: session-26.scope: Deactivated successfully.
Sep 4 15:47:33.962058 systemd-logind[1608]: Session 26 logged out. Waiting for processes to exit.
Sep 4 15:47:33.963069 systemd-logind[1608]: Removed session 26.
Sep 4 15:47:38.978526 systemd[1]: Started sshd@24-139.178.70.109:22-139.178.89.65:39016.service - OpenSSH per-connection server daemon (139.178.89.65:39016).
Sep 4 15:47:39.114757 sshd[5957]: Accepted publickey for core from 139.178.89.65 port 39016 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs
Sep 4 15:47:39.116583 sshd-session[5957]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:47:39.121176 systemd-logind[1608]: New session 27 of user core.
Sep 4 15:47:39.127406 systemd[1]: Started session-27.scope - Session 27 of User core.
Sep 4 15:47:40.150166 sshd[5962]: Connection closed by 139.178.89.65 port 39016
Sep 4 15:47:40.153095 systemd[1]: sshd@24-139.178.70.109:22-139.178.89.65:39016.service: Deactivated successfully.
Sep 4 15:47:40.151230 sshd-session[5957]: pam_unix(sshd:session): session closed for user core
Sep 4 15:47:40.154393 systemd[1]: session-27.scope: Deactivated successfully.
Sep 4 15:47:40.158331 systemd-logind[1608]: Session 27 logged out. Waiting for processes to exit.
Sep 4 15:47:40.159787 systemd-logind[1608]: Removed session 27.
Sep 4 15:47:45.178292 systemd[1]: Started sshd@25-139.178.70.109:22-139.178.89.65:43026.service - OpenSSH per-connection server daemon (139.178.89.65:43026).
Sep 4 15:47:45.288695 sshd[5977]: Accepted publickey for core from 139.178.89.65 port 43026 ssh2: RSA SHA256:Zdt4Lmd4UWBoVTYX6iY9bREAQ4tI+ZqegShvHWyXtVs
Sep 4 15:47:45.293041 sshd-session[5977]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:47:45.296415 systemd-logind[1608]: New session 28 of user core.
Sep 4 15:47:45.300348 systemd[1]: Started session-28.scope - Session 28 of User core.
Sep 4 15:47:46.117651 sshd[5980]: Connection closed by 139.178.89.65 port 43026
Sep 4 15:47:46.117582 sshd-session[5977]: pam_unix(sshd:session): session closed for user core
Sep 4 15:47:46.122496 systemd[1]: sshd@25-139.178.70.109:22-139.178.89.65:43026.service: Deactivated successfully.
Sep 4 15:47:46.123987 systemd[1]: session-28.scope: Deactivated successfully.
Sep 4 15:47:46.126274 systemd-logind[1608]: Session 28 logged out. Waiting for processes to exit.
Sep 4 15:47:46.127900 systemd-logind[1608]: Removed session 28.
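The systemd-logind entries above track each SSH login as a numbered session backed by a session-N.scope unit. A minimal sketch, purely as illustrative plumbing around the standard loginctl CLI, of inspecting the same session state live on the node:

```go
// sessions.go - illustrative sketch: list the logind sessions that correspond
// to the session-N.scope units started and removed in the log above.
package main

import (
	"log"
	"os"
	"os/exec"
)

func main() {
	// Equivalent to running `loginctl list-sessions` in a shell; each row is
	// one logind session like those numbered 20-28 above.
	cmd := exec.Command("loginctl", "list-sessions")
	cmd.Stdout = os.Stdout
	cmd.Stderr = os.Stderr
	if err := cmd.Run(); err != nil {
		log.Fatalf("loginctl: %v", err)
	}
}
```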