Sep 5 06:03:09.731471 kernel: Linux version 6.12.44-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Fri Sep 5 04:19:33 -00 2025
Sep 5 06:03:09.731488 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=4b2174e9c368fa97600991ce20efc370fbbf3ddfce3ea407f50212a1021bd496
Sep 5 06:03:09.731495 kernel: Disabled fast string operations
Sep 5 06:03:09.731499 kernel: BIOS-provided physical RAM map:
Sep 5 06:03:09.731503 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable
Sep 5 06:03:09.731507 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved
Sep 5 06:03:09.731513 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved
Sep 5 06:03:09.731517 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable
Sep 5 06:03:09.731521 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data
Sep 5 06:03:09.731526 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS
Sep 5 06:03:09.731530 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable
Sep 5 06:03:09.731534 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved
Sep 5 06:03:09.731538 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved
Sep 5 06:03:09.731543 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Sep 5 06:03:09.731549 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved
Sep 5 06:03:09.731554 kernel: NX (Execute Disable) protection: active
Sep 5 06:03:09.731573 kernel: APIC: Static calls initialized
Sep 5 06:03:09.731578 kernel: SMBIOS 2.7 present.
Sep 5 06:03:09.731583 kernel: DMI: VMware, Inc. VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020
Sep 5 06:03:09.733578 kernel: DMI: Memory slots populated: 1/128
Sep 5 06:03:09.733591 kernel: vmware: hypercall mode: 0x00
Sep 5 06:03:09.733600 kernel: Hypervisor detected: VMware
Sep 5 06:03:09.733608 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz
Sep 5 06:03:09.733615 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz
Sep 5 06:03:09.733620 kernel: vmware: using clock offset of 3566088543 ns
Sep 5 06:03:09.733625 kernel: tsc: Detected 3408.000 MHz processor
Sep 5 06:03:09.733631 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Sep 5 06:03:09.733636 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Sep 5 06:03:09.733641 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000
Sep 5 06:03:09.733646 kernel: total RAM covered: 3072M
Sep 5 06:03:09.733653 kernel: Found optimal setting for mtrr clean up
Sep 5 06:03:09.733660 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G
Sep 5 06:03:09.733668 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs
Sep 5 06:03:09.733673 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Sep 5 06:03:09.733678 kernel: Using GB pages for direct mapping
Sep 5 06:03:09.733683 kernel: ACPI: Early table checksum verification disabled
Sep 5 06:03:09.733688 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD )
Sep 5 06:03:09.733693 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272)
Sep 5 06:03:09.733698 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240)
Sep 5 06:03:09.733705 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001)
Sep 5 06:03:09.733711 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Sep 5 06:03:09.733716 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Sep 5 06:03:09.733722 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001)
Sep 5 06:03:09.733727 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? APIC 06040000 LTP 00000000)
Sep 5 06:03:09.733733 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001)
Sep 5 06:03:09.733739 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001)
Sep 5 06:03:09.733744 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001)
Sep 5 06:03:09.733749 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001)
Sep 5 06:03:09.733754 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66]
Sep 5 06:03:09.733760 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72]
Sep 5 06:03:09.733765 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Sep 5 06:03:09.733770 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Sep 5 06:03:09.733775 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54]
Sep 5 06:03:09.733780 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c]
Sep 5 06:03:09.733787 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea]
Sep 5 06:03:09.733792 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe]
Sep 5 06:03:09.733797 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756]
Sep 5 06:03:09.733802 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e]
Sep 5 06:03:09.733807 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Sep 5 06:03:09.733812 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Sep 5 06:03:09.733817 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug
Sep 5 06:03:09.733822 kernel: NUMA: Node 0 [mem 0x00001000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00001000-0x7fffffff]
Sep 5 06:03:09.733828 kernel: NODE_DATA(0) allocated [mem 0x7fff8dc0-0x7fffffff]
Sep 5 06:03:09.733834 kernel: Zone ranges:
Sep 5 06:03:09.733839 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Sep 5 06:03:09.733845 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff]
Sep 5 06:03:09.733850 kernel: Normal empty
Sep 5 06:03:09.733855 kernel: Device empty
Sep 5 06:03:09.733860 kernel: Movable zone start for each node
Sep 5 06:03:09.733865 kernel: Early memory node ranges
Sep 5 06:03:09.733870 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff]
Sep 5 06:03:09.733875 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff]
Sep 5 06:03:09.733881 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff]
Sep 5 06:03:09.733887 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff]
Sep 5 06:03:09.733892 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 5 06:03:09.733897 kernel: On node 0, zone DMA: 98 pages in unavailable ranges
Sep 5 06:03:09.733902 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges
Sep 5 06:03:09.733907 kernel: ACPI: PM-Timer IO Port: 0x1008
Sep 5 06:03:09.733912 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1])
Sep 5 06:03:09.733917 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1])
Sep 5 06:03:09.733923 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1])
Sep 5 06:03:09.733928 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1])
Sep 5 06:03:09.733934 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1])
Sep 5 06:03:09.733939 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1])
Sep 5 06:03:09.733944 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1])
Sep 5 06:03:09.733949 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1])
Sep 5 06:03:09.733954 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1])
Sep 5 06:03:09.733959 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1])
Sep 5 06:03:09.733965 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1])
Sep 5 06:03:09.733969 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1])
Sep 5 06:03:09.733975 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1])
Sep 5 06:03:09.733981 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1])
Sep 5 06:03:09.733986 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1])
Sep 5 06:03:09.733991 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1])
Sep 5 06:03:09.733996 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1])
Sep 5 06:03:09.734001 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1])
Sep 5 06:03:09.734006 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1])
Sep 5 06:03:09.734011 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1])
Sep 5 06:03:09.734017 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1])
Sep 5 06:03:09.734022 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1])
Sep 5 06:03:09.734027 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1])
Sep 5 06:03:09.734033 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1])
Sep 5 06:03:09.734038 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1])
Sep 5 06:03:09.734043 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1])
Sep 5 06:03:09.734049 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1])
Sep 5 06:03:09.734054 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1])
Sep 5 06:03:09.734059 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1])
Sep 5 06:03:09.734064 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1])
Sep 5 06:03:09.734069 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1])
Sep 5 06:03:09.734074 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1])
Sep 5 06:03:09.734080 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1])
Sep 5 06:03:09.734085 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1])
Sep 5 06:03:09.734090 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1])
Sep 5 06:03:09.734095 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1])
Sep 5 06:03:09.734100 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1])
Sep 5 06:03:09.734106 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1])
Sep 5 06:03:09.734111 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1])
Sep 5 06:03:09.734116 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1])
Sep 5 06:03:09.734125 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1])
Sep 5 06:03:09.734131 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1])
Sep 5 06:03:09.734136 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1])
Sep 5 06:03:09.734142 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1])
Sep 5 06:03:09.734148 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1])
Sep 5 06:03:09.734153 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1])
Sep 5 06:03:09.734159 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1])
Sep 5 06:03:09.734164 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1])
Sep 5 06:03:09.734169 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1])
Sep 5 06:03:09.734176 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1])
Sep 5 06:03:09.734182 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1])
Sep 5 06:03:09.734187 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1])
Sep 5 06:03:09.734192 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1])
Sep 5 06:03:09.734198 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1])
Sep 5 06:03:09.734203 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1])
Sep 5 06:03:09.734208 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1])
Sep 5 06:03:09.734214 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1])
Sep 5 06:03:09.734219 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1])
Sep 5 06:03:09.734225 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1])
Sep 5 06:03:09.734231 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1])
Sep 5 06:03:09.734237 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1])
Sep 5 06:03:09.734242 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1])
Sep 5 06:03:09.734247 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1])
Sep 5 06:03:09.734253 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1])
Sep 5 06:03:09.734258 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1])
Sep 5 06:03:09.734263 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1])
Sep 5 06:03:09.734269 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1])
Sep 5 06:03:09.734274 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1])
Sep 5 06:03:09.734279 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1])
Sep 5 06:03:09.734286 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1])
Sep 5 06:03:09.734291 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1])
Sep 5 06:03:09.734296 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1])
Sep 5 06:03:09.734302 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1])
Sep 5 06:03:09.734307 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1])
Sep 5 06:03:09.734312 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1])
Sep 5 06:03:09.734320 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1])
Sep 5 06:03:09.734326 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1])
Sep 5 06:03:09.734331 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1])
Sep 5 06:03:09.734338 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1])
Sep 5 06:03:09.734343 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1])
Sep 5 06:03:09.734349 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1])
Sep 5 06:03:09.734354 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1])
Sep 5 06:03:09.734359 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1])
Sep 5 06:03:09.734365 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1])
Sep 5 06:03:09.734372 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1])
Sep 5 06:03:09.734378 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1])
Sep 5 06:03:09.734387 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1])
Sep 5 06:03:09.734394 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1])
Sep 5 06:03:09.734402 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1])
Sep 5 06:03:09.734407 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1])
Sep 5 06:03:09.734412 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1])
Sep 5 06:03:09.734418 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1])
Sep 5 06:03:09.734423 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1])
Sep 5 06:03:09.734429 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1])
Sep 5 06:03:09.734434 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1])
Sep 5 06:03:09.734439 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1])
Sep 5 06:03:09.734445 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1])
Sep 5 06:03:09.734450 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1])
Sep 5 06:03:09.734457 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1])
Sep 5 06:03:09.734462 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1])
Sep 5 06:03:09.734467 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1])
Sep 5 06:03:09.734473 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1])
Sep 5 06:03:09.734478 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1])
Sep 5 06:03:09.734483 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1])
Sep 5 06:03:09.734489 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1])
Sep 5 06:03:09.734494 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1])
Sep 5 06:03:09.734500 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1])
Sep 5 06:03:09.734506 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1])
Sep 5 06:03:09.734512 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1])
Sep 5 06:03:09.734517 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1])
Sep 5 06:03:09.734522 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1])
Sep 5 06:03:09.734528 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1])
Sep 5 06:03:09.734533 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1])
Sep 5 06:03:09.734538 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1])
Sep 5 06:03:09.734544 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1])
Sep 5 06:03:09.734549 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1])
Sep 5 06:03:09.734566 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1])
Sep 5 06:03:09.734574 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1])
Sep 5 06:03:09.734579 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1])
Sep 5 06:03:09.734584 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1])
Sep 5 06:03:09.734590 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1])
Sep 5 06:03:09.734595 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1])
Sep 5 06:03:09.734600 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1])
Sep 5 06:03:09.734606 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1])
Sep 5 06:03:09.734611 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1])
Sep 5 06:03:09.734616 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1])
Sep 5 06:03:09.734623 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1])
Sep 5 06:03:09.734628 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1])
Sep 5 06:03:09.734634 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23
Sep 5 06:03:09.734639 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge)
Sep 5 06:03:09.734645 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Sep 5 06:03:09.734650 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000
Sep 5 06:03:09.734656 kernel: TSC deadline timer available
Sep 5 06:03:09.734661 kernel: CPU topo: Max. logical packages: 128
Sep 5 06:03:09.734667 kernel: CPU topo: Max. logical dies: 128
Sep 5 06:03:09.734672 kernel: CPU topo: Max. dies per package: 1
Sep 5 06:03:09.734679 kernel: CPU topo: Max. threads per core: 1
Sep 5 06:03:09.734684 kernel: CPU topo: Num. cores per package: 1
Sep 5 06:03:09.734689 kernel: CPU topo: Num. threads per package: 1
Sep 5 06:03:09.734695 kernel: CPU topo: Allowing 2 present CPUs plus 126 hotplug CPUs
Sep 5 06:03:09.734700 kernel: [mem 0x80000000-0xefffffff] available for PCI devices
Sep 5 06:03:09.734706 kernel: Booting paravirtualized kernel on VMware hypervisor
Sep 5 06:03:09.734711 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Sep 5 06:03:09.734717 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1
Sep 5 06:03:09.734722 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144
Sep 5 06:03:09.734729 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152
Sep 5 06:03:09.734734 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007
Sep 5 06:03:09.734740 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015
Sep 5 06:03:09.734745 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023
Sep 5 06:03:09.734751 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031
Sep 5 06:03:09.734756 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039
Sep 5 06:03:09.734761 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047
Sep 5 06:03:09.734766 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055
Sep 5 06:03:09.734772 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063
Sep 5 06:03:09.734778 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071
Sep 5 06:03:09.734783 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079
Sep 5 06:03:09.734789 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087
Sep 5 06:03:09.734794 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095
Sep 5 06:03:09.734800 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103
Sep 5 06:03:09.734805 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111
Sep 5 06:03:09.734810 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119
Sep 5 06:03:09.734816 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127
Sep 5 06:03:09.734823 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=4b2174e9c368fa97600991ce20efc370fbbf3ddfce3ea407f50212a1021bd496
Sep 5 06:03:09.734829 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 5 06:03:09.734834 kernel: random: crng init done
Sep 5 06:03:09.734840 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes
Sep 5 06:03:09.734845 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes
Sep 5 06:03:09.734851 kernel: printk: log_buf_len min size: 262144 bytes
Sep 5 06:03:09.734856 kernel: printk: log_buf_len: 1048576 bytes
Sep 5 06:03:09.734862 kernel: printk: early log buf free: 245592(93%)
Sep 5 06:03:09.734867 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 5 06:03:09.734874 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Sep 5 06:03:09.734879 kernel: Fallback order for Node 0: 0
Sep 5 06:03:09.734885 kernel: Built 1 zonelists, mobility grouping on. Total pages: 524157
Sep 5 06:03:09.734890 kernel: Policy zone: DMA32
Sep 5 06:03:09.734896 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 5 06:03:09.734901 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1
Sep 5 06:03:09.734907 kernel: ftrace: allocating 40102 entries in 157 pages
Sep 5 06:03:09.734912 kernel: ftrace: allocated 157 pages with 5 groups
Sep 5 06:03:09.734920 kernel: Dynamic Preempt: voluntary
Sep 5 06:03:09.734927 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 5 06:03:09.734934 kernel: rcu: RCU event tracing is enabled.
Sep 5 06:03:09.734943 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128.
Sep 5 06:03:09.734949 kernel: Trampoline variant of Tasks RCU enabled.
Sep 5 06:03:09.734955 kernel: Rude variant of Tasks RCU enabled.
Sep 5 06:03:09.734961 kernel: Tracing variant of Tasks RCU enabled.
Sep 5 06:03:09.734966 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 5 06:03:09.734971 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128
Sep 5 06:03:09.734978 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Sep 5 06:03:09.734985 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Sep 5 06:03:09.734990 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Sep 5 06:03:09.734998 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16
Sep 5 06:03:09.735004 kernel: rcu: srcu_init: Setting srcu_struct sizes to big.
Sep 5 06:03:09.735009 kernel: Console: colour VGA+ 80x25
Sep 5 06:03:09.735015 kernel: printk: legacy console [tty0] enabled
Sep 5 06:03:09.735024 kernel: printk: legacy console [ttyS0] enabled
Sep 5 06:03:09.735030 kernel: ACPI: Core revision 20240827
Sep 5 06:03:09.735039 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns
Sep 5 06:03:09.735047 kernel: APIC: Switch to symmetric I/O mode setup
Sep 5 06:03:09.735053 kernel: x2apic enabled
Sep 5 06:03:09.735058 kernel: APIC: Switched APIC routing to: physical x2apic
Sep 5 06:03:09.735064 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Sep 5 06:03:09.735070 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns
Sep 5 06:03:09.735075 kernel: Calibrating delay loop (skipped) preset value.. 6816.00 BogoMIPS (lpj=3408000)
Sep 5 06:03:09.735081 kernel: Disabled fast string operations
Sep 5 06:03:09.735086 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Sep 5 06:03:09.735091 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4
Sep 5 06:03:09.735098 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Sep 5 06:03:09.735104 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit
Sep 5 06:03:09.735109 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Sep 5 06:03:09.735115 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Sep 5 06:03:09.735120 kernel: RETBleed: Mitigation: Enhanced IBRS
Sep 5 06:03:09.735126 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Sep 5 06:03:09.735131 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Sep 5 06:03:09.735137 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Sep 5 06:03:09.735142 kernel: SRBDS: Unknown: Dependent on hypervisor status
Sep 5 06:03:09.735149 kernel: GDS: Unknown: Dependent on hypervisor status
Sep 5 06:03:09.735154 kernel: active return thunk: its_return_thunk
Sep 5 06:03:09.735160 kernel: ITS: Mitigation: Aligned branch/return thunks
Sep 5 06:03:09.735165 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Sep 5 06:03:09.735171 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Sep 5 06:03:09.735176 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Sep 5 06:03:09.735182 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Sep 5 06:03:09.735187 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Sep 5 06:03:09.735193 kernel: Freeing SMP alternatives memory: 32K
Sep 5 06:03:09.735200 kernel: pid_max: default: 131072 minimum: 1024
Sep 5 06:03:09.735205 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 5 06:03:09.735211 kernel: landlock: Up and running.
Sep 5 06:03:09.735216 kernel: SELinux: Initializing.
Sep 5 06:03:09.735222 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Sep 5 06:03:09.735227 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Sep 5 06:03:09.735233 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd)
Sep 5 06:03:09.735238 kernel: Performance Events: Skylake events, core PMU driver.
Sep 5 06:03:09.735244 kernel: core: CPUID marked event: 'cpu cycles' unavailable
Sep 5 06:03:09.735250 kernel: core: CPUID marked event: 'instructions' unavailable
Sep 5 06:03:09.735256 kernel: core: CPUID marked event: 'bus cycles' unavailable
Sep 5 06:03:09.735261 kernel: core: CPUID marked event: 'cache references' unavailable
Sep 5 06:03:09.735266 kernel: core: CPUID marked event: 'cache misses' unavailable
Sep 5 06:03:09.735272 kernel: core: CPUID marked event: 'branch instructions' unavailable
Sep 5 06:03:09.735277 kernel: core: CPUID marked event: 'branch misses' unavailable
Sep 5 06:03:09.735283 kernel: ... version:                1
Sep 5 06:03:09.735288 kernel: ... bit width:              48
Sep 5 06:03:09.735295 kernel: ... generic registers:      4
Sep 5 06:03:09.735300 kernel: ... value mask:             0000ffffffffffff
Sep 5 06:03:09.735306 kernel: ... max period:             000000007fffffff
Sep 5 06:03:09.735311 kernel: ... fixed-purpose events:   0
Sep 5 06:03:09.735316 kernel: ... event mask:             000000000000000f
Sep 5 06:03:09.735322 kernel: signal: max sigframe size: 1776
Sep 5 06:03:09.735327 kernel: rcu: Hierarchical SRCU implementation.
Sep 5 06:03:09.735333 kernel: rcu: Max phase no-delay instances is 400.
Sep 5 06:03:09.735339 kernel: Timer migration: 3 hierarchy levels; 8 children per group; 3 crossnode level
Sep 5 06:03:09.735344 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Sep 5 06:03:09.735351 kernel: smp: Bringing up secondary CPUs ...
Sep 5 06:03:09.735356 kernel: smpboot: x86: Booting SMP configuration:
Sep 5 06:03:09.735362 kernel: .... node #0, CPUs: #1
Sep 5 06:03:09.735367 kernel: Disabled fast string operations
Sep 5 06:03:09.735372 kernel: smp: Brought up 1 node, 2 CPUs
Sep 5 06:03:09.735378 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS)
Sep 5 06:03:09.735384 kernel: Memory: 1924248K/2096628K available (14336K kernel code, 2428K rwdata, 9988K rodata, 54068K init, 2900K bss, 161004K reserved, 0K cma-reserved)
Sep 5 06:03:09.735389 kernel: devtmpfs: initialized
Sep 5 06:03:09.735395 kernel: x86/mm: Memory block size: 128MB
Sep 5 06:03:09.735401 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes)
Sep 5 06:03:09.735407 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 5 06:03:09.735412 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear)
Sep 5 06:03:09.735418 kernel: pinctrl core: initialized pinctrl subsystem
Sep 5 06:03:09.735423 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 5 06:03:09.735429 kernel: audit: initializing netlink subsys (disabled)
Sep 5 06:03:09.735434 kernel: audit: type=2000 audit(1757052186.304:1): state=initialized audit_enabled=0 res=1
Sep 5 06:03:09.735440 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 5 06:03:09.735445 kernel: thermal_sys: Registered thermal governor 'user_space'
Sep 5 06:03:09.735452 kernel: cpuidle: using governor menu
Sep 5 06:03:09.735457 kernel: Simple Boot Flag at 0x36 set to 0x80
Sep 5 06:03:09.735463 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 5 06:03:09.735468 kernel: dca service started, version 1.12.1
Sep 5 06:03:09.735481 kernel: PCI: ECAM [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) for domain 0000 [bus 00-7f]
Sep 5 06:03:09.735488 kernel: PCI: Using configuration type 1 for base access
Sep 5 06:03:09.735494 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Sep 5 06:03:09.735500 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 5 06:03:09.735506 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Sep 5 06:03:09.735513 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 5 06:03:09.735518 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Sep 5 06:03:09.735524 kernel: ACPI: Added _OSI(Module Device)
Sep 5 06:03:09.735530 kernel: ACPI: Added _OSI(Processor Device)
Sep 5 06:03:09.735536 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 5 06:03:09.735541 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 5 06:03:09.735547 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored
Sep 5 06:03:09.735553 kernel: ACPI: Interpreter enabled
Sep 5 06:03:09.736617 kernel: ACPI: PM: (supports S0 S1 S5)
Sep 5 06:03:09.736627 kernel: ACPI: Using IOAPIC for interrupt routing
Sep 5 06:03:09.736634 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Sep 5 06:03:09.736640 kernel: PCI: Using E820 reservations for host bridge windows
Sep 5 06:03:09.736652 kernel: ACPI: Enabled 4 GPEs in block 00 to 0F
Sep 5 06:03:09.736659 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f])
Sep 5 06:03:09.736747 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 5 06:03:09.736803 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR]
Sep 5 06:03:09.736856 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability]
Sep 5 06:03:09.736867 kernel: PCI host bridge to bus 0000:00
Sep 5 06:03:09.736922 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Sep 5 06:03:09.736976 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window]
Sep 5 06:03:09.737031 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Sep 5 06:03:09.737088 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Sep 5 06:03:09.737134 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window]
Sep 5 06:03:09.737181 kernel: pci_bus 0000:00: root bus resource [bus 00-7f]
Sep 5 06:03:09.737241 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 conventional PCI endpoint
Sep 5 06:03:09.737303 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 conventional PCI bridge
Sep 5 06:03:09.737356 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Sep 5 06:03:09.737415 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 conventional PCI endpoint
Sep 5 06:03:09.737471 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a conventional PCI endpoint
Sep 5 06:03:09.737538 kernel: pci 0000:00:07.1: BAR 4 [io 0x1060-0x106f]
Sep 5 06:03:09.737617 kernel: pci 0000:00:07.1: BAR 0 [io 0x01f0-0x01f7]: legacy IDE quirk
Sep 5 06:03:09.737689 kernel: pci 0000:00:07.1: BAR 1 [io 0x03f6]: legacy IDE quirk
Sep 5 06:03:09.737757 kernel: pci 0000:00:07.1: BAR 2 [io 0x0170-0x0177]: legacy IDE quirk
Sep 5 06:03:09.737828 kernel: pci 0000:00:07.1: BAR 3 [io 0x0376]: legacy IDE quirk
Sep 5 06:03:09.737885 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Sep 5 06:03:09.737936 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed by PIIX4 ACPI
Sep 5 06:03:09.737986 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB
Sep 5 06:03:09.738041 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000 conventional PCI endpoint
Sep 5 06:03:09.738092 kernel: pci 0000:00:07.7: BAR 0 [io 0x1080-0x10bf]
Sep 5 06:03:09.738154 kernel: pci 0000:00:07.7: BAR 1 [mem 0xfebfe000-0xfebfffff 64bit]
Sep 5 06:03:09.738213 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 conventional PCI endpoint
Sep 5 06:03:09.738264 kernel: pci 0000:00:0f.0: BAR 0 [io 0x1070-0x107f]
Sep 5 06:03:09.738315 kernel: pci 0000:00:0f.0: BAR 1 [mem 0xe8000000-0xefffffff pref]
Sep 5 06:03:09.738379 kernel: pci 0000:00:0f.0: BAR 2 [mem 0xfe000000-0xfe7fffff]
Sep 5 06:03:09.738437 kernel: pci 0000:00:0f.0: ROM [mem 0x00000000-0x00007fff pref]
Sep 5 06:03:09.738493 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Sep 5 06:03:09.739614 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 conventional PCI bridge
Sep 5 06:03:09.739688 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode)
Sep 5 06:03:09.739743 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff]
Sep 5 06:03:09.739809 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff]
Sep 5 06:03:09.739861 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref]
Sep 5 06:03:09.739919 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Sep 5 06:03:09.739973 kernel: pci 0000:00:15.0: PCI bridge to [bus 03]
Sep 5 06:03:09.740027 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff]
Sep 5 06:03:09.740078 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff]
Sep 5 06:03:09.740131 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold
Sep 5 06:03:09.740186 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Sep 5 06:03:09.740242 kernel: pci 0000:00:15.1: PCI bridge to [bus 04]
Sep 5 06:03:09.740302 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff]
Sep 5 06:03:09.740354 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff]
Sep 5 06:03:09.740416 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref]
Sep 5 06:03:09.740492 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold
Sep 5 06:03:09.740586 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Sep 5 06:03:09.740646 kernel: pci 0000:00:15.2: PCI bridge to [bus 05]
Sep 5 06:03:09.740699 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff]
Sep 5 06:03:09.740750 kernel: pci 0000:00:15.2: bridge window [mem 
0xfcd00000-0xfcdfffff] Sep 5 06:03:09.740801 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Sep 5 06:03:09.740858 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold Sep 5 06:03:09.740922 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 5 06:03:09.740977 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Sep 5 06:03:09.741031 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Sep 5 06:03:09.741094 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Sep 5 06:03:09.741147 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold Sep 5 06:03:09.741202 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 5 06:03:09.741257 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Sep 5 06:03:09.741307 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Sep 5 06:03:09.741357 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Sep 5 06:03:09.741407 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold Sep 5 06:03:09.741462 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 5 06:03:09.741515 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Sep 5 06:03:09.742639 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Sep 5 06:03:09.742706 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Sep 5 06:03:09.742762 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot D3cold Sep 5 06:03:09.742848 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 5 06:03:09.742905 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Sep 5 06:03:09.742960 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Sep 5 06:03:09.743012 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Sep 5 06:03:09.743063 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold Sep 5 
06:03:09.743121 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 5 06:03:09.743174 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Sep 5 06:03:09.743225 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Sep 5 06:03:09.743276 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Sep 5 06:03:09.743327 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold Sep 5 06:03:09.743382 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 5 06:03:09.743438 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Sep 5 06:03:09.743498 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Sep 5 06:03:09.743552 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Sep 5 06:03:09.743630 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold Sep 5 06:03:09.743688 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 5 06:03:09.743740 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Sep 5 06:03:09.743790 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Sep 5 06:03:09.743840 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Sep 5 06:03:09.743894 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Sep 5 06:03:09.743944 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold Sep 5 06:03:09.743999 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 5 06:03:09.744060 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Sep 5 06:03:09.744115 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Sep 5 06:03:09.744169 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Sep 5 06:03:09.744220 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Sep 5 06:03:09.744273 kernel: pci 0000:00:16.2: PME# supported from D0 D3hot D3cold Sep 5 06:03:09.744330 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 PCIe 
Root Port Sep 5 06:03:09.744381 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Sep 5 06:03:09.744431 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Sep 5 06:03:09.744481 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Sep 5 06:03:09.744537 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold Sep 5 06:03:09.744633 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 5 06:03:09.744698 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Sep 5 06:03:09.744757 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Sep 5 06:03:09.744812 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Sep 5 06:03:09.744874 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold Sep 5 06:03:09.744935 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 5 06:03:09.744988 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Sep 5 06:03:09.745039 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Sep 5 06:03:09.746437 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Sep 5 06:03:09.746512 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold Sep 5 06:03:09.746594 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 5 06:03:09.746668 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Sep 5 06:03:09.746734 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Sep 5 06:03:09.746786 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Sep 5 06:03:09.746837 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold Sep 5 06:03:09.746907 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 5 06:03:09.746965 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Sep 5 06:03:09.747016 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Sep 5 06:03:09.747072 kernel: pci 0000:00:16.7: 
bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Sep 5 06:03:09.747131 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold Sep 5 06:03:09.747199 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 5 06:03:09.747260 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Sep 5 06:03:09.747312 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Sep 5 06:03:09.747365 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Sep 5 06:03:09.747415 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Sep 5 06:03:09.747465 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold Sep 5 06:03:09.747520 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 5 06:03:09.747589 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Sep 5 06:03:09.747645 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Sep 5 06:03:09.747699 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Sep 5 06:03:09.747780 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Sep 5 06:03:09.747850 kernel: pci 0000:00:17.1: PME# supported from D0 D3hot D3cold Sep 5 06:03:09.747907 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 5 06:03:09.748015 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Sep 5 06:03:09.748071 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Sep 5 06:03:09.748123 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Sep 5 06:03:09.748173 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Sep 5 06:03:09.748222 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold Sep 5 06:03:09.748285 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 5 06:03:09.748342 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Sep 5 06:03:09.748406 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Sep 5 06:03:09.748463 
kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Sep 5 06:03:09.748514 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold Sep 5 06:03:09.749051 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 5 06:03:09.749111 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Sep 5 06:03:09.749164 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Sep 5 06:03:09.749217 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Sep 5 06:03:09.749268 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold Sep 5 06:03:09.749328 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 5 06:03:09.749381 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Sep 5 06:03:09.749436 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Sep 5 06:03:09.749491 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Sep 5 06:03:09.749551 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold Sep 5 06:03:09.751644 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 5 06:03:09.751702 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Sep 5 06:03:09.751759 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Sep 5 06:03:09.751822 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Sep 5 06:03:09.751877 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold Sep 5 06:03:09.751944 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 5 06:03:09.752000 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Sep 5 06:03:09.752060 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Sep 5 06:03:09.752112 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Sep 5 06:03:09.752165 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold Sep 5 06:03:09.752222 kernel: pci 0000:00:18.0: [15ad:07a0] 
type 01 class 0x060400 PCIe Root Port Sep 5 06:03:09.752274 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Sep 5 06:03:09.752325 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Sep 5 06:03:09.752384 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Sep 5 06:03:09.752441 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Sep 5 06:03:09.752496 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold Sep 5 06:03:09.752580 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 5 06:03:09.752649 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Sep 5 06:03:09.752701 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Sep 5 06:03:09.752751 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Sep 5 06:03:09.752801 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Sep 5 06:03:09.752868 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold Sep 5 06:03:09.752926 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 5 06:03:09.752981 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Sep 5 06:03:09.753036 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Sep 5 06:03:09.753094 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Sep 5 06:03:09.753156 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold Sep 5 06:03:09.753211 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 5 06:03:09.753266 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Sep 5 06:03:09.753327 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Sep 5 06:03:09.753381 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Sep 5 06:03:09.753432 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold Sep 5 06:03:09.753491 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 5 06:03:09.753543 
kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Sep 5 06:03:09.753615 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Sep 5 06:03:09.753682 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Sep 5 06:03:09.754615 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold Sep 5 06:03:09.754674 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 5 06:03:09.754730 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Sep 5 06:03:09.754781 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Sep 5 06:03:09.754832 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Sep 5 06:03:09.754883 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold Sep 5 06:03:09.754940 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 5 06:03:09.754998 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Sep 5 06:03:09.755054 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Sep 5 06:03:09.755119 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Sep 5 06:03:09.755171 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold Sep 5 06:03:09.755234 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 5 06:03:09.755290 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Sep 5 06:03:09.755340 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Sep 5 06:03:09.755390 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Sep 5 06:03:09.755440 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold Sep 5 06:03:09.755500 kernel: pci_bus 0000:01: extended config space not accessible Sep 5 06:03:09.756572 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Sep 5 06:03:09.756638 kernel: pci_bus 0000:02: extended config space not accessible Sep 5 06:03:09.756651 kernel: acpiphp: Slot [32] registered Sep 5 06:03:09.756658 kernel: acpiphp: Slot 
[33] registered Sep 5 06:03:09.756664 kernel: acpiphp: Slot [34] registered Sep 5 06:03:09.756670 kernel: acpiphp: Slot [35] registered Sep 5 06:03:09.756676 kernel: acpiphp: Slot [36] registered Sep 5 06:03:09.756688 kernel: acpiphp: Slot [37] registered Sep 5 06:03:09.756698 kernel: acpiphp: Slot [38] registered Sep 5 06:03:09.756704 kernel: acpiphp: Slot [39] registered Sep 5 06:03:09.756710 kernel: acpiphp: Slot [40] registered Sep 5 06:03:09.756716 kernel: acpiphp: Slot [41] registered Sep 5 06:03:09.756722 kernel: acpiphp: Slot [42] registered Sep 5 06:03:09.756728 kernel: acpiphp: Slot [43] registered Sep 5 06:03:09.756733 kernel: acpiphp: Slot [44] registered Sep 5 06:03:09.756739 kernel: acpiphp: Slot [45] registered Sep 5 06:03:09.756745 kernel: acpiphp: Slot [46] registered Sep 5 06:03:09.756753 kernel: acpiphp: Slot [47] registered Sep 5 06:03:09.756758 kernel: acpiphp: Slot [48] registered Sep 5 06:03:09.756764 kernel: acpiphp: Slot [49] registered Sep 5 06:03:09.756770 kernel: acpiphp: Slot [50] registered Sep 5 06:03:09.756776 kernel: acpiphp: Slot [51] registered Sep 5 06:03:09.756782 kernel: acpiphp: Slot [52] registered Sep 5 06:03:09.756788 kernel: acpiphp: Slot [53] registered Sep 5 06:03:09.756793 kernel: acpiphp: Slot [54] registered Sep 5 06:03:09.756799 kernel: acpiphp: Slot [55] registered Sep 5 06:03:09.756806 kernel: acpiphp: Slot [56] registered Sep 5 06:03:09.756812 kernel: acpiphp: Slot [57] registered Sep 5 06:03:09.756817 kernel: acpiphp: Slot [58] registered Sep 5 06:03:09.756823 kernel: acpiphp: Slot [59] registered Sep 5 06:03:09.756829 kernel: acpiphp: Slot [60] registered Sep 5 06:03:09.756834 kernel: acpiphp: Slot [61] registered Sep 5 06:03:09.756840 kernel: acpiphp: Slot [62] registered Sep 5 06:03:09.756846 kernel: acpiphp: Slot [63] registered Sep 5 06:03:09.756902 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Sep 5 06:03:09.756958 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff 
window] (subtractive decode) Sep 5 06:03:09.757015 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode) Sep 5 06:03:09.757075 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode) Sep 5 06:03:09.757130 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode) Sep 5 06:03:09.757181 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode) Sep 5 06:03:09.757242 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 PCIe Endpoint Sep 5 06:03:09.757295 kernel: pci 0000:03:00.0: BAR 0 [io 0x4000-0x4007] Sep 5 06:03:09.757350 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfd5f8000-0xfd5fffff 64bit] Sep 5 06:03:09.757401 kernel: pci 0000:03:00.0: ROM [mem 0x00000000-0x0000ffff pref] Sep 5 06:03:09.757451 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Sep 5 06:03:09.757502 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force' Sep 5 06:03:09.757563 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Sep 5 06:03:09.757634 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Sep 5 06:03:09.757697 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Sep 5 06:03:09.757752 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Sep 5 06:03:09.757817 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Sep 5 06:03:09.757870 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Sep 5 06:03:09.757922 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Sep 5 06:03:09.757975 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Sep 5 06:03:09.758034 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 PCIe Endpoint Sep 5 06:03:09.758094 kernel: pci 0000:0b:00.0: BAR 0 [mem 0xfd4fc000-0xfd4fcfff] Sep 5 06:03:09.758173 kernel: pci 0000:0b:00.0: BAR 1 [mem 0xfd4fd000-0xfd4fdfff] Sep 5 06:03:09.758237 kernel: pci 0000:0b:00.0: BAR 2 [mem 0xfd4fe000-0xfd4fffff] Sep 5 06:03:09.758290 kernel: pci 0000:0b:00.0: 
BAR 3 [io 0x5000-0x500f] Sep 5 06:03:09.758345 kernel: pci 0000:0b:00.0: ROM [mem 0x00000000-0x0000ffff pref] Sep 5 06:03:09.758403 kernel: pci 0000:0b:00.0: supports D1 D2 Sep 5 06:03:09.758455 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold Sep 5 06:03:09.758507 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force' Sep 5 06:03:09.761601 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Sep 5 06:03:09.761701 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Sep 5 06:03:09.761776 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Sep 5 06:03:09.761839 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Sep 5 06:03:09.761903 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Sep 5 06:03:09.761976 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Sep 5 06:03:09.762047 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Sep 5 06:03:09.762148 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Sep 5 06:03:09.762227 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Sep 5 06:03:09.762296 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Sep 5 06:03:09.762352 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Sep 5 06:03:09.762406 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Sep 5 06:03:09.762460 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Sep 5 06:03:09.762513 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Sep 5 06:03:09.762582 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Sep 5 06:03:09.762651 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Sep 5 06:03:09.762716 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Sep 5 06:03:09.762772 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Sep 5 06:03:09.762841 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Sep 5 06:03:09.762897 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Sep 5 06:03:09.762949 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Sep 5 06:03:09.763036 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Sep 5 06:03:09.763090 kernel: pci 
0000:00:18.6: PCI bridge to [bus 21] Sep 5 06:03:09.763147 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Sep 5 06:03:09.763162 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9 Sep 5 06:03:09.763170 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0 Sep 5 06:03:09.763177 kernel: ACPI: PCI: Interrupt link LNKB disabled Sep 5 06:03:09.763183 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Sep 5 06:03:09.763189 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10 Sep 5 06:03:09.763195 kernel: iommu: Default domain type: Translated Sep 5 06:03:09.763201 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Sep 5 06:03:09.763207 kernel: PCI: Using ACPI for IRQ routing Sep 5 06:03:09.763215 kernel: PCI: pci_cache_line_size set to 64 bytes Sep 5 06:03:09.763221 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff] Sep 5 06:03:09.763227 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff] Sep 5 06:03:09.763294 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device Sep 5 06:03:09.763376 kernel: pci 0000:00:0f.0: vgaarb: bridge control possible Sep 5 06:03:09.763432 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Sep 5 06:03:09.763441 kernel: vgaarb: loaded Sep 5 06:03:09.763451 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 Sep 5 06:03:09.763457 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter Sep 5 06:03:09.763469 kernel: clocksource: Switched to clocksource tsc-early Sep 5 06:03:09.763477 kernel: VFS: Disk quotas dquot_6.6.0 Sep 5 06:03:09.763483 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Sep 5 06:03:09.763489 kernel: pnp: PnP ACPI init Sep 5 06:03:09.763549 kernel: system 00:00: [io 0x1000-0x103f] has been reserved Sep 5 06:03:09.763621 kernel: system 00:00: [io 0x1040-0x104f] has been reserved Sep 5 06:03:09.763669 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved Sep 5 
06:03:09.763720 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved Sep 5 06:03:09.763774 kernel: pnp 00:06: [dma 2] Sep 5 06:03:09.763829 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved Sep 5 06:03:09.763891 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved Sep 5 06:03:09.763948 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved Sep 5 06:03:09.763957 kernel: pnp: PnP ACPI: found 8 devices Sep 5 06:03:09.763964 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Sep 5 06:03:09.763970 kernel: NET: Registered PF_INET protocol family Sep 5 06:03:09.763978 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Sep 5 06:03:09.763984 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Sep 5 06:03:09.763990 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Sep 5 06:03:09.763996 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Sep 5 06:03:09.764001 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Sep 5 06:03:09.764008 kernel: TCP: Hash tables configured (established 16384 bind 16384) Sep 5 06:03:09.764013 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Sep 5 06:03:09.764019 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Sep 5 06:03:09.764026 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Sep 5 06:03:09.764032 kernel: NET: Registered PF_XDP protocol family Sep 5 06:03:09.764088 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 Sep 5 06:03:09.764141 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Sep 5 06:03:09.764193 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Sep 5 06:03:09.764246 kernel: pci 0000:00:15.5: bridge 
window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Sep 5 06:03:09.764299 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Sep 5 06:03:09.764350 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Sep 5 06:03:09.764406 kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Sep 5 06:03:09.764457 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Sep 5 06:03:09.764524 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Sep 5 06:03:09.764606 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Sep 5 06:03:09.764676 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Sep 5 06:03:09.764743 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Sep 5 06:03:09.764803 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Sep 5 06:03:09.764863 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Sep 5 06:03:09.764925 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Sep 5 06:03:09.764982 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Sep 5 06:03:09.765040 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Sep 5 06:03:09.765100 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Sep 5 06:03:09.765164 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Sep 5 06:03:09.765231 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Sep 5 06:03:09.765291 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Sep 5 06:03:09.765360 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 
21] add_size 1000 Sep 5 06:03:09.765425 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000 Sep 5 06:03:09.765483 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref]: assigned Sep 5 06:03:09.765541 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref]: assigned Sep 5 06:03:09.765608 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.765666 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.765725 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.765797 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.765863 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.765934 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.766203 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.766377 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.766824 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.766915 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.766991 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.767053 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.767121 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.767182 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.767251 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.767321 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.767391 kernel: pci 0000:00:16.6: 
bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.767456 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.767522 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.767602 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.767674 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.767739 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.767792 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.767844 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.767895 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.767953 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.768020 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.768081 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.768135 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.768189 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.768262 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.768322 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.768392 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.768459 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.768531 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.768627 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.768685 kernel: pci 0000:00:18.5: 
bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.768736 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.768789 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.768849 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.768913 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.768982 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.769049 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.770638 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.770713 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.770775 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.770840 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.770903 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.770965 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.771039 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.771115 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.771181 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.771244 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.771310 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.771365 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.771420 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.771488 kernel: pci 0000:00:17.6: 
bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.771552 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.771632 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.771689 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.771767 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.771833 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.771900 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.771968 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.772033 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.772097 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.772158 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.772217 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.772282 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.773260 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.773337 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.773405 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.773468 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.773528 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.773635 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.773714 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.773786 kernel: pci 0000:00:15.6: 
bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.773846 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.774327 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.774417 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.774504 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.774714 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.774800 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space Sep 5 06:03:09.774871 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign Sep 5 06:03:09.774954 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Sep 5 06:03:09.775458 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] Sep 5 06:03:09.775585 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Sep 5 06:03:09.775655 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Sep 5 06:03:09.775724 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Sep 5 06:03:09.775781 kernel: pci 0000:03:00.0: ROM [mem 0xfd500000-0xfd50ffff pref]: assigned Sep 5 06:03:09.775845 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Sep 5 06:03:09.775902 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Sep 5 06:03:09.775959 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Sep 5 06:03:09.776024 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref] Sep 5 06:03:09.776088 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Sep 5 06:03:09.776200 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Sep 5 06:03:09.776272 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Sep 5 06:03:09.776332 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Sep 5 06:03:09.776401 kernel: pci 0000:00:15.2: PCI 
bridge to [bus 05] Sep 5 06:03:09.776455 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Sep 5 06:03:09.776507 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Sep 5 06:03:09.776602 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Sep 5 06:03:09.776673 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Sep 5 06:03:09.776736 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Sep 5 06:03:09.776798 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Sep 5 06:03:09.776856 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Sep 5 06:03:09.776914 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Sep 5 06:03:09.776965 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Sep 5 06:03:09.777036 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Sep 5 06:03:09.777096 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Sep 5 06:03:09.777151 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Sep 5 06:03:09.777204 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Sep 5 06:03:09.777258 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Sep 5 06:03:09.777320 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Sep 5 06:03:09.777375 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Sep 5 06:03:09.777434 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Sep 5 06:03:09.777495 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Sep 5 06:03:09.779611 kernel: pci 0000:0b:00.0: ROM [mem 0xfd400000-0xfd40ffff pref]: assigned Sep 5 06:03:09.779687 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Sep 5 06:03:09.779745 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Sep 5 06:03:09.779816 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Sep 5 06:03:09.779872 kernel: pci 0000:00:16.0: bridge window 
[mem 0xc0200000-0xc03fffff 64bit pref] Sep 5 06:03:09.779952 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Sep 5 06:03:09.780013 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Sep 5 06:03:09.780078 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Sep 5 06:03:09.780130 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Sep 5 06:03:09.780187 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Sep 5 06:03:09.780239 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Sep 5 06:03:09.780303 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Sep 5 06:03:09.780373 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Sep 5 06:03:09.780438 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Sep 5 06:03:09.780490 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Sep 5 06:03:09.780575 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Sep 5 06:03:09.780637 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Sep 5 06:03:09.780713 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Sep 5 06:03:09.780775 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Sep 5 06:03:09.780842 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Sep 5 06:03:09.780910 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Sep 5 06:03:09.780961 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Sep 5 06:03:09.781024 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Sep 5 06:03:09.781086 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Sep 5 06:03:09.781138 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Sep 5 06:03:09.781200 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Sep 5 06:03:09.781257 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Sep 5 06:03:09.781311 kernel: pci 0000:00:16.7: bridge window 
[mem 0xe5f00000-0xe5ffffff 64bit pref] Sep 5 06:03:09.781377 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Sep 5 06:03:09.781445 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Sep 5 06:03:09.781502 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Sep 5 06:03:09.784307 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Sep 5 06:03:09.784390 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Sep 5 06:03:09.784452 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Sep 5 06:03:09.784519 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Sep 5 06:03:09.784596 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Sep 5 06:03:09.784663 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Sep 5 06:03:09.784734 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Sep 5 06:03:09.784790 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Sep 5 06:03:09.784841 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Sep 5 06:03:09.784903 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Sep 5 06:03:09.784960 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Sep 5 06:03:09.785012 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Sep 5 06:03:09.785069 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Sep 5 06:03:09.785130 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Sep 5 06:03:09.785191 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Sep 5 06:03:09.785266 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Sep 5 06:03:09.785318 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Sep 5 06:03:09.785374 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Sep 5 06:03:09.785438 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Sep 5 06:03:09.785504 kernel: pci 0000:00:17.6: bridge window [mem 
0xfbb00000-0xfbbfffff] Sep 5 06:03:09.785593 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Sep 5 06:03:09.785750 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Sep 5 06:03:09.785807 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Sep 5 06:03:09.785859 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Sep 5 06:03:09.785925 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Sep 5 06:03:09.785988 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Sep 5 06:03:09.786048 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Sep 5 06:03:09.786107 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Sep 5 06:03:09.786168 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Sep 5 06:03:09.786222 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Sep 5 06:03:09.786279 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Sep 5 06:03:09.786348 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Sep 5 06:03:09.786412 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Sep 5 06:03:09.786474 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Sep 5 06:03:09.786528 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Sep 5 06:03:09.786608 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Sep 5 06:03:09.786672 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Sep 5 06:03:09.786730 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Sep 5 06:03:09.786796 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Sep 5 06:03:09.786853 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Sep 5 06:03:09.786906 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Sep 5 06:03:09.786970 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Sep 5 06:03:09.787039 kernel: pci 0000:00:18.5: bridge window [mem 
0xfbe00000-0xfbefffff] Sep 5 06:03:09.787100 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Sep 5 06:03:09.787159 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Sep 5 06:03:09.787215 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Sep 5 06:03:09.787265 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Sep 5 06:03:09.787318 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Sep 5 06:03:09.787368 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Sep 5 06:03:09.787428 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Sep 5 06:03:09.787491 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] Sep 5 06:03:09.787549 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] Sep 5 06:03:09.787640 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] Sep 5 06:03:09.787933 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] Sep 5 06:03:09.787987 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] Sep 5 06:03:09.788051 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] Sep 5 06:03:09.788101 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] Sep 5 06:03:09.788152 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] Sep 5 06:03:09.788217 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] Sep 5 06:03:09.788280 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] Sep 5 06:03:09.788334 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] Sep 5 06:03:09.788382 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] Sep 5 06:03:09.788442 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] Sep 5 06:03:09.788495 kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff] Sep 5 06:03:09.788542 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] Sep 5 06:03:09.788611 kernel: 
pci_bus 0000:03: resource 2 [mem 0xc0000000-0xc01fffff 64bit pref] Sep 5 06:03:09.788679 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] Sep 5 06:03:09.788825 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] Sep 5 06:03:09.788873 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] Sep 5 06:03:09.788925 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] Sep 5 06:03:09.788972 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] Sep 5 06:03:09.789019 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] Sep 5 06:03:09.789075 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] Sep 5 06:03:09.789138 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] Sep 5 06:03:09.789205 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] Sep 5 06:03:09.789270 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] Sep 5 06:03:09.789322 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] Sep 5 06:03:09.789370 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] Sep 5 06:03:09.789425 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] Sep 5 06:03:09.789477 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] Sep 5 06:03:09.789548 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] Sep 5 06:03:09.789621 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] Sep 5 06:03:09.789672 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] Sep 5 06:03:09.789719 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] Sep 5 06:03:09.789765 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] Sep 5 06:03:09.789828 kernel: pci_bus 0000:0c: resource 0 [io 0x9000-0x9fff] Sep 5 06:03:09.789876 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] Sep 5 06:03:09.789921 kernel: pci_bus 0000:0c: resource 2 [mem 
0xe7700000-0xe77fffff 64bit pref] Sep 5 06:03:09.789973 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] Sep 5 06:03:09.790020 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] Sep 5 06:03:09.790065 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] Sep 5 06:03:09.790118 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] Sep 5 06:03:09.790168 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] Sep 5 06:03:09.790229 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] Sep 5 06:03:09.790319 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] Sep 5 06:03:09.790387 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] Sep 5 06:03:09.790446 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] Sep 5 06:03:09.790499 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] Sep 5 06:03:09.790594 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] Sep 5 06:03:09.790657 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] Sep 5 06:03:09.790704 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] Sep 5 06:03:09.790777 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] Sep 5 06:03:09.790834 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] Sep 5 06:03:09.790892 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] Sep 5 06:03:09.790978 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] Sep 5 06:03:09.791031 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] Sep 5 06:03:09.791078 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] Sep 5 06:03:09.791128 kernel: pci_bus 0000:15: resource 0 [io 0xe000-0xefff] Sep 5 06:03:09.791188 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] Sep 5 06:03:09.791237 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] Sep 5 
06:03:09.791288 kernel: pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] Sep 5 06:03:09.791352 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] Sep 5 06:03:09.791425 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] Sep 5 06:03:09.791501 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] Sep 5 06:03:09.791566 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] Sep 5 06:03:09.791626 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] Sep 5 06:03:09.791686 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] Sep 5 06:03:09.791737 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] Sep 5 06:03:09.791788 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] Sep 5 06:03:09.791835 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] Sep 5 06:03:09.791886 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Sep 5 06:03:09.791932 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] Sep 5 06:03:09.791978 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] Sep 5 06:03:09.792031 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] Sep 5 06:03:09.792079 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] Sep 5 06:03:09.792127 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] Sep 5 06:03:09.792178 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] Sep 5 06:03:09.792225 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] Sep 5 06:03:09.792275 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] Sep 5 06:03:09.792322 kernel: pci_bus 0000:1e: resource 2 [mem 0xe6d00000-0xe6dfffff 64bit pref] Sep 5 06:03:09.792375 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] Sep 5 06:03:09.792429 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] Sep 5 06:03:09.792486 
kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] Sep 5 06:03:09.792532 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] Sep 5 06:03:09.794631 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] Sep 5 06:03:09.794690 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] Sep 5 06:03:09.794747 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] Sep 5 06:03:09.794796 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] Sep 5 06:03:09.794870 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Sep 5 06:03:09.794884 kernel: PCI: CLS 32 bytes, default 64 Sep 5 06:03:09.794891 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Sep 5 06:03:09.794901 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Sep 5 06:03:09.794910 kernel: clocksource: Switched to clocksource tsc Sep 5 06:03:09.794920 kernel: Initialise system trusted keyrings Sep 5 06:03:09.794932 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Sep 5 06:03:09.794939 kernel: Key type asymmetric registered Sep 5 06:03:09.794944 kernel: Asymmetric key parser 'x509' registered Sep 5 06:03:09.794953 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Sep 5 06:03:09.794964 kernel: io scheduler mq-deadline registered Sep 5 06:03:09.794974 kernel: io scheduler kyber registered Sep 5 06:03:09.794983 kernel: io scheduler bfq registered Sep 5 06:03:09.795051 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 Sep 5 06:03:09.795130 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 5 06:03:09.795197 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 Sep 5 06:03:09.795263 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ 
Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 5 06:03:09.795326 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 Sep 5 06:03:09.795387 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 5 06:03:09.795459 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 Sep 5 06:03:09.795531 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 5 06:03:09.795738 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 Sep 5 06:03:09.795841 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 5 06:03:09.795924 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 Sep 5 06:03:09.795988 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 5 06:03:09.796053 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 Sep 5 06:03:09.796118 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 5 06:03:09.796180 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 Sep 5 06:03:09.796252 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 5 06:03:09.796332 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 Sep 5 06:03:09.796391 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 5 06:03:09.796459 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 Sep 5 06:03:09.796519 kernel: pcieport 0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- 
HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 5 06:03:09.797469 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 Sep 5 06:03:09.797543 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 5 06:03:09.797622 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 Sep 5 06:03:09.797694 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 5 06:03:09.797761 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 Sep 5 06:03:09.797824 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 5 06:03:09.797889 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 Sep 5 06:03:09.797952 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 5 06:03:09.798016 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 Sep 5 06:03:09.798077 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 5 06:03:09.798152 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 Sep 5 06:03:09.798217 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 5 06:03:09.798280 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 Sep 5 06:03:09.798340 kernel: pcieport 0000:00:17.0: pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 5 06:03:09.798403 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 Sep 5 06:03:09.798466 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- 
PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 5 06:03:09.798535 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42 Sep 5 06:03:09.798626 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 5 06:03:09.798698 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43 Sep 5 06:03:09.798763 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 5 06:03:09.798839 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44 Sep 5 06:03:09.798903 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 5 06:03:09.798977 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45 Sep 5 06:03:09.799035 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 5 06:03:09.799098 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46 Sep 5 06:03:09.799161 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 5 06:03:09.799221 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47 Sep 5 06:03:09.799273 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 5 06:03:09.799333 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48 Sep 5 06:03:09.799400 kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 5 06:03:09.799464 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49 Sep 5 06:03:09.799524 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- 
AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 5 06:03:09.799612 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50 Sep 5 06:03:09.799679 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 5 06:03:09.799741 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51 Sep 5 06:03:09.799807 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 5 06:03:09.799876 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52 Sep 5 06:03:09.799941 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 5 06:03:09.800003 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53 Sep 5 06:03:09.800065 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 5 06:03:09.800132 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54 Sep 5 06:03:09.800202 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 5 06:03:09.800265 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55 Sep 5 06:03:09.800326 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 5 06:03:09.800339 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Sep 5 06:03:09.800346 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 5 06:03:09.800353 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Sep 5 06:03:09.800363 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 Sep 5 06:03:09.800372 kernel: serio: i8042 
KBD port at 0x60,0x64 irq 1 Sep 5 06:03:09.800379 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Sep 5 06:03:09.800441 kernel: rtc_cmos 00:01: registered as rtc0 Sep 5 06:03:09.800503 kernel: rtc_cmos 00:01: setting system clock to 2025-09-05T06:03:09 UTC (1757052189) Sep 5 06:03:09.800518 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Sep 5 06:03:09.800609 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram Sep 5 06:03:09.800623 kernel: intel_pstate: CPU model not supported Sep 5 06:03:09.800633 kernel: NET: Registered PF_INET6 protocol family Sep 5 06:03:09.800643 kernel: Segment Routing with IPv6 Sep 5 06:03:09.800650 kernel: In-situ OAM (IOAM) with IPv6 Sep 5 06:03:09.800656 kernel: NET: Registered PF_PACKET protocol family Sep 5 06:03:09.800662 kernel: Key type dns_resolver registered Sep 5 06:03:09.800670 kernel: IPI shorthand broadcast: enabled Sep 5 06:03:09.800679 kernel: sched_clock: Marking stable (2840004200, 181541936)->(3036294806, -14748670) Sep 5 06:03:09.800686 kernel: registered taskstats version 1 Sep 5 06:03:09.800694 kernel: Loading compiled-in X.509 certificates Sep 5 06:03:09.800704 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.44-flatcar: 0a288d3740f799f7923bd7314e999f997bd1026c' Sep 5 06:03:09.800712 kernel: Demotion targets for Node 0: null Sep 5 06:03:09.800718 kernel: Key type .fscrypt registered Sep 5 06:03:09.800724 kernel: Key type fscrypt-provisioning registered Sep 5 06:03:09.800733 kernel: ima: No TPM chip found, activating TPM-bypass! Sep 5 06:03:09.800741 kernel: ima: Allocated hash algorithm: sha1 Sep 5 06:03:09.800748 kernel: ima: No architecture policies found Sep 5 06:03:09.800755 kernel: clk: Disabling unused clocks Sep 5 06:03:09.800761 kernel: Warning: unable to open an initial console. 
Sep 5 06:03:09.800769 kernel: Freeing unused kernel image (initmem) memory: 54068K
Sep 5 06:03:09.800778 kernel: Write protecting the kernel read-only data: 24576k
Sep 5 06:03:09.800785 kernel: Freeing unused kernel image (rodata/data gap) memory: 252K
Sep 5 06:03:09.800793 kernel: Run /init as init process
Sep 5 06:03:09.800801 kernel: with arguments:
Sep 5 06:03:09.800808 kernel: /init
Sep 5 06:03:09.800814 kernel: with environment:
Sep 5 06:03:09.800822 kernel: HOME=/
Sep 5 06:03:09.800829 kernel: TERM=linux
Sep 5 06:03:09.800837 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Sep 5 06:03:09.800846 systemd[1]: Successfully made /usr/ read-only.
Sep 5 06:03:09.800857 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 5 06:03:09.800868 systemd[1]: Detected virtualization vmware.
Sep 5 06:03:09.800880 systemd[1]: Detected architecture x86-64.
Sep 5 06:03:09.800889 systemd[1]: Running in initrd.
Sep 5 06:03:09.800895 systemd[1]: No hostname configured, using default hostname.
Sep 5 06:03:09.800905 systemd[1]: Hostname set to .
Sep 5 06:03:09.800914 systemd[1]: Initializing machine ID from random generator.
Sep 5 06:03:09.800922 systemd[1]: Queued start job for default target initrd.target.
Sep 5 06:03:09.800929 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 5 06:03:09.800936 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 5 06:03:09.800943 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 5 06:03:09.800953 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 5 06:03:09.800962 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 5 06:03:09.800971 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 5 06:03:09.800982 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 5 06:03:09.800992 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 5 06:03:09.801001 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 5 06:03:09.801011 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 5 06:03:09.801021 systemd[1]: Reached target paths.target - Path Units.
Sep 5 06:03:09.801032 systemd[1]: Reached target slices.target - Slice Units.
Sep 5 06:03:09.801043 systemd[1]: Reached target swap.target - Swaps.
Sep 5 06:03:09.801052 systemd[1]: Reached target timers.target - Timer Units.
Sep 5 06:03:09.801060 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 5 06:03:09.801067 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 5 06:03:09.801073 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 5 06:03:09.801080 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Sep 5 06:03:09.801090 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 5 06:03:09.801098 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 5 06:03:09.801108 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 5 06:03:09.801118 systemd[1]: Reached target sockets.target - Socket Units.
Sep 5 06:03:09.801125 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 5 06:03:09.801135 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 5 06:03:09.801143 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Sep 5 06:03:09.801150 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Sep 5 06:03:09.801157 systemd[1]: Starting systemd-fsck-usr.service...
Sep 5 06:03:09.801167 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 5 06:03:09.801175 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 5 06:03:09.801184 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 5 06:03:09.801194 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Sep 5 06:03:09.801201 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 5 06:03:09.801211 systemd[1]: Finished systemd-fsck-usr.service.
Sep 5 06:03:09.801218 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 5 06:03:09.801244 systemd-journald[244]: Collecting audit messages is disabled.
Sep 5 06:03:09.801264 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 5 06:03:09.801276 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 5 06:03:09.801287 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 5 06:03:09.801298 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 5 06:03:09.801306 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 5 06:03:09.801315 kernel: Bridge firewalling registered
Sep 5 06:03:09.801321 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 5 06:03:09.801328 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 5 06:03:09.801336 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 5 06:03:09.801345 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 5 06:03:09.801355 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Sep 5 06:03:09.801365 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 5 06:03:09.801373 systemd-journald[244]: Journal started
Sep 5 06:03:09.801388 systemd-journald[244]: Runtime Journal (/run/log/journal/65d9d7f4a7044e03bf9a6dc6ca0c78ad) is 4.8M, max 38.8M, 34M free.
Sep 5 06:03:09.739548 systemd-modules-load[245]: Inserted module 'overlay'
Sep 5 06:03:09.776528 systemd-modules-load[245]: Inserted module 'br_netfilter'
Sep 5 06:03:09.804572 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 5 06:03:09.805878 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 5 06:03:09.812510 dracut-cmdline[270]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=4b2174e9c368fa97600991ce20efc370fbbf3ddfce3ea407f50212a1021bd496
Sep 5 06:03:09.818057 systemd-tmpfiles[272]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Sep 5 06:03:09.821308 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 5 06:03:09.822514 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 5 06:03:09.857407 systemd-resolved[306]: Positive Trust Anchors:
Sep 5 06:03:09.857418 systemd-resolved[306]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 5 06:03:09.857443 systemd-resolved[306]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 5 06:03:09.859501 systemd-resolved[306]: Defaulting to hostname 'linux'.
Sep 5 06:03:09.860227 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 5 06:03:09.860401 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 5 06:03:09.882581 kernel: SCSI subsystem initialized
Sep 5 06:03:09.905579 kernel: Loading iSCSI transport class v2.0-870.
Sep 5 06:03:09.915582 kernel: iscsi: registered transport (tcp)
Sep 5 06:03:09.943948 kernel: iscsi: registered transport (qla4xxx)
Sep 5 06:03:09.944003 kernel: QLogic iSCSI HBA Driver
Sep 5 06:03:09.956059 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 5 06:03:09.974526 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 5 06:03:09.975550 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 5 06:03:09.997934 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Sep 5 06:03:09.998896 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Sep 5 06:03:10.039582 kernel: raid6: avx2x4 gen() 43957 MB/s
Sep 5 06:03:10.056579 kernel: raid6: avx2x2 gen() 46820 MB/s
Sep 5 06:03:10.073869 kernel: raid6: avx2x1 gen() 32737 MB/s
Sep 5 06:03:10.073917 kernel: raid6: using algorithm avx2x2 gen() 46820 MB/s
Sep 5 06:03:10.091929 kernel: raid6: .... xor() 28055 MB/s, rmw enabled
Sep 5 06:03:10.091986 kernel: raid6: using avx2x2 recovery algorithm
Sep 5 06:03:10.108590 kernel: xor: automatically using best checksumming function avx
Sep 5 06:03:10.218619 kernel: Btrfs loaded, zoned=no, fsverity=no
Sep 5 06:03:10.222104 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Sep 5 06:03:10.223651 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 5 06:03:10.248647 systemd-udevd[492]: Using default interface naming scheme 'v255'.
Sep 5 06:03:10.253347 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 5 06:03:10.254722 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Sep 5 06:03:10.272670 dracut-pre-trigger[497]: rd.md=0: removing MD RAID activation
Sep 5 06:03:10.288218 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 5 06:03:10.289384 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 5 06:03:10.379007 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 5 06:03:10.380653 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Sep 5 06:03:10.456577 kernel: VMware PVSCSI driver - version 1.0.7.0-k
Sep 5 06:03:10.467573 kernel: VMware vmxnet3 virtual NIC driver - version 1.9.0.0-k-NAPI
Sep 5 06:03:10.471581 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2
Sep 5 06:03:10.474597 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps
Sep 5 06:03:10.480057 kernel: vmw_pvscsi: using 64bit dma
Sep 5 06:03:10.480080 kernel: vmw_pvscsi: max_id: 16
Sep 5 06:03:10.480092 kernel: vmw_pvscsi: setting ring_pages to 8
Sep 5 06:03:10.486009 kernel: vmw_pvscsi: enabling reqCallThreshold
Sep 5 06:03:10.486039 kernel: vmw_pvscsi: driver-based request coalescing enabled
Sep 5 06:03:10.486051 kernel: vmw_pvscsi: using MSI-X
Sep 5 06:03:10.489606 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254
Sep 5 06:03:10.497796 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0
Sep 5 06:03:10.497914 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6
Sep 5 06:03:10.497932 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0
Sep 5 06:03:10.499570 kernel: cryptd: max_cpu_qlen set to 1000
Sep 5 06:03:10.500566 kernel: libata version 3.00 loaded.
Sep 5 06:03:10.503706 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 5 06:03:10.503957 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 5 06:03:10.504540 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Sep 5 06:03:10.504685 (udev-worker)[550]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte.
Sep 5 06:03:10.506016 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 5 06:03:10.511690 kernel: ata_piix 0000:00:07.1: version 2.13
Sep 5 06:03:10.513702 kernel: scsi host1: ata_piix
Sep 5 06:03:10.513795 kernel: scsi host2: ata_piix
Sep 5 06:03:10.519069 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14 lpm-pol 0
Sep 5 06:03:10.519093 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15 lpm-pol 0
Sep 5 06:03:10.522630 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB)
Sep 5 06:03:10.524101 kernel: sd 0:0:0:0: [sda] Write Protect is off
Sep 5 06:03:10.524189 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00
Sep 5 06:03:10.524253 kernel: sd 0:0:0:0: [sda] Cache data unavailable
Sep 5 06:03:10.524314 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through
Sep 5 06:03:10.527569 kernel: AES CTR mode by8 optimization enabled
Sep 5 06:03:10.533295 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 5 06:03:10.536603 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input2
Sep 5 06:03:10.574681 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 5 06:03:10.574727 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Sep 5 06:03:10.692577 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33
Sep 5 06:03:10.698577 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5
Sep 5 06:03:10.725050 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray
Sep 5 06:03:10.725202 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Sep 5 06:03:10.742574 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0
Sep 5 06:03:10.748845 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM.
Sep 5 06:03:10.755163 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM.
Sep 5 06:03:10.760059 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A.
Sep 5 06:03:10.760334 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A.
Sep 5 06:03:10.766216 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT.
Sep 5 06:03:10.767024 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Sep 5 06:03:10.872085 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 5 06:03:10.885585 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 5 06:03:11.064744 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Sep 5 06:03:11.065103 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 5 06:03:11.065243 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 5 06:03:11.065437 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 5 06:03:11.066194 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Sep 5 06:03:11.084323 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Sep 5 06:03:11.882603 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 5 06:03:11.883580 disk-uuid[642]: The operation has completed successfully.
Sep 5 06:03:11.920923 systemd[1]: disk-uuid.service: Deactivated successfully.
Sep 5 06:03:11.921136 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Sep 5 06:03:11.938564 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Sep 5 06:03:11.953773 sh[672]: Success
Sep 5 06:03:11.981783 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Sep 5 06:03:11.981810 kernel: device-mapper: uevent: version 1.0.3
Sep 5 06:03:11.983128 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Sep 5 06:03:11.989575 kernel: device-mapper: verity: sha256 using shash "sha256-avx2"
Sep 5 06:03:12.065215 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Sep 5 06:03:12.067619 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Sep 5 06:03:12.077088 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Sep 5 06:03:12.165585 kernel: BTRFS: device fsid 98069635-e988-4e04-b156-f40a4a69cf42 devid 1 transid 35 /dev/mapper/usr (254:0) scanned by mount (684)
Sep 5 06:03:12.169055 kernel: BTRFS info (device dm-0): first mount of filesystem 98069635-e988-4e04-b156-f40a4a69cf42
Sep 5 06:03:12.169109 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Sep 5 06:03:12.179930 kernel: BTRFS info (device dm-0): enabling ssd optimizations
Sep 5 06:03:12.180001 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Sep 5 06:03:12.180021 kernel: BTRFS info (device dm-0): enabling free space tree
Sep 5 06:03:12.184134 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Sep 5 06:03:12.184567 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Sep 5 06:03:12.185300 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments...
Sep 5 06:03:12.186622 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Sep 5 06:03:12.213571 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (707)
Sep 5 06:03:12.213611 kernel: BTRFS info (device sda6): first mount of filesystem b74bbc0c-6da1-4206-9f48-c70f629ccdff
Sep 5 06:03:12.213620 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Sep 5 06:03:12.222263 kernel: BTRFS info (device sda6): enabling ssd optimizations
Sep 5 06:03:12.222314 kernel: BTRFS info (device sda6): enabling free space tree
Sep 5 06:03:12.227575 kernel: BTRFS info (device sda6): last unmount of filesystem b74bbc0c-6da1-4206-9f48-c70f629ccdff
Sep 5 06:03:12.232322 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Sep 5 06:03:12.233211 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Sep 5 06:03:12.332659 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments.
Sep 5 06:03:12.334645 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Sep 5 06:03:12.418944 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 5 06:03:12.419935 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 5 06:03:12.443079 systemd-networkd[862]: lo: Link UP
Sep 5 06:03:12.443300 systemd-networkd[862]: lo: Gained carrier
Sep 5 06:03:12.444138 systemd-networkd[862]: Enumeration completed
Sep 5 06:03:12.444302 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 5 06:03:12.444448 systemd[1]: Reached target network.target - Network.
Sep 5 06:03:12.444705 systemd-networkd[862]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network.
Sep 5 06:03:12.448402 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated
Sep 5 06:03:12.448506 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps
Sep 5 06:03:12.446577 systemd-networkd[862]: ens192: Link UP
Sep 5 06:03:12.446579 systemd-networkd[862]: ens192: Gained carrier
Sep 5 06:03:12.469801 ignition[726]: Ignition 2.22.0
Sep 5 06:03:12.470051 ignition[726]: Stage: fetch-offline
Sep 5 06:03:12.470078 ignition[726]: no configs at "/usr/lib/ignition/base.d"
Sep 5 06:03:12.470084 ignition[726]: no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Sep 5 06:03:12.470138 ignition[726]: parsed url from cmdline: ""
Sep 5 06:03:12.470140 ignition[726]: no config URL provided
Sep 5 06:03:12.470143 ignition[726]: reading system config file "/usr/lib/ignition/user.ign"
Sep 5 06:03:12.470148 ignition[726]: no config at "/usr/lib/ignition/user.ign"
Sep 5 06:03:12.470526 ignition[726]: config successfully fetched
Sep 5 06:03:12.470547 ignition[726]: parsing config with SHA512: 689beacb469598211d6c2e866694d067ee74594c2293ae78cb2b8e321a26aa6c740dac56bd78a110a374ee57fa303d6915dbd29ee0692b0544c9c9e98a36fdde
Sep 5 06:03:12.473748 unknown[726]: fetched base config from "system"
Sep 5 06:03:12.473754 unknown[726]: fetched user config from "vmware"
Sep 5 06:03:12.473963 ignition[726]: fetch-offline: fetch-offline passed
Sep 5 06:03:12.473999 ignition[726]: Ignition finished successfully
Sep 5 06:03:12.474955 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 5 06:03:12.475443 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json).
Sep 5 06:03:12.476062 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Sep 5 06:03:12.496788 ignition[867]: Ignition 2.22.0
Sep 5 06:03:12.496801 ignition[867]: Stage: kargs
Sep 5 06:03:12.496900 ignition[867]: no configs at "/usr/lib/ignition/base.d"
Sep 5 06:03:12.496907 ignition[867]: no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Sep 5 06:03:12.497427 ignition[867]: kargs: kargs passed
Sep 5 06:03:12.497460 ignition[867]: Ignition finished successfully
Sep 5 06:03:12.498891 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Sep 5 06:03:12.499658 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Sep 5 06:03:12.520986 ignition[874]: Ignition 2.22.0
Sep 5 06:03:12.521000 ignition[874]: Stage: disks
Sep 5 06:03:12.521080 ignition[874]: no configs at "/usr/lib/ignition/base.d"
Sep 5 06:03:12.521085 ignition[874]: no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Sep 5 06:03:12.521704 ignition[874]: disks: disks passed
Sep 5 06:03:12.521734 ignition[874]: Ignition finished successfully
Sep 5 06:03:12.522653 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Sep 5 06:03:12.523039 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Sep 5 06:03:12.523151 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Sep 5 06:03:12.523259 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 5 06:03:12.523350 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 5 06:03:12.523443 systemd[1]: Reached target basic.target - Basic System.
Sep 5 06:03:12.524935 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Sep 5 06:03:12.551456 systemd-fsck[882]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
Sep 5 06:03:12.553147 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Sep 5 06:03:12.554429 systemd[1]: Mounting sysroot.mount - /sysroot...
Sep 5 06:03:12.738493 systemd[1]: Mounted sysroot.mount - /sysroot.
Sep 5 06:03:12.738790 kernel: EXT4-fs (sda9): mounted filesystem 5e58259f-916a-43e8-ae75-b44bea97e14e r/w with ordered data mode. Quota mode: none.
Sep 5 06:03:12.739025 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Sep 5 06:03:12.744026 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 5 06:03:12.745602 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Sep 5 06:03:12.745904 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Sep 5 06:03:12.745932 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Sep 5 06:03:12.745951 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 5 06:03:12.758744 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Sep 5 06:03:12.760125 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Sep 5 06:03:12.768597 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (890)
Sep 5 06:03:12.771101 kernel: BTRFS info (device sda6): first mount of filesystem b74bbc0c-6da1-4206-9f48-c70f629ccdff
Sep 5 06:03:12.771136 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Sep 5 06:03:12.777049 kernel: BTRFS info (device sda6): enabling ssd optimizations
Sep 5 06:03:12.777095 kernel: BTRFS info (device sda6): enabling free space tree
Sep 5 06:03:12.778745 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 5 06:03:12.878692 initrd-setup-root[914]: cut: /sysroot/etc/passwd: No such file or directory
Sep 5 06:03:12.882905 initrd-setup-root[921]: cut: /sysroot/etc/group: No such file or directory
Sep 5 06:03:12.888458 initrd-setup-root[928]: cut: /sysroot/etc/shadow: No such file or directory
Sep 5 06:03:12.891001 initrd-setup-root[935]: cut: /sysroot/etc/gshadow: No such file or directory
Sep 5 06:03:13.054851 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Sep 5 06:03:13.055959 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Sep 5 06:03:13.057633 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Sep 5 06:03:13.074573 kernel: BTRFS info (device sda6): last unmount of filesystem b74bbc0c-6da1-4206-9f48-c70f629ccdff
Sep 5 06:03:13.094164 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Sep 5 06:03:13.097830 ignition[1003]: INFO : Ignition 2.22.0
Sep 5 06:03:13.097830 ignition[1003]: INFO : Stage: mount
Sep 5 06:03:13.098245 ignition[1003]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 5 06:03:13.098245 ignition[1003]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Sep 5 06:03:13.098759 ignition[1003]: INFO : mount: mount passed
Sep 5 06:03:13.099582 ignition[1003]: INFO : Ignition finished successfully
Sep 5 06:03:13.100312 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Sep 5 06:03:13.101099 systemd[1]: Starting ignition-files.service - Ignition (files)...
Sep 5 06:03:13.163460 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Sep 5 06:03:13.164507 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 5 06:03:13.190585 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1014)
Sep 5 06:03:13.190630 kernel: BTRFS info (device sda6): first mount of filesystem b74bbc0c-6da1-4206-9f48-c70f629ccdff
Sep 5 06:03:13.192838 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Sep 5 06:03:13.196644 kernel: BTRFS info (device sda6): enabling ssd optimizations
Sep 5 06:03:13.196691 kernel: BTRFS info (device sda6): enabling free space tree
Sep 5 06:03:13.198477 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 5 06:03:13.224274 ignition[1030]: INFO : Ignition 2.22.0
Sep 5 06:03:13.224274 ignition[1030]: INFO : Stage: files
Sep 5 06:03:13.224274 ignition[1030]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 5 06:03:13.224274 ignition[1030]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Sep 5 06:03:13.225320 ignition[1030]: DEBUG : files: compiled without relabeling support, skipping
Sep 5 06:03:13.225861 ignition[1030]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Sep 5 06:03:13.225861 ignition[1030]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Sep 5 06:03:13.227375 ignition[1030]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Sep 5 06:03:13.227675 ignition[1030]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Sep 5 06:03:13.228170 unknown[1030]: wrote ssh authorized keys file for user: core
Sep 5 06:03:13.228492 ignition[1030]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Sep 5 06:03:13.232119 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz"
Sep 5 06:03:13.232119 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1
Sep 5 06:03:13.440359 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Sep 5 06:03:13.916593 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz"
Sep 5 06:03:13.916593 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Sep 5 06:03:13.916593 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Sep 5 06:03:13.916593 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Sep 5 06:03:13.916593 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Sep 5 06:03:13.916593 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 5 06:03:13.916593 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 5 06:03:13.917840 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 5 06:03:13.917840 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 5 06:03:13.923768 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Sep 5 06:03:13.923943 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 5 06:03:13.923943 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Sep 5 06:03:13.932422 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Sep 5 06:03:13.932677 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Sep 5 06:03:13.932677 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1
Sep 5 06:03:14.285679 systemd-networkd[862]: ens192: Gained IPv6LL
Sep 5 06:03:14.365621 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Sep 5 06:03:14.692962 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Sep 5 06:03:14.693278 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Sep 5 06:03:14.693796 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Sep 5 06:03:14.693796 ignition[1030]: INFO : files: op(c): [started] processing unit "prepare-helm.service"
Sep 5 06:03:14.694132 ignition[1030]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 5 06:03:14.694343 ignition[1030]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 5 06:03:14.694343 ignition[1030]: INFO : files: op(c): [finished] processing unit "prepare-helm.service"
Sep 5 06:03:14.694343 ignition[1030]: INFO : files: op(e): [started] processing unit "coreos-metadata.service"
Sep 5 06:03:14.694343 ignition[1030]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Sep 5 06:03:14.694343 ignition[1030]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Sep 5 06:03:14.694343 ignition[1030]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service"
Sep 5 06:03:14.694343 ignition[1030]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service"
Sep 5 06:03:14.794307 ignition[1030]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service"
Sep 5 06:03:14.796894 ignition[1030]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Sep 5 06:03:14.797167 ignition[1030]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service"
Sep 5 06:03:14.797167 ignition[1030]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service"
Sep 5 06:03:14.797167 ignition[1030]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service"
Sep 5 06:03:14.797167 ignition[1030]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 5 06:03:14.798620 ignition[1030]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json"
Sep 5 06:03:14.798620 ignition[1030]: INFO : files: files passed
Sep 5 06:03:14.798620 ignition[1030]: INFO : Ignition finished successfully
Sep 5 06:03:14.799096 systemd[1]: Finished ignition-files.service - Ignition (files).
Sep 5 06:03:14.800138 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Sep 5 06:03:14.801629 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 5 06:03:14.814011 systemd[1]: ignition-quench.service: Deactivated successfully.
Sep 5 06:03:14.814074 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Sep 5 06:03:14.818321 initrd-setup-root-after-ignition[1062]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 5 06:03:14.818321 initrd-setup-root-after-ignition[1062]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Sep 5 06:03:14.819455 initrd-setup-root-after-ignition[1066]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 5 06:03:14.820381 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 5 06:03:14.820768 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Sep 5 06:03:14.821464 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Sep 5 06:03:14.850654 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 5 06:03:14.850732 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Sep 5 06:03:14.851084 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Sep 5 06:03:14.851306 systemd[1]: Reached target initrd.target - Initrd Default Target.
Sep 5 06:03:14.851503 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Sep 5 06:03:14.851949 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Sep 5 06:03:14.860756 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 5 06:03:14.861467 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Sep 5 06:03:14.875609 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Sep 5 06:03:14.875929 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 5 06:03:14.876313 systemd[1]: Stopped target timers.target - Timer Units.
Sep 5 06:03:14.876634 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 5 06:03:14.876837 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 5 06:03:14.877299 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Sep 5 06:03:14.877610 systemd[1]: Stopped target basic.target - Basic System.
Sep 5 06:03:14.877875 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Sep 5 06:03:14.878163 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 5 06:03:14.878433 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Sep 5 06:03:14.878749 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Sep 5 06:03:14.879037 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Sep 5 06:03:14.879290 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 5 06:03:14.879629 systemd[1]: Stopped target sysinit.target - System Initialization.
Sep 5 06:03:14.879922 systemd[1]: Stopped target local-fs.target - Local File Systems.
Sep 5 06:03:14.880195 systemd[1]: Stopped target swap.target - Swaps.
Sep 5 06:03:14.880431 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 5 06:03:14.880622 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Sep 5 06:03:14.881007 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Sep 5 06:03:14.881298 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 5 06:03:14.881605 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Sep 5 06:03:14.881767 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 5 06:03:14.882058 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 5 06:03:14.882129 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Sep 5 06:03:14.882563 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Sep 5 06:03:14.882632 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 5 06:03:14.883052 systemd[1]: Stopped target paths.target - Path Units.
Sep 5 06:03:14.883289 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 5 06:03:14.886574 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 5 06:03:14.886728 systemd[1]: Stopped target slices.target - Slice Units.
Sep 5 06:03:14.886853 systemd[1]: Stopped target sockets.target - Socket Units.
Sep 5 06:03:14.886971 systemd[1]: iscsid.socket: Deactivated successfully.
Sep 5 06:03:14.887023 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Sep 5 06:03:14.887151 systemd[1]: iscsiuio.socket: Deactivated successfully.
Sep 5 06:03:14.887198 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 5 06:03:14.887354 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Sep 5 06:03:14.887421 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 5 06:03:14.887587 systemd[1]: ignition-files.service: Deactivated successfully.
Sep 5 06:03:14.887645 systemd[1]: Stopped ignition-files.service - Ignition (files).
Sep 5 06:03:14.889646 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Sep 5 06:03:14.890142 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Sep 5 06:03:14.890242 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 5 06:03:14.890306 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 5 06:03:14.890458 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 5 06:03:14.890515 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 5 06:03:14.893319 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 5 06:03:14.895894 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Sep 5 06:03:14.906166 ignition[1086]: INFO : Ignition 2.22.0
Sep 5 06:03:14.906479 ignition[1086]: INFO : Stage: umount
Sep 5 06:03:14.906719 ignition[1086]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 5 06:03:14.906854 ignition[1086]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Sep 5 06:03:14.906947 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Sep 5 06:03:14.907802 ignition[1086]: INFO : umount: umount passed
Sep 5 06:03:14.907953 ignition[1086]: INFO : Ignition finished successfully
Sep 5 06:03:14.909488 systemd[1]: ignition-mount.service: Deactivated successfully.
Sep 5 06:03:14.909551 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Sep 5 06:03:14.911041 systemd[1]: Stopped target network.target - Network.
Sep 5 06:03:14.911161 systemd[1]: ignition-disks.service: Deactivated successfully.
Sep 5 06:03:14.911207 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Sep 5 06:03:14.911355 systemd[1]: ignition-kargs.service: Deactivated successfully.
Sep 5 06:03:14.911381 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Sep 5 06:03:14.911508 systemd[1]: ignition-setup.service: Deactivated successfully.
Sep 5 06:03:14.911531 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Sep 5 06:03:14.912886 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Sep 5 06:03:14.912924 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Sep 5 06:03:14.913418 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Sep 5 06:03:14.913888 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Sep 5 06:03:14.918770 systemd[1]: systemd-resolved.service: Deactivated successfully.
Sep 5 06:03:14.919025 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Sep 5 06:03:14.921395 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Sep 5 06:03:14.921882 systemd[1]: systemd-networkd.service: Deactivated successfully.
Sep 5 06:03:14.922099 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Sep 5 06:03:14.923077 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Sep 5 06:03:14.923810 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Sep 5 06:03:14.924106 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Sep 5 06:03:14.924292 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Sep 5 06:03:14.925185 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Sep 5 06:03:14.925440 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Sep 5 06:03:14.925593 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 5 06:03:14.926098 systemd[1]: afterburn-network-kargs.service: Deactivated successfully.
Sep 5 06:03:14.926257 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments.
Sep 5 06:03:14.926586 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 5 06:03:14.926616 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Sep 5 06:03:14.926927 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 5 06:03:14.926955 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Sep 5 06:03:14.927376 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 5 06:03:14.927400 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 5 06:03:14.928772 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 5 06:03:14.929925 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 5 06:03:14.929970 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Sep 5 06:03:14.938839 systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 5 06:03:14.938937 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 5 06:03:14.939319 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 5 06:03:14.939370 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Sep 5 06:03:14.939574 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 5 06:03:14.939596 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 5 06:03:14.939855 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 5 06:03:14.939881 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Sep 5 06:03:14.941237 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 5 06:03:14.941268 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Sep 5 06:03:14.941632 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 5 06:03:14.941658 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 5 06:03:14.943034 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Sep 5 06:03:14.943277 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Sep 5 06:03:14.943406 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Sep 5 06:03:14.944628 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 5 06:03:14.944756 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 5 06:03:14.945138 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Sep 5 06:03:14.945194 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 5 06:03:14.945674 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 5 06:03:14.945810 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 5 06:03:14.946061 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 5 06:03:14.946089 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 5 06:03:14.947796 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Sep 5 06:03:14.947961 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully.
Sep 5 06:03:14.947982 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Sep 5 06:03:14.948003 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Sep 5 06:03:14.948243 systemd[1]: network-cleanup.service: Deactivated successfully.
Sep 5 06:03:14.948305 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Sep 5 06:03:14.951093 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 5 06:03:14.951286 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Sep 5 06:03:14.990628 systemd[1]: sysroot-boot.service: Deactivated successfully.
Sep 5 06:03:14.990703 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Sep 5 06:03:14.990986 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Sep 5 06:03:14.991105 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Sep 5 06:03:14.991139 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Sep 5 06:03:14.992527 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Sep 5 06:03:15.013762 systemd[1]: Switching root.
Sep 5 06:03:15.066757 systemd-journald[244]: Journal stopped
Sep 5 06:03:18.417132 systemd-journald[244]: Received SIGTERM from PID 1 (systemd).
Sep 5 06:03:18.417162 kernel: SELinux: policy capability network_peer_controls=1
Sep 5 06:03:18.417171 kernel: SELinux: policy capability open_perms=1
Sep 5 06:03:18.417177 kernel: SELinux: policy capability extended_socket_class=1
Sep 5 06:03:18.417182 kernel: SELinux: policy capability always_check_network=0
Sep 5 06:03:18.417189 kernel: SELinux: policy capability cgroup_seclabel=1
Sep 5 06:03:18.417196 kernel: SELinux: policy capability nnp_nosuid_transition=1
Sep 5 06:03:18.417202 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Sep 5 06:03:18.417207 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Sep 5 06:03:18.417213 kernel: SELinux: policy capability userspace_initial_context=0
Sep 5 06:03:18.417219 kernel: audit: type=1403 audit(1757052196.305:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 5 06:03:18.417226 systemd[1]: Successfully loaded SELinux policy in 103.364ms.
Sep 5 06:03:18.417234 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 4.939ms.
Sep 5 06:03:18.417242 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 5 06:03:18.417249 systemd[1]: Detected virtualization vmware.
Sep 5 06:03:18.417256 systemd[1]: Detected architecture x86-64.
Sep 5 06:03:18.417264 systemd[1]: Detected first boot.
Sep 5 06:03:18.417271 systemd[1]: Initializing machine ID from random generator.
Sep 5 06:03:18.417279 zram_generator::config[1129]: No configuration found.
Sep 5 06:03:18.417374 kernel: vmw_vmci 0000:00:07.7: Using capabilities 0xc
Sep 5 06:03:18.417385 kernel: Guest personality initialized and is active
Sep 5 06:03:18.417391 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
Sep 5 06:03:18.417397 kernel: Initialized host personality
Sep 5 06:03:18.417405 kernel: NET: Registered PF_VSOCK protocol family
Sep 5 06:03:18.417412 systemd[1]: Populated /etc with preset unit settings.
Sep 5 06:03:18.417420 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Sep 5 06:03:18.417427 systemd[1]: COREOS_CUSTOM_PUBLIC_IPV4=$(ip addr show ens192 | grep -v "inet 10." | grep -Po "inet \K[\d.]+")" > ${OUTPUT}"
Sep 5 06:03:18.417434 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Sep 5 06:03:18.417441 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 5 06:03:18.417447 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Sep 5 06:03:18.417454 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 5 06:03:18.417462 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Sep 5 06:03:18.417469 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Sep 5 06:03:18.417475 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Sep 5 06:03:18.417482 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Sep 5 06:03:18.417489 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Sep 5 06:03:18.417496 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Sep 5 06:03:18.417504 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Sep 5 06:03:18.417511 systemd[1]: Created slice user.slice - User and Session Slice.
Sep 5 06:03:18.417518 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 5 06:03:18.417527 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 5 06:03:18.417534 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 5 06:03:18.417541 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 5 06:03:18.417548 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 5 06:03:18.417574 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 5 06:03:18.417585 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Sep 5 06:03:18.417592 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 5 06:03:18.417599 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 5 06:03:18.417606 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Sep 5 06:03:18.417612 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Sep 5 06:03:18.417619 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Sep 5 06:03:18.417626 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 5 06:03:18.417633 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 5 06:03:18.417641 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 5 06:03:18.417648 systemd[1]: Reached target slices.target - Slice Units.
Sep 5 06:03:18.417655 systemd[1]: Reached target swap.target - Swaps.
Sep 5 06:03:18.417662 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 5 06:03:18.417669 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 5 06:03:18.417677 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Sep 5 06:03:18.417684 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 5 06:03:18.417691 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 5 06:03:18.417698 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 5 06:03:18.417705 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 5 06:03:18.417713 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 5 06:03:18.417720 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 5 06:03:18.417727 systemd[1]: Mounting media.mount - External Media Directory...
Sep 5 06:03:18.417735 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 5 06:03:18.417742 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 5 06:03:18.417749 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 5 06:03:18.417756 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 5 06:03:18.417763 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 5 06:03:18.417770 systemd[1]: Reached target machines.target - Containers.
Sep 5 06:03:18.417777 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 5 06:03:18.417784 systemd[1]: Starting ignition-delete-config.service - Ignition (delete config)...
Sep 5 06:03:18.417793 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 5 06:03:18.417800 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 5 06:03:18.417807 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 5 06:03:18.417814 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 5 06:03:18.417821 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 5 06:03:18.417828 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 5 06:03:18.417835 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 5 06:03:18.417841 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 5 06:03:18.417849 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 5 06:03:18.417856 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Sep 5 06:03:18.417864 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Sep 5 06:03:18.417871 systemd[1]: Stopped systemd-fsck-usr.service.
Sep 5 06:03:18.417880 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 5 06:03:18.417887 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 5 06:03:18.417894 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 5 06:03:18.417901 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 5 06:03:18.417908 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 5 06:03:18.417917 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Sep 5 06:03:18.417924 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 5 06:03:18.417931 systemd[1]: verity-setup.service: Deactivated successfully.
Sep 5 06:03:18.417938 systemd[1]: Stopped verity-setup.service.
Sep 5 06:03:18.417945 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 5 06:03:18.417952 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 5 06:03:18.417959 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 5 06:03:18.417966 systemd[1]: Mounted media.mount - External Media Directory.
Sep 5 06:03:18.417974 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 5 06:03:18.417981 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 5 06:03:18.417988 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 5 06:03:18.417995 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 5 06:03:18.418002 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 5 06:03:18.418009 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 5 06:03:18.418016 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 5 06:03:18.418023 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 5 06:03:18.418030 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 5 06:03:18.418039 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 5 06:03:18.418046 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 5 06:03:18.418053 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 5 06:03:18.418060 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 5 06:03:18.418066 kernel: fuse: init (API version 7.41)
Sep 5 06:03:18.418073 kernel: loop: module loaded
Sep 5 06:03:18.418079 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 5 06:03:18.418087 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 5 06:03:18.418095 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 5 06:03:18.418102 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 5 06:03:18.418111 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 5 06:03:18.418120 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 5 06:03:18.418127 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 5 06:03:18.418134 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 5 06:03:18.418142 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 5 06:03:18.418150 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 5 06:03:18.418157 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Sep 5 06:03:18.418164 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 5 06:03:18.418173 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 5 06:03:18.418198 systemd-journald[1215]: Collecting audit messages is disabled.
Sep 5 06:03:18.418216 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 5 06:03:18.418225 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 5 06:03:18.418233 systemd-journald[1215]: Journal started
Sep 5 06:03:18.418248 systemd-journald[1215]: Runtime Journal (/run/log/journal/fbbf65f1d8f942c1a5964e9c2d6e6928) is 4.8M, max 38.8M, 34M free.
Sep 5 06:03:18.161582 systemd[1]: Queued start job for default target multi-user.target.
Sep 5 06:03:18.177013 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Sep 5 06:03:18.177294 systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 5 06:03:18.425257 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 5 06:03:18.425289 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 5 06:03:18.425528 jq[1199]: true
Sep 5 06:03:18.426075 jq[1226]: true
Sep 5 06:03:18.431001 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 5 06:03:18.431043 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 5 06:03:18.432631 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 5 06:03:18.432929 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Sep 5 06:03:18.433099 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 5 06:03:18.447227 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 5 06:03:18.448932 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 5 06:03:18.454845 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 5 06:03:18.460592 kernel: ACPI: bus type drm_connector registered
Sep 5 06:03:18.469969 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 5 06:03:18.470485 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 5 06:03:18.471443 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 5 06:03:18.474702 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 5 06:03:18.474993 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 5 06:03:18.478124 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Sep 5 06:03:18.478784 systemd-tmpfiles[1233]: ACLs are not supported, ignoring.
Sep 5 06:03:18.478804 systemd-tmpfiles[1233]: ACLs are not supported, ignoring.
Sep 5 06:03:18.486695 ignition[1239]: Ignition 2.22.0
Sep 5 06:03:18.488828 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 5 06:03:18.486899 ignition[1239]: deleting config from guestinfo properties
Sep 5 06:03:18.491673 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 5 06:03:18.535190 ignition[1239]: Successfully deleted config
Sep 5 06:03:18.536755 systemd[1]: Finished ignition-delete-config.service - Ignition (delete config).
Sep 5 06:03:18.545010 kernel: loop0: detected capacity change from 0 to 128016
Sep 5 06:03:18.546623 systemd-journald[1215]: Time spent on flushing to /var/log/journal/fbbf65f1d8f942c1a5964e9c2d6e6928 is 37.097ms for 1779 entries.
Sep 5 06:03:18.546623 systemd-journald[1215]: System Journal (/var/log/journal/fbbf65f1d8f942c1a5964e9c2d6e6928) is 8M, max 584.8M, 576.8M free.
Sep 5 06:03:18.591735 systemd-journald[1215]: Received client request to flush runtime journal.
Sep 5 06:03:18.562871 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 5 06:03:18.592473 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 5 06:03:18.668734 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Sep 5 06:03:18.700651 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 5 06:03:18.702187 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 5 06:03:18.705987 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 5 06:03:18.719728 systemd-tmpfiles[1301]: ACLs are not supported, ignoring.
Sep 5 06:03:18.719741 systemd-tmpfiles[1301]: ACLs are not supported, ignoring.
Sep 5 06:03:18.721735 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 5 06:03:18.724570 kernel: loop1: detected capacity change from 0 to 2960
Sep 5 06:03:18.892577 kernel: loop2: detected capacity change from 0 to 111000
Sep 5 06:03:19.077576 kernel: loop3: detected capacity change from 0 to 224512
Sep 5 06:03:19.178274 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 5 06:03:19.301587 kernel: loop4: detected capacity change from 0 to 128016
Sep 5 06:03:19.525593 kernel: loop5: detected capacity change from 0 to 2960
Sep 5 06:03:19.535258 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 5 06:03:19.536693 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 5 06:03:19.565136 systemd-udevd[1309]: Using default interface naming scheme 'v255'.
Sep 5 06:03:19.586034 kernel: loop6: detected capacity change from 0 to 111000
Sep 5 06:03:19.639371 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 5 06:03:19.642649 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 5 06:03:19.675643 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 5 06:03:19.682577 kernel: loop7: detected capacity change from 0 to 224512
Sep 5 06:03:19.683970 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Sep 5 06:03:19.744932 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Sep 5 06:03:19.800694 (sd-merge)[1307]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-vmware'.
Sep 5 06:03:19.801752 (sd-merge)[1307]: Merged extensions into '/usr'.
Sep 5 06:03:19.805572 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3
Sep 5 06:03:19.819588 kernel: ACPI: button: Power Button [PWRF]
Sep 5 06:03:19.822831 systemd[1]: Reload requested from client PID 1253 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 5 06:03:19.822842 systemd[1]: Reloading...
Sep 5 06:03:19.826440 kernel: mousedev: PS/2 mouse device common for all mice
Sep 5 06:03:19.863576 zram_generator::config[1376]: No configuration found.
Sep 5 06:03:19.876318 systemd-networkd[1316]: lo: Link UP
Sep 5 06:03:19.876323 systemd-networkd[1316]: lo: Gained carrier
Sep 5 06:03:19.879047 systemd-networkd[1316]: Enumeration completed
Sep 5 06:03:19.879270 systemd-networkd[1316]: ens192: Configuring with /etc/systemd/network/00-vmware.network.
Sep 5 06:03:19.883604 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated
Sep 5 06:03:19.883765 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps
Sep 5 06:03:19.888692 systemd-networkd[1316]: ens192: Link UP
Sep 5 06:03:19.888789 systemd-networkd[1316]: ens192: Gained carrier
Sep 5 06:03:19.991577 kernel: piix4_smbus 0000:00:07.3: SMBus Host Controller not enabled!
Sep 5 06:03:20.018272 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Sep 5 06:03:20.032596 (udev-worker)[1315]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte.
Sep 5 06:03:20.084452 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM.
Sep 5 06:03:20.084840 systemd[1]: Reloading finished in 261 ms.
Sep 5 06:03:20.094658 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 5 06:03:20.095328 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 5 06:03:20.108983 systemd[1]: Starting ensure-sysext.service...
Sep 5 06:03:20.111664 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Sep 5 06:03:20.113695 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Sep 5 06:03:20.118409 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Sep 5 06:03:20.122674 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 5 06:03:20.132764 systemd[1]: Reload requested from client PID 1450 ('systemctl') (unit ensure-sysext.service)...
Sep 5 06:03:20.132773 systemd[1]: Reloading...
Sep 5 06:03:20.150925 systemd-tmpfiles[1454]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Sep 5 06:03:20.151677 systemd-tmpfiles[1454]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Sep 5 06:03:20.151902 systemd-tmpfiles[1454]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 5 06:03:20.152109 systemd-tmpfiles[1454]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 5 06:03:20.153685 systemd-tmpfiles[1454]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 5 06:03:20.153868 systemd-tmpfiles[1454]: ACLs are not supported, ignoring.
Sep 5 06:03:20.153903 systemd-tmpfiles[1454]: ACLs are not supported, ignoring.
Sep 5 06:03:20.183622 zram_generator::config[1488]: No configuration found.
Sep 5 06:03:20.186512 systemd-tmpfiles[1454]: Detected autofs mount point /boot during canonicalization of boot.
Sep 5 06:03:20.186518 systemd-tmpfiles[1454]: Skipping /boot
Sep 5 06:03:20.192209 systemd-tmpfiles[1454]: Detected autofs mount point /boot during canonicalization of boot.
Sep 5 06:03:20.192216 systemd-tmpfiles[1454]: Skipping /boot
Sep 5 06:03:20.274173 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Sep 5 06:03:20.339180 systemd[1]: Reloading finished in 206 ms.
Sep 5 06:03:20.376059 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Sep 5 06:03:20.376485 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Sep 5 06:03:20.376902 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 5 06:03:20.384664 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 5 06:03:20.385553 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 5 06:03:20.403026 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 5 06:03:20.403798 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 5 06:03:20.406438 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 5 06:03:20.408506 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 5 06:03:20.408756 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 5 06:03:20.408832 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 5 06:03:20.412421 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 5 06:03:20.415772 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 5 06:03:20.417709 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 5 06:03:20.419496 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 5 06:03:20.421458 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 5 06:03:20.423488 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 5 06:03:20.424672 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 5 06:03:20.425084 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 5 06:03:20.425262 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 5 06:03:20.425852 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 5 06:03:20.425965 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 5 06:03:20.430225 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 5 06:03:20.432822 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 5 06:03:20.433770 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 5 06:03:20.434707 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 5 06:03:20.434855 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 5 06:03:20.434924 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 5 06:03:20.434983 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 5 06:03:20.443076 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 5 06:03:20.447429 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 5 06:03:20.447627 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 5 06:03:20.447651 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 5 06:03:20.447700 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 5 06:03:20.448039 systemd[1]: Finished ensure-sysext.service.
Sep 5 06:03:20.448306 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Sep 5 06:03:20.457422 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Sep 5 06:03:20.457708 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 5 06:03:20.457845 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 5 06:03:20.458088 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 5 06:03:20.462998 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 5 06:03:20.463771 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 5 06:03:20.465497 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 5 06:03:20.466653 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 5 06:03:20.466978 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 5 06:03:20.467200 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 5 06:03:20.468179 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 5 06:03:20.499530 systemd-resolved[1554]: Positive Trust Anchors:
Sep 5 06:03:20.500094 systemd-resolved[1554]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 5 06:03:20.500149 systemd-resolved[1554]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 5 06:03:20.519776 systemd-resolved[1554]: Defaulting to hostname 'linux'.
Sep 5 06:03:20.552984 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Sep 5 06:03:20.553258 systemd[1]: Reached target time-set.target - System Time Set.
Sep 5 06:03:20.554092 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 5 06:03:20.554245 systemd[1]: Reached target network.target - Network.
Sep 5 06:03:20.554334 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 5 06:03:20.559465 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Sep 5 06:03:20.564457 augenrules[1595]: No rules
Sep 5 06:03:20.565139 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 5 06:03:20.565346 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 5 06:03:20.574638 ldconfig[1245]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 5 06:03:20.577641 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Sep 5 06:03:20.578820 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Sep 5 06:03:20.592769 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Sep 5 06:03:20.604485 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 5 06:03:20.617365 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Sep 5 06:03:20.617709 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Sep 5 06:03:20.617781 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 5 06:03:20.617976 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Sep 5 06:03:20.618136 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Sep 5 06:03:20.618287 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
Sep 5 06:03:20.618505 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Sep 5 06:03:20.618694 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Sep 5 06:03:20.618839 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Sep 5 06:03:20.618980 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Sep 5 06:03:20.619033 systemd[1]: Reached target paths.target - Path Units.
Sep 5 06:03:20.619151 systemd[1]: Reached target timers.target - Timer Units.
Sep 5 06:03:20.619814 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Sep 5 06:03:20.621054 systemd[1]: Starting docker.socket - Docker Socket for the API...
Sep 5 06:03:20.622506 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Sep 5 06:03:20.622772 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Sep 5 06:03:20.622933 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Sep 5 06:03:20.626545 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Sep 5 06:03:20.627148 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Sep 5 06:03:20.627927 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Sep 5 06:03:20.628775 systemd[1]: Reached target sockets.target - Socket Units.
Sep 5 06:03:20.629049 systemd[1]: Reached target basic.target - Basic System.
Sep 5 06:03:20.629261 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Sep 5 06:03:20.629347 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Sep 5 06:03:20.630591 systemd[1]: Starting containerd.service - containerd container runtime...
Sep 5 06:03:20.633664 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Sep 5 06:03:20.638665 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Sep 5 06:03:20.639616 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Sep 5 06:03:20.641752 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Sep 5 06:03:20.641882 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Sep 5 06:03:20.644668 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
Sep 5 06:03:20.647673 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Sep 5 06:03:20.651525 jq[1613]: false
Sep 5 06:03:20.650607 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Sep 5 06:03:20.654761 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Sep 5 06:03:20.657823 extend-filesystems[1614]: Found /dev/sda6
Sep 5 06:03:20.659205 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Sep 5 06:03:20.663320 extend-filesystems[1614]: Found /dev/sda9
Sep 5 06:03:20.662895 oslogin_cache_refresh[1615]: Refreshing passwd entry cache
Sep 5 06:03:20.665550 google_oslogin_nss_cache[1615]: oslogin_cache_refresh[1615]: Refreshing passwd entry cache
Sep 5 06:03:20.664232 systemd[1]: Starting systemd-logind.service - User Login Management...
Sep 5 06:03:20.664876 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Sep 5 06:03:20.665422 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Sep 5 06:03:20.665939 extend-filesystems[1614]: Checking size of /dev/sda9
Sep 5 06:03:20.668749 systemd[1]: Starting update-engine.service - Update Engine...
Sep 5 06:03:20.674369 oslogin_cache_refresh[1615]: Failure getting users, quitting
Sep 5 06:03:20.674760 google_oslogin_nss_cache[1615]: oslogin_cache_refresh[1615]: Failure getting users, quitting
Sep 5 06:03:20.674760 google_oslogin_nss_cache[1615]: oslogin_cache_refresh[1615]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Sep 5 06:03:20.674760 google_oslogin_nss_cache[1615]: oslogin_cache_refresh[1615]: Refreshing group entry cache
Sep 5 06:03:20.674385 oslogin_cache_refresh[1615]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Sep 5 06:03:20.674425 oslogin_cache_refresh[1615]: Refreshing group entry cache
Sep 5 06:03:20.677207 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Sep 5 06:03:20.679752 google_oslogin_nss_cache[1615]: oslogin_cache_refresh[1615]: Failure getting groups, quitting
Sep 5 06:03:20.679752 google_oslogin_nss_cache[1615]: oslogin_cache_refresh[1615]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Sep 5 06:03:20.679744 oslogin_cache_refresh[1615]: Failure getting groups, quitting
Sep 5 06:03:20.679753 oslogin_cache_refresh[1615]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Sep 5 06:03:20.680824 systemd[1]: Starting vgauthd.service - VGAuth Service for open-vm-tools...
Sep 5 06:03:20.683842 extend-filesystems[1614]: Old size kept for /dev/sda9
Sep 5 06:03:20.683425 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Sep 5 06:03:20.683840 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Sep 5 06:03:20.683967 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Sep 5 06:03:20.684110 systemd[1]: extend-filesystems.service: Deactivated successfully.
Sep 5 06:03:20.684224 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Sep 5 06:03:20.684705 systemd[1]: google-oslogin-cache.service: Deactivated successfully.
Sep 5 06:03:20.685604 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh.
Sep 5 06:03:20.688383 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Sep 5 06:03:20.688518 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Sep 5 06:03:20.711576 jq[1632]: true
Sep 5 06:03:20.713666 update_engine[1628]: I20250905 06:03:20.713610  1628 main.cc:92] Flatcar Update Engine starting
Sep 5 06:03:20.715671 systemd[1]: motdgen.service: Deactivated successfully.
Sep 5 06:03:20.715872 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Sep 5 06:03:20.727572 tar[1638]: linux-amd64/LICENSE
Sep 5 06:03:20.727572 tar[1638]: linux-amd64/helm
Sep 5 06:03:20.727825 (ntainerd)[1653]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Sep 5 06:03:20.739465 systemd[1]: Started vgauthd.service - VGAuth Service for open-vm-tools.
Sep 5 06:03:20.741593 systemd[1]: Starting vmtoolsd.service - Service for virtual machines hosted on VMware...
Sep 5 06:03:20.744946 jq[1656]: true
Sep 5 06:03:20.762111 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Sep 5 06:03:20.762014 dbus-daemon[1611]: [system] SELinux support is enabled
Sep 5 06:03:20.772520 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Sep 5 06:03:20.772575 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Sep 5 06:03:20.772744 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Sep 5 06:03:20.772753 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Sep 5 06:03:20.775886 systemd[1]: Started update-engine.service - Update Engine.
Sep 5 06:03:20.776666 update_engine[1628]: I20250905 06:03:20.776098  1628 update_check_scheduler.cc:74] Next update check in 7m2s
Sep 5 06:03:20.796833 systemd-logind[1625]: Watching system buttons on /dev/input/event2 (Power Button)
Sep 5 06:03:20.796848 systemd-logind[1625]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Sep 5 06:03:20.798458 systemd-logind[1625]: New seat seat0.
Sep 5 06:03:20.805711 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Sep 5 06:03:20.806063 systemd[1]: Started systemd-logind.service - User Login Management.
Sep 5 06:03:20.806803 systemd[1]: Started vmtoolsd.service - Service for virtual machines hosted on VMware.
Sep 5 06:03:20.854350 unknown[1660]: Pref_Init: Using '/etc/vmware-tools/vgauth.conf' as preferences filepath
Sep 5 06:03:20.857941 unknown[1660]: Core dump limit set to -1
Sep 5 06:03:20.865404 bash[1682]: Updated "/home/core/.ssh/authorized_keys"
Sep 5 06:03:20.867514 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Sep 5 06:03:20.868251 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Sep 5 06:04:52.820342 systemd-timesyncd[1575]: Contacted time server 139.94.144.123:123 (0.flatcar.pool.ntp.org).
Sep 5 06:04:52.820383 systemd-timesyncd[1575]: Initial clock synchronization to Fri 2025-09-05 06:04:52.820225 UTC.
Sep 5 06:04:52.820464 systemd-resolved[1554]: Clock change detected. Flushing caches.
Sep 5 06:04:52.901284 locksmithd[1668]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Sep 5 06:04:52.946782 sshd_keygen[1650]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Sep 5 06:04:52.958603 containerd[1653]: time="2025-09-05T06:04:52Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Sep 5 06:04:52.960083 containerd[1653]: time="2025-09-05T06:04:52.960058080Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5
Sep 5 06:04:52.973391 containerd[1653]: time="2025-09-05T06:04:52.973356360Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="7.213µs"
Sep 5 06:04:52.973391 containerd[1653]: time="2025-09-05T06:04:52.973384314Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Sep 5 06:04:52.973391 containerd[1653]: time="2025-09-05T06:04:52.973399610Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Sep 5 06:04:52.973630 containerd[1653]: time="2025-09-05T06:04:52.973508544Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Sep 5 06:04:52.973630 containerd[1653]: time="2025-09-05T06:04:52.973521797Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Sep 5 06:04:52.973630 containerd[1653]: time="2025-09-05T06:04:52.973536882Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 5 06:04:52.973630 containerd[1653]: time="2025-09-05T06:04:52.973572948Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 5 06:04:52.973630 containerd[1653]: time="2025-09-05T06:04:52.973580942Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 5 06:04:52.973771 containerd[1653]: time="2025-09-05T06:04:52.973728098Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 5 06:04:52.973771 containerd[1653]: time="2025-09-05T06:04:52.973738075Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 5 06:04:52.973771 containerd[1653]: time="2025-09-05T06:04:52.973744540Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 5 06:04:52.973771 containerd[1653]: time="2025-09-05T06:04:52.973749158Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Sep 5 06:04:52.973833 containerd[1653]: time="2025-09-05T06:04:52.973787739Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Sep 5 06:04:52.973927 containerd[1653]: time="2025-09-05T06:04:52.973915095Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 5 06:04:52.973949 containerd[1653]: time="2025-09-05T06:04:52.973933758Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 5 06:04:52.973949 containerd[1653]: time="2025-09-05T06:04:52.973939803Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Sep 5 06:04:52.973983 containerd[1653]: time="2025-09-05T06:04:52.973958539Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Sep 5 06:04:52.974090 containerd[1653]: time="2025-09-05T06:04:52.974080560Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Sep 5 06:04:52.974126 containerd[1653]: time="2025-09-05T06:04:52.974113656Z" level=info msg="metadata content store policy set" policy=shared
Sep 5 06:04:52.977359 containerd[1653]: time="2025-09-05T06:04:52.977289743Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Sep 5 06:04:52.977359 containerd[1653]: time="2025-09-05T06:04:52.977338425Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Sep 5 06:04:52.977359 containerd[1653]: time="2025-09-05T06:04:52.977350413Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Sep 5 06:04:52.977359 containerd[1653]: time="2025-09-05T06:04:52.977357945Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Sep 5 06:04:52.977359 containerd[1653]: time="2025-09-05T06:04:52.977365154Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Sep 5 06:04:52.977495 containerd[1653]: time="2025-09-05T06:04:52.977370972Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Sep 5 06:04:52.977495 containerd[1653]: time="2025-09-05T06:04:52.977379182Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Sep 5 06:04:52.977495 containerd[1653]: time="2025-09-05T06:04:52.977407529Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Sep 5 06:04:52.977495 containerd[1653]: time="2025-09-05T06:04:52.977420479Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Sep 5 06:04:52.977495 containerd[1653]: time="2025-09-05T06:04:52.977427208Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Sep 5 06:04:52.977495 containerd[1653]: time="2025-09-05T06:04:52.977432323Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Sep 5 06:04:52.977495 containerd[1653]: time="2025-09-05T06:04:52.977439490Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Sep 5 06:04:52.977583 containerd[1653]: time="2025-09-05T06:04:52.977522848Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Sep 5 06:04:52.977583 containerd[1653]: time="2025-09-05T06:04:52.977535024Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Sep 5 06:04:52.977583 containerd[1653]: time="2025-09-05T06:04:52.977544338Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Sep 5 06:04:52.977583 containerd[1653]: time="2025-09-05T06:04:52.977551343Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Sep 5 06:04:52.977583 containerd[1653]: time="2025-09-05T06:04:52.977557001Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Sep 5 06:04:52.977583 containerd[1653]: time="2025-09-05T06:04:52.977562725Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Sep 5 06:04:52.977583 containerd[1653]: time="2025-09-05T06:04:52.977568760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Sep 5 06:04:52.977583 containerd[1653]: time="2025-09-05T06:04:52.977574936Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Sep 5 06:04:52.977583 containerd[1653]: time="2025-09-05T06:04:52.977581035Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Sep 5 06:04:52.977696 containerd[1653]: time="2025-09-05T06:04:52.977586622Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Sep 5 06:04:52.977696 containerd[1653]: time="2025-09-05T06:04:52.977593582Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Sep 5 06:04:52.977696 containerd[1653]: time="2025-09-05T06:04:52.977644905Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Sep 5 06:04:52.977696 containerd[1653]: time="2025-09-05T06:04:52.977653913Z" level=info msg="Start snapshots syncer"
Sep 5 06:04:52.977696 containerd[1653]: time="2025-09-05T06:04:52.977674548Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Sep 5 06:04:52.978154 containerd[1653]: time="2025-09-05T06:04:52.977872546Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Sep 5 06:04:52.978154 containerd[1653]: time="2025-09-05T06:04:52.977909136Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Sep 5 06:04:52.978282 containerd[1653]: time="2025-09-05T06:04:52.977964433Z" level=info
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 5 06:04:52.978282 containerd[1653]: time="2025-09-05T06:04:52.978048331Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 5 06:04:52.978282 containerd[1653]: time="2025-09-05T06:04:52.978061366Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 5 06:04:52.978282 containerd[1653]: time="2025-09-05T06:04:52.978067670Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 5 06:04:52.978282 containerd[1653]: time="2025-09-05T06:04:52.978074583Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 5 06:04:52.978282 containerd[1653]: time="2025-09-05T06:04:52.978082759Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 5 06:04:52.978282 containerd[1653]: time="2025-09-05T06:04:52.978088711Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 5 06:04:52.978282 containerd[1653]: time="2025-09-05T06:04:52.978094477Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 5 06:04:52.978282 containerd[1653]: time="2025-09-05T06:04:52.978109652Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 5 06:04:52.978282 containerd[1653]: time="2025-09-05T06:04:52.978116068Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 5 06:04:52.978282 containerd[1653]: time="2025-09-05T06:04:52.978126262Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 5 06:04:52.978968 containerd[1653]: time="2025-09-05T06:04:52.978495668Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 5 06:04:52.978968 containerd[1653]: time="2025-09-05T06:04:52.978513154Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 5 06:04:52.978968 containerd[1653]: time="2025-09-05T06:04:52.978518938Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 5 06:04:52.978968 containerd[1653]: time="2025-09-05T06:04:52.978524837Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 5 06:04:52.978968 containerd[1653]: time="2025-09-05T06:04:52.978529746Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 5 06:04:52.978968 containerd[1653]: time="2025-09-05T06:04:52.978535880Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 5 06:04:52.978968 containerd[1653]: time="2025-09-05T06:04:52.978580995Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 5 06:04:52.978968 containerd[1653]: time="2025-09-05T06:04:52.978592647Z" level=info msg="runtime interface created" Sep 5 06:04:52.978968 containerd[1653]: time="2025-09-05T06:04:52.978595737Z" level=info msg="created NRI interface" Sep 5 06:04:52.978968 containerd[1653]: time="2025-09-05T06:04:52.978601172Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 5 06:04:52.978968 containerd[1653]: time="2025-09-05T06:04:52.978608534Z" level=info msg="Connect containerd service" Sep 5 06:04:52.978968 containerd[1653]: time="2025-09-05T06:04:52.978624578Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 5 06:04:52.979133 containerd[1653]: 
time="2025-09-05T06:04:52.979031781Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 5 06:04:52.986661 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 5 06:04:52.989721 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 5 06:04:53.029908 systemd[1]: issuegen.service: Deactivated successfully. Sep 5 06:04:53.030058 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 5 06:04:53.032390 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 5 06:04:53.059722 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 5 06:04:53.062476 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 5 06:04:53.064152 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Sep 5 06:04:53.064463 systemd[1]: Reached target getty.target - Login Prompts. Sep 5 06:04:53.102334 containerd[1653]: time="2025-09-05T06:04:53.102307447Z" level=info msg="Start subscribing containerd event" Sep 5 06:04:53.102641 containerd[1653]: time="2025-09-05T06:04:53.102469921Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 5 06:04:53.102730 containerd[1653]: time="2025-09-05T06:04:53.102721438Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Sep 5 06:04:53.102901 containerd[1653]: time="2025-09-05T06:04:53.102872965Z" level=info msg="Start recovering state" Sep 5 06:04:53.102949 containerd[1653]: time="2025-09-05T06:04:53.102940647Z" level=info msg="Start event monitor" Sep 5 06:04:53.102973 containerd[1653]: time="2025-09-05T06:04:53.102951292Z" level=info msg="Start cni network conf syncer for default" Sep 5 06:04:53.102973 containerd[1653]: time="2025-09-05T06:04:53.102957340Z" level=info msg="Start streaming server" Sep 5 06:04:53.102973 containerd[1653]: time="2025-09-05T06:04:53.102961954Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 5 06:04:53.102973 containerd[1653]: time="2025-09-05T06:04:53.102966018Z" level=info msg="runtime interface starting up..." Sep 5 06:04:53.102973 containerd[1653]: time="2025-09-05T06:04:53.102968952Z" level=info msg="starting plugins..." Sep 5 06:04:53.103047 containerd[1653]: time="2025-09-05T06:04:53.102976461Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 5 06:04:53.103102 systemd[1]: Started containerd.service - containerd container runtime. Sep 5 06:04:53.103258 containerd[1653]: time="2025-09-05T06:04:53.103248323Z" level=info msg="containerd successfully booted in 0.144948s" Sep 5 06:04:53.105663 tar[1638]: linux-amd64/README.md Sep 5 06:04:53.132212 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 5 06:04:53.161273 systemd-networkd[1316]: ens192: Gained IPv6LL Sep 5 06:04:53.163036 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 5 06:04:53.163653 systemd[1]: Reached target network-online.target - Network is Online. Sep 5 06:04:53.164947 systemd[1]: Starting coreos-metadata.service - VMware metadata agent... Sep 5 06:04:53.168130 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 5 06:04:53.172244 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... 
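The containerd records above are logfmt-style `key=value` pairs, with double-quoted values for fields containing spaces or escapes. A minimal sketch for pulling fields out of such lines (this is an illustrative parser, not containerd's own):

```python
import re

# One pair: key=value, where value is either a double-quoted string
# (allowing backslash escapes such as \") or a bare token.
_PAIR = re.compile(r'(\w+)=("(?:[^"\\]|\\.)*"|\S+)')

def parse_logfmt(line: str) -> dict:
    """Parse a logfmt-style record into a dict of string fields."""
    fields = {}
    for key, raw in _PAIR.findall(line):
        if raw.startswith('"') and raw.endswith('"'):
            # Strip the surrounding quotes and unescape embedded quotes.
            raw = raw[1:-1].replace('\\"', '"')
        fields[key] = raw
    return fields
```

For example, `parse_logfmt(...)["msg"]` on the final boot record above yields `containerd successfully booted in 0.144948s`.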
Sep 5 06:04:53.194203 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 5 06:04:53.208205 systemd[1]: coreos-metadata.service: Deactivated successfully. Sep 5 06:04:53.208390 systemd[1]: Finished coreos-metadata.service - VMware metadata agent. Sep 5 06:04:53.208740 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 5 06:04:54.175559 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 5 06:04:54.175958 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 5 06:04:54.176812 systemd[1]: Startup finished in 2.876s (kernel) + 6.655s (initrd) + 6.073s (userspace) = 15.605s. Sep 5 06:04:54.180889 (kubelet)[1810]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 5 06:04:54.207678 login[1769]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 5 06:04:54.207862 login[1768]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 5 06:04:54.216937 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 5 06:04:54.218662 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 5 06:04:54.220950 systemd-logind[1625]: New session 1 of user core. Sep 5 06:04:54.223778 systemd-logind[1625]: New session 2 of user core. Sep 5 06:04:54.234502 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 5 06:04:54.236328 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 5 06:04:54.243652 (systemd)[1817]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 5 06:04:54.245079 systemd-logind[1625]: New session c1 of user core. Sep 5 06:04:54.333069 systemd[1817]: Queued start job for default target default.target. 
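The `failed to load cni during init` error logged during containerd startup is expected at this stage: the dumped CRI config points `confDir` at `/etc/cni/net.d`, which holds no network config until a CNI plugin is installed (on kubeadm-style clusters, typically by a pod-network add-on after `kubeadm init`). For illustration only, a conflist of roughly this shape is what containerd looks for there (the name and subnet below are hypothetical):

```json
{
  "cniVersion": "1.0.0",
  "name": "example-net",
  "plugins": [
    {
      "type": "bridge",
      "bridge": "cni0",
      "isGateway": true,
      "ipMasq": true,
      "ipam": { "type": "host-local", "subnet": "10.244.0.0/24" }
    }
  ]
}
```

Once such a file appears, the "cni network conf syncer" started above picks it up without a containerd restart.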
Sep 5 06:04:54.338950 systemd[1817]: Created slice app.slice - User Application Slice. Sep 5 06:04:54.339062 systemd[1817]: Reached target paths.target - Paths. Sep 5 06:04:54.339133 systemd[1817]: Reached target timers.target - Timers. Sep 5 06:04:54.339878 systemd[1817]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 5 06:04:54.346880 systemd[1817]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 5 06:04:54.346976 systemd[1817]: Reached target sockets.target - Sockets. Sep 5 06:04:54.347041 systemd[1817]: Reached target basic.target - Basic System. Sep 5 06:04:54.347608 systemd[1817]: Reached target default.target - Main User Target. Sep 5 06:04:54.347623 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 5 06:04:54.347702 systemd[1817]: Startup finished in 98ms. Sep 5 06:04:54.349349 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 5 06:04:54.350359 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 5 06:04:54.656494 kubelet[1810]: E0905 06:04:54.656451 1810 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 5 06:04:54.658115 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 5 06:04:54.658214 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 5 06:04:54.658527 systemd[1]: kubelet.service: Consumed 634ms CPU time, 264.8M memory peak. Sep 5 06:05:04.829783 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 5 06:05:04.831867 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 5 06:05:05.278984 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
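The kubelet exit above is the standard pre-bootstrap failure: `/var/lib/kubelet/config.yaml` does not exist yet because `kubeadm init`/`kubeadm join` has not run on this node. That file is normally generated by kubeadm, not written by hand; a minimal illustrative sketch of its shape (real clusters carry many more fields):

```yaml
# /var/lib/kubelet/config.yaml — normally generated by kubeadm.
# Illustrative fragment only.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
cgroupDriver: systemd        # consistent with SystemdCgroup:true in the containerd CRI config above
staticPodPath: /etc/kubernetes/manifests
```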
Sep 5 06:05:05.288444 (kubelet)[1864]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 5 06:05:05.374503 kubelet[1864]: E0905 06:05:05.374466 1864 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 5 06:05:05.377081 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 5 06:05:05.377264 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 5 06:05:05.377633 systemd[1]: kubelet.service: Consumed 104ms CPU time, 108.5M memory peak. Sep 5 06:05:15.579725 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 5 06:05:15.581457 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 5 06:05:15.783816 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 5 06:05:15.786274 (kubelet)[1879]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 5 06:05:15.819942 kubelet[1879]: E0905 06:05:15.819904 1879 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 5 06:05:15.821315 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 5 06:05:15.821452 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 5 06:05:15.821810 systemd[1]: kubelet.service: Consumed 98ms CPU time, 108.2M memory peak. 
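The "Scheduled restart job, restart counter is at N" lines come from systemd's `Restart=` handling re-queuing kubelet.service after each failed exit; the roughly ten-second gap between attempts in the log matches a `RestartSec=10` setting. A hypothetical drop-in fragment of the kind that produces this behavior (not the actual Flatcar/kubeadm unit file):

```ini
# Illustrative [Service] drop-in; values inferred from the ~10 s restart cadence above.
[Service]
Restart=always
RestartSec=10
```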
Sep 5 06:05:22.841774 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 5 06:05:22.844044 systemd[1]: Started sshd@0-139.178.70.103:22-139.178.89.65:42642.service - OpenSSH per-connection server daemon (139.178.89.65:42642). Sep 5 06:05:22.903422 sshd[1887]: Accepted publickey for core from 139.178.89.65 port 42642 ssh2: RSA SHA256:lOVrTMusnG1HPVwly0GqcQwnv433awSMb53jBwYWH18 Sep 5 06:05:22.903691 sshd-session[1887]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:05:22.906363 systemd-logind[1625]: New session 3 of user core. Sep 5 06:05:22.914263 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 5 06:05:22.969331 systemd[1]: Started sshd@1-139.178.70.103:22-139.178.89.65:42644.service - OpenSSH per-connection server daemon (139.178.89.65:42644). Sep 5 06:05:23.010356 sshd[1893]: Accepted publickey for core from 139.178.89.65 port 42644 ssh2: RSA SHA256:lOVrTMusnG1HPVwly0GqcQwnv433awSMb53jBwYWH18 Sep 5 06:05:23.010805 sshd-session[1893]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:05:23.013361 systemd-logind[1625]: New session 4 of user core. Sep 5 06:05:23.027604 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 5 06:05:23.075131 sshd[1896]: Connection closed by 139.178.89.65 port 42644 Sep 5 06:05:23.076003 sshd-session[1893]: pam_unix(sshd:session): session closed for user core Sep 5 06:05:23.080721 systemd[1]: sshd@1-139.178.70.103:22-139.178.89.65:42644.service: Deactivated successfully. Sep 5 06:05:23.081721 systemd[1]: session-4.scope: Deactivated successfully. Sep 5 06:05:23.082332 systemd-logind[1625]: Session 4 logged out. Waiting for processes to exit. Sep 5 06:05:23.083972 systemd[1]: Started sshd@2-139.178.70.103:22-139.178.89.65:42648.service - OpenSSH per-connection server daemon (139.178.89.65:42648). Sep 5 06:05:23.084616 systemd-logind[1625]: Removed session 4. 
Sep 5 06:05:23.122066 sshd[1902]: Accepted publickey for core from 139.178.89.65 port 42648 ssh2: RSA SHA256:lOVrTMusnG1HPVwly0GqcQwnv433awSMb53jBwYWH18 Sep 5 06:05:23.122805 sshd-session[1902]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:05:23.126261 systemd-logind[1625]: New session 5 of user core. Sep 5 06:05:23.132272 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 5 06:05:23.177579 sshd[1905]: Connection closed by 139.178.89.65 port 42648 Sep 5 06:05:23.178515 sshd-session[1902]: pam_unix(sshd:session): session closed for user core Sep 5 06:05:23.189857 systemd[1]: sshd@2-139.178.70.103:22-139.178.89.65:42648.service: Deactivated successfully. Sep 5 06:05:23.190965 systemd[1]: session-5.scope: Deactivated successfully. Sep 5 06:05:23.191919 systemd-logind[1625]: Session 5 logged out. Waiting for processes to exit. Sep 5 06:05:23.193088 systemd[1]: Started sshd@3-139.178.70.103:22-139.178.89.65:42664.service - OpenSSH per-connection server daemon (139.178.89.65:42664). Sep 5 06:05:23.194312 systemd-logind[1625]: Removed session 5. Sep 5 06:05:23.238258 sshd[1911]: Accepted publickey for core from 139.178.89.65 port 42664 ssh2: RSA SHA256:lOVrTMusnG1HPVwly0GqcQwnv433awSMb53jBwYWH18 Sep 5 06:05:23.239160 sshd-session[1911]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:05:23.243053 systemd-logind[1625]: New session 6 of user core. Sep 5 06:05:23.254344 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 5 06:05:23.303272 sshd[1914]: Connection closed by 139.178.89.65 port 42664 Sep 5 06:05:23.303616 sshd-session[1911]: pam_unix(sshd:session): session closed for user core Sep 5 06:05:23.313772 systemd[1]: sshd@3-139.178.70.103:22-139.178.89.65:42664.service: Deactivated successfully. Sep 5 06:05:23.314841 systemd[1]: session-6.scope: Deactivated successfully. Sep 5 06:05:23.315679 systemd-logind[1625]: Session 6 logged out. 
Waiting for processes to exit. Sep 5 06:05:23.317452 systemd[1]: Started sshd@4-139.178.70.103:22-139.178.89.65:42672.service - OpenSSH per-connection server daemon (139.178.89.65:42672). Sep 5 06:05:23.318342 systemd-logind[1625]: Removed session 6. Sep 5 06:05:23.372507 sshd[1920]: Accepted publickey for core from 139.178.89.65 port 42672 ssh2: RSA SHA256:lOVrTMusnG1HPVwly0GqcQwnv433awSMb53jBwYWH18 Sep 5 06:05:23.373772 sshd-session[1920]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:05:23.377779 systemd-logind[1625]: New session 7 of user core. Sep 5 06:05:23.387274 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 5 06:05:23.446055 sudo[1924]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 5 06:05:23.446515 sudo[1924]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 5 06:05:23.456633 sudo[1924]: pam_unix(sudo:session): session closed for user root Sep 5 06:05:23.457485 sshd[1923]: Connection closed by 139.178.89.65 port 42672 Sep 5 06:05:23.458297 sshd-session[1920]: pam_unix(sshd:session): session closed for user core Sep 5 06:05:23.464705 systemd[1]: sshd@4-139.178.70.103:22-139.178.89.65:42672.service: Deactivated successfully. Sep 5 06:05:23.465786 systemd[1]: session-7.scope: Deactivated successfully. Sep 5 06:05:23.466445 systemd-logind[1625]: Session 7 logged out. Waiting for processes to exit. Sep 5 06:05:23.468432 systemd[1]: Started sshd@5-139.178.70.103:22-139.178.89.65:42684.service - OpenSSH per-connection server daemon (139.178.89.65:42684). Sep 5 06:05:23.469239 systemd-logind[1625]: Removed session 7. 
Sep 5 06:05:23.514319 sshd[1930]: Accepted publickey for core from 139.178.89.65 port 42684 ssh2: RSA SHA256:lOVrTMusnG1HPVwly0GqcQwnv433awSMb53jBwYWH18 Sep 5 06:05:23.515285 sshd-session[1930]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:05:23.518593 systemd-logind[1625]: New session 8 of user core. Sep 5 06:05:23.523259 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 5 06:05:23.572652 sudo[1935]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 5 06:05:23.572854 sudo[1935]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 5 06:05:23.575895 sudo[1935]: pam_unix(sudo:session): session closed for user root Sep 5 06:05:23.580033 sudo[1934]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 5 06:05:23.580398 sudo[1934]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 5 06:05:23.588515 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 5 06:05:23.617223 augenrules[1957]: No rules Sep 5 06:05:23.618109 systemd[1]: audit-rules.service: Deactivated successfully. Sep 5 06:05:23.618314 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 5 06:05:23.619129 sudo[1934]: pam_unix(sudo:session): session closed for user root Sep 5 06:05:23.621593 sshd[1933]: Connection closed by 139.178.89.65 port 42684 Sep 5 06:05:23.620834 sshd-session[1930]: pam_unix(sshd:session): session closed for user core Sep 5 06:05:23.629242 systemd[1]: sshd@5-139.178.70.103:22-139.178.89.65:42684.service: Deactivated successfully. Sep 5 06:05:23.630210 systemd[1]: session-8.scope: Deactivated successfully. Sep 5 06:05:23.630659 systemd-logind[1625]: Session 8 logged out. Waiting for processes to exit. 
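The `augenrules[...]: No rules` result above follows directly from the preceding sudo commands: `augenrules` concatenates `/etc/audit/rules.d/*.rules`, and both rules files were just removed, so the merged rule set is empty. For illustration, a hypothetical rules file of the kind that was deleted (contents assumed, not recovered from this system):

```
# Hypothetical /etc/audit/rules.d/99-default.rules; with such files removed,
# augenrules reports "No rules" as seen above.
-D                      # flush existing rules
-b 8192                 # kernel audit buffer size
-w /etc/passwd -p wa -k identity
```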
Sep 5 06:05:23.632205 systemd[1]: Started sshd@6-139.178.70.103:22-139.178.89.65:42696.service - OpenSSH per-connection server daemon (139.178.89.65:42696). Sep 5 06:05:23.633776 systemd-logind[1625]: Removed session 8. Sep 5 06:05:23.667694 sshd[1967]: Accepted publickey for core from 139.178.89.65 port 42696 ssh2: RSA SHA256:lOVrTMusnG1HPVwly0GqcQwnv433awSMb53jBwYWH18 Sep 5 06:05:23.668437 sshd-session[1967]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:05:23.671015 systemd-logind[1625]: New session 9 of user core. Sep 5 06:05:23.685331 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 5 06:05:23.732975 sudo[1971]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 5 06:05:23.733126 sudo[1971]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 5 06:05:24.123954 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 5 06:05:24.133536 (dockerd)[1988]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 5 06:05:24.373793 dockerd[1988]: time="2025-09-05T06:05:24.373757524Z" level=info msg="Starting up" Sep 5 06:05:24.374443 dockerd[1988]: time="2025-09-05T06:05:24.374279940Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 5 06:05:24.383155 dockerd[1988]: time="2025-09-05T06:05:24.383109910Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Sep 5 06:05:24.394053 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport1779334817-merged.mount: Deactivated successfully. Sep 5 06:05:24.405456 dockerd[1988]: time="2025-09-05T06:05:24.405433086Z" level=info msg="Loading containers: start." 
Sep 5 06:05:24.415248 kernel: Initializing XFRM netlink socket Sep 5 06:05:24.584123 systemd-networkd[1316]: docker0: Link UP Sep 5 06:05:24.587152 dockerd[1988]: time="2025-09-05T06:05:24.587130025Z" level=info msg="Loading containers: done." Sep 5 06:05:24.595253 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3889220772-merged.mount: Deactivated successfully. Sep 5 06:05:24.597749 dockerd[1988]: time="2025-09-05T06:05:24.597545453Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 5 06:05:24.597749 dockerd[1988]: time="2025-09-05T06:05:24.597601640Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Sep 5 06:05:24.597749 dockerd[1988]: time="2025-09-05T06:05:24.597648865Z" level=info msg="Initializing buildkit" Sep 5 06:05:24.610094 dockerd[1988]: time="2025-09-05T06:05:24.610073694Z" level=info msg="Completed buildkit initialization" Sep 5 06:05:24.614700 dockerd[1988]: time="2025-09-05T06:05:24.614674522Z" level=info msg="Daemon has completed initialization" Sep 5 06:05:24.615053 dockerd[1988]: time="2025-09-05T06:05:24.614798886Z" level=info msg="API listen on /run/docker.sock" Sep 5 06:05:24.614852 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 5 06:05:25.493459 containerd[1653]: time="2025-09-05T06:05:25.493237253Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.8\"" Sep 5 06:05:25.829719 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Sep 5 06:05:25.830973 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 5 06:05:26.159966 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
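The image-pull records in this stretch of the log report a size in bytes and an elapsed time (e.g. 28797487 bytes in 2.482999019s for kube-apiserver v1.32.8). A trivial helper to turn those two figures into average throughput; the inputs below are read off this log, and the helper name is my own:

```python
def pull_rate_mib_s(size_bytes: int, seconds: float) -> float:
    """Average pull throughput in MiB/s for one image pull."""
    return size_bytes / seconds / (1024 * 1024)

# Figures from the kube-apiserver:v1.32.8 pull recorded in this log:
rate = pull_rate_mib_s(28797487, 2.482999019)
```

This puts the kube-apiserver pull at roughly 11 MiB/s, which is a quick sanity check that the registry path is not the bottleneck in this boot.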
Sep 5 06:05:26.167476 (kubelet)[2207]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 5 06:05:26.193480 kubelet[2207]: E0905 06:05:26.193455 2207 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 5 06:05:26.194843 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 5 06:05:26.194930 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 5 06:05:26.195233 systemd[1]: kubelet.service: Consumed 107ms CPU time, 107.6M memory peak. Sep 5 06:05:26.565897 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount662384839.mount: Deactivated successfully. Sep 5 06:05:27.972473 containerd[1653]: time="2025-09-05T06:05:27.972447066Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:05:27.973048 containerd[1653]: time="2025-09-05T06:05:27.973031616Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.8: active requests=0, bytes read=28800687" Sep 5 06:05:27.973578 containerd[1653]: time="2025-09-05T06:05:27.973536153Z" level=info msg="ImageCreate event name:\"sha256:0d4edaa48e2f940c934e0f7cfd5209fc85e65ab5e842b980f41263d1764661f1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:05:27.975345 containerd[1653]: time="2025-09-05T06:05:27.975332500Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:6e1a2f9b24f69ee77d0c0edaf32b31fdbb5e1a613f4476272197e6e1e239050b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:05:27.976495 containerd[1653]: time="2025-09-05T06:05:27.976262172Z" level=info 
msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.8\" with image id \"sha256:0d4edaa48e2f940c934e0f7cfd5209fc85e65ab5e842b980f41263d1764661f1\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.8\", repo digest \"registry.k8s.io/kube-apiserver@sha256:6e1a2f9b24f69ee77d0c0edaf32b31fdbb5e1a613f4476272197e6e1e239050b\", size \"28797487\" in 2.482999019s"
Sep 5 06:05:27.976495 containerd[1653]: time="2025-09-05T06:05:27.976281359Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.8\" returns image reference \"sha256:0d4edaa48e2f940c934e0f7cfd5209fc85e65ab5e842b980f41263d1764661f1\""
Sep 5 06:05:27.976615 containerd[1653]: time="2025-09-05T06:05:27.976597781Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.8\""
Sep 5 06:05:29.141773 containerd[1653]: time="2025-09-05T06:05:29.141659044Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 06:05:29.149084 containerd[1653]: time="2025-09-05T06:05:29.149064261Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.8: active requests=0, bytes read=24784128"
Sep 5 06:05:29.163882 containerd[1653]: time="2025-09-05T06:05:29.163853504Z" level=info msg="ImageCreate event name:\"sha256:b248d0b0c74ad8230e0bae0cbed477560e8a1e8c7ef5f29b7e75c1f273c8a091\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 06:05:29.182182 containerd[1653]: time="2025-09-05T06:05:29.181943454Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:8788ccd28ceed9e2e5f8fc31375ef5771df8ea6e518b362c9a06f3cc709cd6c7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 06:05:29.182761 containerd[1653]: time="2025-09-05T06:05:29.182744110Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.8\" with image id \"sha256:b248d0b0c74ad8230e0bae0cbed477560e8a1e8c7ef5f29b7e75c1f273c8a091\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.8\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:8788ccd28ceed9e2e5f8fc31375ef5771df8ea6e518b362c9a06f3cc709cd6c7\", size \"26387322\" in 1.206126942s"
Sep 5 06:05:29.182825 containerd[1653]: time="2025-09-05T06:05:29.182814241Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.8\" returns image reference \"sha256:b248d0b0c74ad8230e0bae0cbed477560e8a1e8c7ef5f29b7e75c1f273c8a091\""
Sep 5 06:05:29.183362 containerd[1653]: time="2025-09-05T06:05:29.183340730Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.8\""
Sep 5 06:05:30.998475 containerd[1653]: time="2025-09-05T06:05:30.998438096Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 06:05:30.999207 containerd[1653]: time="2025-09-05T06:05:30.999189411Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.8: active requests=0, bytes read=19175036"
Sep 5 06:05:30.999568 containerd[1653]: time="2025-09-05T06:05:30.999553114Z" level=info msg="ImageCreate event name:\"sha256:2ac266f06c9a5a3d0d20ae482dbccb54d3be454d5ca49f48b528bdf5bae3e908\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 06:05:31.001462 containerd[1653]: time="2025-09-05T06:05:31.001443858Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:43c58bcbd1c7812dd19f8bfa5ae11093ebefd28699453ce86fc710869e155cd4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 06:05:31.002424 containerd[1653]: time="2025-09-05T06:05:31.002389979Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.8\" with image id \"sha256:2ac266f06c9a5a3d0d20ae482dbccb54d3be454d5ca49f48b528bdf5bae3e908\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.8\", repo digest \"registry.k8s.io/kube-scheduler@sha256:43c58bcbd1c7812dd19f8bfa5ae11093ebefd28699453ce86fc710869e155cd4\", size \"20778248\" in 1.819028448s"
Sep 5 06:05:31.002451 containerd[1653]: time="2025-09-05T06:05:31.002424434Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.8\" returns image reference \"sha256:2ac266f06c9a5a3d0d20ae482dbccb54d3be454d5ca49f48b528bdf5bae3e908\""
Sep 5 06:05:31.004971 containerd[1653]: time="2025-09-05T06:05:31.004941397Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.8\""
Sep 5 06:05:32.003960 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2226728462.mount: Deactivated successfully.
Sep 5 06:05:32.357745 containerd[1653]: time="2025-09-05T06:05:32.357716625Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 06:05:32.361817 containerd[1653]: time="2025-09-05T06:05:32.361784550Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.8: active requests=0, bytes read=30897170"
Sep 5 06:05:32.366131 containerd[1653]: time="2025-09-05T06:05:32.366102802Z" level=info msg="ImageCreate event name:\"sha256:d7b94972d43c5d6ce8088a8bcd08614a5ecf2bf04166232c688adcd0b8ed4b12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 06:05:32.374196 containerd[1653]: time="2025-09-05T06:05:32.373970706Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:adc1335b480ddd833aac3b0bd20f68ff0f3c3cf7a0bd337933b006d9f5cec40a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 06:05:32.374353 containerd[1653]: time="2025-09-05T06:05:32.374339533Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.8\" with image id \"sha256:d7b94972d43c5d6ce8088a8bcd08614a5ecf2bf04166232c688adcd0b8ed4b12\", repo tag \"registry.k8s.io/kube-proxy:v1.32.8\", repo digest \"registry.k8s.io/kube-proxy@sha256:adc1335b480ddd833aac3b0bd20f68ff0f3c3cf7a0bd337933b006d9f5cec40a\", size \"30896189\" in 1.369235162s"
Sep 5 06:05:32.374421 containerd[1653]: time="2025-09-05T06:05:32.374408295Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.8\" returns image reference \"sha256:d7b94972d43c5d6ce8088a8bcd08614a5ecf2bf04166232c688adcd0b8ed4b12\""
Sep 5 06:05:32.374829 containerd[1653]: time="2025-09-05T06:05:32.374806715Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Sep 5 06:05:32.946249 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4252607096.mount: Deactivated successfully.
Sep 5 06:05:34.256589 containerd[1653]: time="2025-09-05T06:05:34.256553423Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 06:05:34.261510 containerd[1653]: time="2025-09-05T06:05:34.261495110Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241"
Sep 5 06:05:34.267146 containerd[1653]: time="2025-09-05T06:05:34.267128347Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 06:05:34.275321 containerd[1653]: time="2025-09-05T06:05:34.275290051Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 06:05:34.276056 containerd[1653]: time="2025-09-05T06:05:34.276036792Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.901153084s"
Sep 5 06:05:34.276126 containerd[1653]: time="2025-09-05T06:05:34.276115280Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\""
Sep 5 06:05:34.276723 containerd[1653]: time="2025-09-05T06:05:34.276451955Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Sep 5 06:05:34.868929 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4024997380.mount: Deactivated successfully.
Sep 5 06:05:34.903187 containerd[1653]: time="2025-09-05T06:05:34.903068688Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 5 06:05:34.910296 containerd[1653]: time="2025-09-05T06:05:34.910263046Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138"
Sep 5 06:05:34.915410 containerd[1653]: time="2025-09-05T06:05:34.915374055Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 5 06:05:34.926214 containerd[1653]: time="2025-09-05T06:05:34.926184660Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 5 06:05:34.926860 containerd[1653]: time="2025-09-05T06:05:34.926553935Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 650.082845ms"
Sep 5 06:05:34.926860 containerd[1653]: time="2025-09-05T06:05:34.926576486Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
Sep 5 06:05:34.927114 containerd[1653]: time="2025-09-05T06:05:34.927094667Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\""
Sep 5 06:05:35.738616 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2051729901.mount: Deactivated successfully.
Sep 5 06:05:36.329556 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Sep 5 06:05:36.330926 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 5 06:05:36.758387 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 5 06:05:36.761844 (kubelet)[2396]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 5 06:05:36.812461 kubelet[2396]: E0905 06:05:36.812417 2396 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 5 06:05:36.813529 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 5 06:05:36.813610 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 5 06:05:36.814041 systemd[1]: kubelet.service: Consumed 107ms CPU time, 109.8M memory peak.
Sep 5 06:05:37.573620 update_engine[1628]: I20250905 06:05:37.573202 1628 update_attempter.cc:509] Updating boot flags...
Sep 5 06:05:39.807929 containerd[1653]: time="2025-09-05T06:05:39.807870944Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 06:05:39.821390 containerd[1653]: time="2025-09-05T06:05:39.821219135Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57682056"
Sep 5 06:05:39.845969 containerd[1653]: time="2025-09-05T06:05:39.845931773Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 06:05:39.865073 containerd[1653]: time="2025-09-05T06:05:39.865034940Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 06:05:39.865695 containerd[1653]: time="2025-09-05T06:05:39.865604927Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 4.938487665s"
Sep 5 06:05:39.865695 containerd[1653]: time="2025-09-05T06:05:39.865638748Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\""
Sep 5 06:05:42.118053 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 5 06:05:42.118161 systemd[1]: kubelet.service: Consumed 107ms CPU time, 109.8M memory peak.
Sep 5 06:05:42.119667 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 5 06:05:42.137466 systemd[1]: Reload requested from client PID 2460 ('systemctl') (unit session-9.scope)...
Sep 5 06:05:42.137476 systemd[1]: Reloading...
Sep 5 06:05:42.203196 zram_generator::config[2507]: No configuration found.
Sep 5 06:05:42.279302 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Sep 5 06:05:42.346850 systemd[1]: Reloading finished in 209 ms.
Sep 5 06:05:42.376405 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Sep 5 06:05:42.376458 systemd[1]: kubelet.service: Failed with result 'signal'.
Sep 5 06:05:42.376720 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 5 06:05:42.376754 systemd[1]: kubelet.service: Consumed 50ms CPU time, 74.4M memory peak.
Sep 5 06:05:42.378299 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 5 06:05:42.638977 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 5 06:05:42.648421 (kubelet)[2571]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 5 06:05:42.690452 kubelet[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 5 06:05:42.690650 kubelet[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Sep 5 06:05:42.690682 kubelet[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 5 06:05:42.690777 kubelet[2571]: I0905 06:05:42.690760 2571 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 5 06:05:43.024672 kubelet[2571]: I0905 06:05:43.024488 2571 server.go:520] "Kubelet version" kubeletVersion="v1.32.4"
Sep 5 06:05:43.024799 kubelet[2571]: I0905 06:05:43.024792 2571 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 5 06:05:43.024997 kubelet[2571]: I0905 06:05:43.024989 2571 server.go:954] "Client rotation is on, will bootstrap in background"
Sep 5 06:05:43.055874 kubelet[2571]: I0905 06:05:43.055842 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 5 06:05:43.056589 kubelet[2571]: E0905 06:05:43.056553 2571 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://139.178.70.103:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError"
Sep 5 06:05:43.069629 kubelet[2571]: I0905 06:05:43.069610 2571 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 5 06:05:43.073961 kubelet[2571]: I0905 06:05:43.073940 2571 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 5 06:05:43.075619 kubelet[2571]: I0905 06:05:43.075595 2571 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 5 06:05:43.075775 kubelet[2571]: I0905 06:05:43.075669 2571 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 5 06:05:43.077784 kubelet[2571]: I0905 06:05:43.077771 2571 topology_manager.go:138] "Creating topology manager with none policy"
Sep 5 06:05:43.077832 kubelet[2571]: I0905 06:05:43.077827 2571 container_manager_linux.go:304] "Creating device plugin manager"
Sep 5 06:05:43.078806 kubelet[2571]: I0905 06:05:43.078797 2571 state_mem.go:36] "Initialized new in-memory state store"
Sep 5 06:05:43.082582 kubelet[2571]: I0905 06:05:43.082565 2571 kubelet.go:446] "Attempting to sync node with API server"
Sep 5 06:05:43.082758 kubelet[2571]: I0905 06:05:43.082750 2571 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 5 06:05:43.084192 kubelet[2571]: I0905 06:05:43.084183 2571 kubelet.go:352] "Adding apiserver pod source"
Sep 5 06:05:43.084271 kubelet[2571]: I0905 06:05:43.084234 2571 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 5 06:05:43.089028 kubelet[2571]: W0905 06:05:43.088713 2571 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.103:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.103:6443: connect: connection refused
Sep 5 06:05:43.089028 kubelet[2571]: E0905 06:05:43.088759 2571 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.103:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError"
Sep 5 06:05:43.089028 kubelet[2571]: W0905 06:05:43.088955 2571 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.103:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.103:6443: connect: connection refused
Sep 5 06:05:43.089028 kubelet[2571]: E0905 06:05:43.088974 2571 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.103:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError"
Sep 5 06:05:43.090336 kubelet[2571]: I0905 06:05:43.090230 2571 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 5 06:05:43.093265 kubelet[2571]: I0905 06:05:43.092805 2571 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 5 06:05:43.093265 kubelet[2571]: W0905 06:05:43.092851 2571 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 5 06:05:43.097573 kubelet[2571]: I0905 06:05:43.095982 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Sep 5 06:05:43.097573 kubelet[2571]: I0905 06:05:43.096001 2571 server.go:1287] "Started kubelet"
Sep 5 06:05:43.099266 kubelet[2571]: I0905 06:05:43.099248 2571 server.go:169] "Starting to listen" address="0.0.0.0" port=10250
Sep 5 06:05:43.104142 kubelet[2571]: I0905 06:05:43.103783 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 5 06:05:43.104142 kubelet[2571]: I0905 06:05:43.104074 2571 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 5 06:05:43.104860 kubelet[2571]: I0905 06:05:43.104848 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 5 06:05:43.105214 kubelet[2571]: I0905 06:05:43.105202 2571 server.go:479] "Adding debug handlers to kubelet server"
Sep 5 06:05:43.109580 kubelet[2571]: E0905 06:05:43.107393 2571 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.103:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.103:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.18624dcbfa011af8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-09-05 06:05:43.095991032 +0000 UTC m=+0.445354111,LastTimestamp:2025-09-05 06:05:43.095991032 +0000 UTC m=+0.445354111,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
Sep 5 06:05:43.109909 kubelet[2571]: I0905 06:05:43.109894 2571 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 5 06:05:43.112116 kubelet[2571]: E0905 06:05:43.111845 2571 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 5 06:05:43.112116 kubelet[2571]: I0905 06:05:43.111880 2571 volume_manager.go:297] "Starting Kubelet Volume Manager"
Sep 5 06:05:43.112116 kubelet[2571]: I0905 06:05:43.112047 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Sep 5 06:05:43.112116 kubelet[2571]: I0905 06:05:43.112101 2571 reconciler.go:26] "Reconciler: start to sync state"
Sep 5 06:05:43.113194 kubelet[2571]: W0905 06:05:43.112376 2571 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.103:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.103:6443: connect: connection refused
Sep 5 06:05:43.113194 kubelet[2571]: E0905 06:05:43.112405 2571 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.103:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError"
Sep 5 06:05:43.113194 kubelet[2571]: E0905 06:05:43.112545 2571 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.103:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.103:6443: connect: connection refused" interval="200ms"
Sep 5 06:05:43.119912 kubelet[2571]: I0905 06:05:43.119766 2571 factory.go:221] Registration of the containerd container factory successfully
Sep 5 06:05:43.119912 kubelet[2571]: I0905 06:05:43.119785 2571 factory.go:221] Registration of the systemd container factory successfully
Sep 5 06:05:43.119912 kubelet[2571]: I0905 06:05:43.119849 2571 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 5 06:05:43.124584 kubelet[2571]: I0905 06:05:43.124548 2571 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 5 06:05:43.126722 kubelet[2571]: I0905 06:05:43.126698 2571 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 5 06:05:43.126722 kubelet[2571]: I0905 06:05:43.126722 2571 status_manager.go:227] "Starting to sync pod status with apiserver"
Sep 5 06:05:43.126804 kubelet[2571]: I0905 06:05:43.126741 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Sep 5 06:05:43.126804 kubelet[2571]: I0905 06:05:43.126748 2571 kubelet.go:2382] "Starting kubelet main sync loop"
Sep 5 06:05:43.126804 kubelet[2571]: E0905 06:05:43.126775 2571 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 5 06:05:43.131751 kubelet[2571]: W0905 06:05:43.131722 2571 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.103:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.103:6443: connect: connection refused
Sep 5 06:05:43.131816 kubelet[2571]: E0905 06:05:43.131756 2571 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.103:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError"
Sep 5 06:05:43.139180 kubelet[2571]: E0905 06:05:43.139153 2571 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 5 06:05:43.142291 kubelet[2571]: I0905 06:05:43.142275 2571 cpu_manager.go:221] "Starting CPU manager" policy="none"
Sep 5 06:05:43.142291 kubelet[2571]: I0905 06:05:43.142286 2571 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Sep 5 06:05:43.142377 kubelet[2571]: I0905 06:05:43.142296 2571 state_mem.go:36] "Initialized new in-memory state store"
Sep 5 06:05:43.143326 kubelet[2571]: I0905 06:05:43.143314 2571 policy_none.go:49] "None policy: Start"
Sep 5 06:05:43.143326 kubelet[2571]: I0905 06:05:43.143325 2571 memory_manager.go:186] "Starting memorymanager" policy="None"
Sep 5 06:05:43.143379 kubelet[2571]: I0905 06:05:43.143331 2571 state_mem.go:35] "Initializing new in-memory state store"
Sep 5 06:05:43.146775 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Sep 5 06:05:43.153580 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Sep 5 06:05:43.156787 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Sep 5 06:05:43.167963 kubelet[2571]: I0905 06:05:43.167940 2571 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Sep 5 06:05:43.169177 kubelet[2571]: I0905 06:05:43.169159 2571 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 5 06:05:43.169237 kubelet[2571]: I0905 06:05:43.169219 2571 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 5 06:05:43.169876 kubelet[2571]: I0905 06:05:43.169868 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 5 06:05:43.170090 kubelet[2571]: E0905 06:05:43.170048 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Sep 5 06:05:43.170090 kubelet[2571]: E0905 06:05:43.170070 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found"
Sep 5 06:05:43.235446 systemd[1]: Created slice kubepods-burstable-poda9176403b596d0b29ae8ad12d635226d.slice - libcontainer container kubepods-burstable-poda9176403b596d0b29ae8ad12d635226d.slice.
Sep 5 06:05:43.248625 kubelet[2571]: E0905 06:05:43.248608 2571 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Sep 5 06:05:43.250057 systemd[1]: Created slice kubepods-burstable-podc5c9ebd3eae07930df507ee500306d97.slice - libcontainer container kubepods-burstable-podc5c9ebd3eae07930df507ee500306d97.slice.
Sep 5 06:05:43.260855 kubelet[2571]: E0905 06:05:43.260842 2571 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Sep 5 06:05:43.262800 systemd[1]: Created slice kubepods-burstable-poda88c9297c136b0f15880bf567e89a977.slice - libcontainer container kubepods-burstable-poda88c9297c136b0f15880bf567e89a977.slice.
Sep 5 06:05:43.263723 kubelet[2571]: E0905 06:05:43.263712 2571 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Sep 5 06:05:43.270558 kubelet[2571]: I0905 06:05:43.270546 2571 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
Sep 5 06:05:43.270805 kubelet[2571]: E0905 06:05:43.270790 2571 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.103:6443/api/v1/nodes\": dial tcp 139.178.70.103:6443: connect: connection refused" node="localhost"
Sep 5 06:05:43.313298 kubelet[2571]: E0905 06:05:43.313226 2571 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.103:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.103:6443: connect: connection refused" interval="400ms"
Sep 5 06:05:43.313389 kubelet[2571]: I0905 06:05:43.313305 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost"
Sep 5 06:05:43.313389 kubelet[2571]: I0905 06:05:43.313318 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost"
Sep 5 06:05:43.313389 kubelet[2571]: I0905 06:05:43.313330 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost"
Sep 5 06:05:43.313389 kubelet[2571]: I0905 06:05:43.313342 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost"
Sep 5 06:05:43.313389 kubelet[2571]: I0905 06:05:43.313351 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a9176403b596d0b29ae8ad12d635226d-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"a9176403b596d0b29ae8ad12d635226d\") " pod="kube-system/kube-scheduler-localhost"
Sep 5 06:05:43.313505 kubelet[2571]: I0905 06:05:43.313360 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c5c9ebd3eae07930df507ee500306d97-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"c5c9ebd3eae07930df507ee500306d97\") " pod="kube-system/kube-apiserver-localhost"
Sep 5 06:05:43.313505 kubelet[2571]: I0905 06:05:43.313367 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost"
Sep 5 06:05:43.313505 kubelet[2571]: I0905 06:05:43.313375 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c5c9ebd3eae07930df507ee500306d97-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"c5c9ebd3eae07930df507ee500306d97\") " pod="kube-system/kube-apiserver-localhost"
Sep 5 06:05:43.313505 kubelet[2571]: I0905 06:05:43.313382 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c5c9ebd3eae07930df507ee500306d97-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"c5c9ebd3eae07930df507ee500306d97\") " pod="kube-system/kube-apiserver-localhost"
Sep 5 06:05:43.471982 kubelet[2571]: I0905 06:05:43.471961 2571 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
Sep 5 06:05:43.472229 kubelet[2571]: E0905 06:05:43.472208 2571 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.103:6443/api/v1/nodes\": dial tcp 139.178.70.103:6443: connect: connection refused" node="localhost"
Sep 5 06:05:43.550247 containerd[1653]: time="2025-09-05T06:05:43.550220191Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:a9176403b596d0b29ae8ad12d635226d,Namespace:kube-system,Attempt:0,}"
Sep 5 06:05:43.576303 containerd[1653]: time="2025-09-05T06:05:43.576103645Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:c5c9ebd3eae07930df507ee500306d97,Namespace:kube-system,Attempt:0,}"
Sep 5 06:05:43.576303 containerd[1653]: time="2025-09-05T06:05:43.576236931Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:a88c9297c136b0f15880bf567e89a977,Namespace:kube-system,Attempt:0,}"
Sep 5 06:05:43.713620 kubelet[2571]: E0905 06:05:43.713592 2571 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.103:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.103:6443: connect: connection refused" interval="800ms"
Sep 5 06:05:43.759959 containerd[1653]: time="2025-09-05T06:05:43.759571832Z" level=info msg="connecting to shim 4f340e9870854ef728deca28ea4d0951ee614bd51c9bdbbf6aeacd316ae61d1c" address="unix:///run/containerd/s/3780de80dd667679e368d96237b9e32d8b2f8a230b79539d45c8d7d8a703bfff" namespace=k8s.io protocol=ttrpc version=3
Sep 5 06:05:43.760326 containerd[1653]: time="2025-09-05T06:05:43.760308293Z" level=info msg="connecting to shim 5ec2ef67a1e4e8b934825a97c6cad3c2ff951f17d3bfe2e8efc61fd8d8d37b40" address="unix:///run/containerd/s/347fdbbf3f95dfbfe9a6ef3b87018aa2d2555b087e1401659656492f56d20c90" namespace=k8s.io protocol=ttrpc version=3
Sep 5 06:05:43.787187 containerd[1653]: time="2025-09-05T06:05:43.787145005Z" level=info msg="connecting to shim 7575237c64c103fb35fb4a4d237914ebbce4f3a15f64f23f7e8bb1f2089d858e" address="unix:///run/containerd/s/9fdbd01a53dfd35fa23b0b5fd7ce2c02a628f0fe21d4374e10830edcfa61bbd9" namespace=k8s.io protocol=ttrpc version=3
Sep 5 06:05:43.873648 kubelet[2571]: I0905 06:05:43.873634 2571 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
Sep 5 06:05:43.873943 kubelet[2571]: E0905 06:05:43.873928 2571 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.103:6443/api/v1/nodes\": dial tcp 139.178.70.103:6443: connect: connection refused" node="localhost"
Sep 5 06:05:43.903334 systemd[1]: Started cri-containerd-4f340e9870854ef728deca28ea4d0951ee614bd51c9bdbbf6aeacd316ae61d1c.scope - libcontainer container 4f340e9870854ef728deca28ea4d0951ee614bd51c9bdbbf6aeacd316ae61d1c.
Sep 5 06:05:43.904351 systemd[1]: Started cri-containerd-5ec2ef67a1e4e8b934825a97c6cad3c2ff951f17d3bfe2e8efc61fd8d8d37b40.scope - libcontainer container 5ec2ef67a1e4e8b934825a97c6cad3c2ff951f17d3bfe2e8efc61fd8d8d37b40.
Sep 5 06:05:43.906015 systemd[1]: Started cri-containerd-7575237c64c103fb35fb4a4d237914ebbce4f3a15f64f23f7e8bb1f2089d858e.scope - libcontainer container 7575237c64c103fb35fb4a4d237914ebbce4f3a15f64f23f7e8bb1f2089d858e. Sep 5 06:05:43.938551 kubelet[2571]: W0905 06:05:43.938512 2571 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.103:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.103:6443: connect: connection refused Sep 5 06:05:43.938634 kubelet[2571]: E0905 06:05:43.938554 2571 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.103:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError" Sep 5 06:05:43.968966 containerd[1653]: time="2025-09-05T06:05:43.968873070Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:c5c9ebd3eae07930df507ee500306d97,Namespace:kube-system,Attempt:0,} returns sandbox id \"4f340e9870854ef728deca28ea4d0951ee614bd51c9bdbbf6aeacd316ae61d1c\"" Sep 5 06:05:43.972422 containerd[1653]: time="2025-09-05T06:05:43.972400918Z" level=info msg="CreateContainer within sandbox \"4f340e9870854ef728deca28ea4d0951ee614bd51c9bdbbf6aeacd316ae61d1c\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 5 06:05:43.992394 containerd[1653]: time="2025-09-05T06:05:43.992368432Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:a88c9297c136b0f15880bf567e89a977,Namespace:kube-system,Attempt:0,} returns sandbox id \"5ec2ef67a1e4e8b934825a97c6cad3c2ff951f17d3bfe2e8efc61fd8d8d37b40\"" Sep 5 06:05:43.993984 containerd[1653]: time="2025-09-05T06:05:43.993964564Z" level=info msg="CreateContainer within sandbox 
\"5ec2ef67a1e4e8b934825a97c6cad3c2ff951f17d3bfe2e8efc61fd8d8d37b40\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 5 06:05:44.011995 containerd[1653]: time="2025-09-05T06:05:44.011967288Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:a9176403b596d0b29ae8ad12d635226d,Namespace:kube-system,Attempt:0,} returns sandbox id \"7575237c64c103fb35fb4a4d237914ebbce4f3a15f64f23f7e8bb1f2089d858e\"" Sep 5 06:05:44.013320 containerd[1653]: time="2025-09-05T06:05:44.013275233Z" level=info msg="Container b111191bd180bc576d15c1416fca59d9ed0db92cb281a12e5c1471ed58ee27d2: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:05:44.013451 containerd[1653]: time="2025-09-05T06:05:44.013439994Z" level=info msg="CreateContainer within sandbox \"7575237c64c103fb35fb4a4d237914ebbce4f3a15f64f23f7e8bb1f2089d858e\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 5 06:05:44.054258 containerd[1653]: time="2025-09-05T06:05:44.054220673Z" level=info msg="Container a111408cfd2eac443eefc4f167c7442e109f702ae487b779acd76686d68d6318: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:05:44.078697 containerd[1653]: time="2025-09-05T06:05:44.078577341Z" level=info msg="Container 92505dc261519c4083b1a36f1be9ac4aef9fe58caf438d237e21ca7f51707c03: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:05:44.079423 containerd[1653]: time="2025-09-05T06:05:44.079405767Z" level=info msg="CreateContainer within sandbox \"4f340e9870854ef728deca28ea4d0951ee614bd51c9bdbbf6aeacd316ae61d1c\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"b111191bd180bc576d15c1416fca59d9ed0db92cb281a12e5c1471ed58ee27d2\"" Sep 5 06:05:44.080389 containerd[1653]: time="2025-09-05T06:05:44.080375206Z" level=info msg="StartContainer for \"b111191bd180bc576d15c1416fca59d9ed0db92cb281a12e5c1471ed58ee27d2\"" Sep 5 06:05:44.080928 containerd[1653]: time="2025-09-05T06:05:44.080907385Z" level=info msg="connecting to 
shim b111191bd180bc576d15c1416fca59d9ed0db92cb281a12e5c1471ed58ee27d2" address="unix:///run/containerd/s/3780de80dd667679e368d96237b9e32d8b2f8a230b79539d45c8d7d8a703bfff" protocol=ttrpc version=3 Sep 5 06:05:44.096273 systemd[1]: Started cri-containerd-b111191bd180bc576d15c1416fca59d9ed0db92cb281a12e5c1471ed58ee27d2.scope - libcontainer container b111191bd180bc576d15c1416fca59d9ed0db92cb281a12e5c1471ed58ee27d2. Sep 5 06:05:44.114706 containerd[1653]: time="2025-09-05T06:05:44.114682954Z" level=info msg="CreateContainer within sandbox \"7575237c64c103fb35fb4a4d237914ebbce4f3a15f64f23f7e8bb1f2089d858e\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"92505dc261519c4083b1a36f1be9ac4aef9fe58caf438d237e21ca7f51707c03\"" Sep 5 06:05:44.114987 containerd[1653]: time="2025-09-05T06:05:44.114973879Z" level=info msg="StartContainer for \"92505dc261519c4083b1a36f1be9ac4aef9fe58caf438d237e21ca7f51707c03\"" Sep 5 06:05:44.115510 containerd[1653]: time="2025-09-05T06:05:44.115496592Z" level=info msg="connecting to shim 92505dc261519c4083b1a36f1be9ac4aef9fe58caf438d237e21ca7f51707c03" address="unix:///run/containerd/s/9fdbd01a53dfd35fa23b0b5fd7ce2c02a628f0fe21d4374e10830edcfa61bbd9" protocol=ttrpc version=3 Sep 5 06:05:44.127186 containerd[1653]: time="2025-09-05T06:05:44.125811938Z" level=info msg="CreateContainer within sandbox \"5ec2ef67a1e4e8b934825a97c6cad3c2ff951f17d3bfe2e8efc61fd8d8d37b40\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"a111408cfd2eac443eefc4f167c7442e109f702ae487b779acd76686d68d6318\"" Sep 5 06:05:44.127519 containerd[1653]: time="2025-09-05T06:05:44.127499003Z" level=info msg="StartContainer for \"a111408cfd2eac443eefc4f167c7442e109f702ae487b779acd76686d68d6318\"" Sep 5 06:05:44.128783 containerd[1653]: time="2025-09-05T06:05:44.128764631Z" level=info msg="connecting to shim a111408cfd2eac443eefc4f167c7442e109f702ae487b779acd76686d68d6318" 
address="unix:///run/containerd/s/347fdbbf3f95dfbfe9a6ef3b87018aa2d2555b087e1401659656492f56d20c90" protocol=ttrpc version=3 Sep 5 06:05:44.137289 systemd[1]: Started cri-containerd-92505dc261519c4083b1a36f1be9ac4aef9fe58caf438d237e21ca7f51707c03.scope - libcontainer container 92505dc261519c4083b1a36f1be9ac4aef9fe58caf438d237e21ca7f51707c03. Sep 5 06:05:44.154262 systemd[1]: Started cri-containerd-a111408cfd2eac443eefc4f167c7442e109f702ae487b779acd76686d68d6318.scope - libcontainer container a111408cfd2eac443eefc4f167c7442e109f702ae487b779acd76686d68d6318. Sep 5 06:05:44.159595 containerd[1653]: time="2025-09-05T06:05:44.159477519Z" level=info msg="StartContainer for \"b111191bd180bc576d15c1416fca59d9ed0db92cb281a12e5c1471ed58ee27d2\" returns successfully" Sep 5 06:05:44.208624 containerd[1653]: time="2025-09-05T06:05:44.208604742Z" level=info msg="StartContainer for \"a111408cfd2eac443eefc4f167c7442e109f702ae487b779acd76686d68d6318\" returns successfully" Sep 5 06:05:44.210052 containerd[1653]: time="2025-09-05T06:05:44.209724479Z" level=info msg="StartContainer for \"92505dc261519c4083b1a36f1be9ac4aef9fe58caf438d237e21ca7f51707c03\" returns successfully" Sep 5 06:05:44.283879 kubelet[2571]: W0905 06:05:44.283820 2571 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.103:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.103:6443: connect: connection refused Sep 5 06:05:44.283879 kubelet[2571]: E0905 06:05:44.283862 2571 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.103:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError" Sep 5 06:05:44.469024 kubelet[2571]: W0905 06:05:44.468924 2571 reflector.go:569] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.103:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.103:6443: connect: connection refused Sep 5 06:05:44.469024 kubelet[2571]: E0905 06:05:44.468967 2571 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.103:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError" Sep 5 06:05:44.507483 kubelet[2571]: W0905 06:05:44.507419 2571 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.103:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.103:6443: connect: connection refused Sep 5 06:05:44.507483 kubelet[2571]: E0905 06:05:44.507467 2571 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.103:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError" Sep 5 06:05:44.513991 kubelet[2571]: E0905 06:05:44.513972 2571 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.103:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.103:6443: connect: connection refused" interval="1.6s" Sep 5 06:05:44.675593 kubelet[2571]: I0905 06:05:44.675563 2571 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 5 06:05:44.675771 kubelet[2571]: E0905 06:05:44.675758 2571 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.103:6443/api/v1/nodes\": 
dial tcp 139.178.70.103:6443: connect: connection refused" node="localhost" Sep 5 06:05:45.165489 kubelet[2571]: E0905 06:05:45.165436 2571 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 5 06:05:45.166733 kubelet[2571]: E0905 06:05:45.166665 2571 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 5 06:05:45.167759 kubelet[2571]: E0905 06:05:45.167744 2571 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 5 06:05:46.116303 kubelet[2571]: E0905 06:05:46.116274 2571 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Sep 5 06:05:46.169834 kubelet[2571]: E0905 06:05:46.169768 2571 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 5 06:05:46.170740 kubelet[2571]: E0905 06:05:46.170732 2571 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 5 06:05:46.170913 kubelet[2571]: E0905 06:05:46.170841 2571 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 5 06:05:46.253418 kubelet[2571]: E0905 06:05:46.253400 2571 csi_plugin.go:308] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "localhost" not found Sep 5 06:05:46.277504 kubelet[2571]: I0905 06:05:46.277458 2571 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 5 06:05:46.289452 kubelet[2571]: I0905 06:05:46.289389 2571 
kubelet_node_status.go:78] "Successfully registered node" node="localhost" Sep 5 06:05:46.289452 kubelet[2571]: E0905 06:05:46.289408 2571 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found" Sep 5 06:05:46.302067 kubelet[2571]: E0905 06:05:46.302047 2571 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 5 06:05:46.402530 kubelet[2571]: E0905 06:05:46.402450 2571 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 5 06:05:46.502938 kubelet[2571]: E0905 06:05:46.502906 2571 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 5 06:05:46.603743 kubelet[2571]: E0905 06:05:46.603713 2571 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 5 06:05:46.704273 kubelet[2571]: E0905 06:05:46.704162 2571 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 5 06:05:46.804887 kubelet[2571]: E0905 06:05:46.804857 2571 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 5 06:05:46.905563 kubelet[2571]: E0905 06:05:46.905499 2571 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 5 06:05:47.006235 kubelet[2571]: E0905 06:05:47.006117 2571 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 5 06:05:47.106242 kubelet[2571]: E0905 06:05:47.106208 2571 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 5 06:05:47.172584 kubelet[2571]: E0905 06:05:47.172555 2571 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node 
\"localhost\" not found" node="localhost" Sep 5 06:05:47.173095 kubelet[2571]: E0905 06:05:47.172786 2571 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 5 06:05:47.207000 kubelet[2571]: E0905 06:05:47.206969 2571 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 5 06:05:47.307653 kubelet[2571]: E0905 06:05:47.307565 2571 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 5 06:05:47.408071 kubelet[2571]: E0905 06:05:47.408037 2571 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 5 06:05:47.508640 kubelet[2571]: E0905 06:05:47.508606 2571 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 5 06:05:47.609647 kubelet[2571]: E0905 06:05:47.609615 2571 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 5 06:05:47.712488 kubelet[2571]: I0905 06:05:47.712460 2571 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 5 06:05:47.735011 kubelet[2571]: I0905 06:05:47.734978 2571 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 5 06:05:47.739069 kubelet[2571]: I0905 06:05:47.737839 2571 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 5 06:05:47.864562 systemd[1]: Reload requested from client PID 2840 ('systemctl') (unit session-9.scope)... Sep 5 06:05:47.864575 systemd[1]: Reloading... Sep 5 06:05:47.928233 zram_generator::config[2887]: No configuration found. 
Sep 5 06:05:48.015703 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Sep 5 06:05:48.090667 kubelet[2571]: I0905 06:05:48.090462 2571 apiserver.go:52] "Watching apiserver" Sep 5 06:05:48.099596 systemd[1]: Reloading finished in 234 ms. Sep 5 06:05:48.115882 kubelet[2571]: I0905 06:05:48.115483 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 5 06:05:48.122625 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 5 06:05:48.136906 systemd[1]: kubelet.service: Deactivated successfully. Sep 5 06:05:48.137102 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 5 06:05:48.137148 systemd[1]: kubelet.service: Consumed 573ms CPU time, 129.7M memory peak. Sep 5 06:05:48.138876 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 5 06:05:48.461328 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 5 06:05:48.471493 (kubelet)[2951]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 5 06:05:48.551599 kubelet[2951]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 5 06:05:48.551809 kubelet[2951]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 5 06:05:48.551843 kubelet[2951]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 5 06:05:48.551924 kubelet[2951]: I0905 06:05:48.551908 2951 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 5 06:05:48.568112 kubelet[2951]: I0905 06:05:48.568078 2951 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Sep 5 06:05:48.568242 kubelet[2951]: I0905 06:05:48.568235 2951 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 5 06:05:48.568781 kubelet[2951]: I0905 06:05:48.568761 2951 server.go:954] "Client rotation is on, will bootstrap in background" Sep 5 06:05:48.570678 kubelet[2951]: I0905 06:05:48.570654 2951 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 5 06:05:48.574828 kubelet[2951]: I0905 06:05:48.574417 2951 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 5 06:05:48.580636 kubelet[2951]: I0905 06:05:48.579960 2951 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 5 06:05:48.583215 kubelet[2951]: I0905 06:05:48.583194 2951 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 5 06:05:48.583612 kubelet[2951]: I0905 06:05:48.583575 2951 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 5 06:05:48.583869 kubelet[2951]: I0905 06:05:48.583668 2951 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 5 06:05:48.583985 kubelet[2951]: I0905 06:05:48.583976 2951 topology_manager.go:138] "Creating topology manager with none policy" Sep 
5 06:05:48.584046 kubelet[2951]: I0905 06:05:48.584040 2951 container_manager_linux.go:304] "Creating device plugin manager" Sep 5 06:05:48.584128 kubelet[2951]: I0905 06:05:48.584120 2951 state_mem.go:36] "Initialized new in-memory state store" Sep 5 06:05:48.584429 kubelet[2951]: I0905 06:05:48.584420 2951 kubelet.go:446] "Attempting to sync node with API server" Sep 5 06:05:48.584657 kubelet[2951]: I0905 06:05:48.584648 2951 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 5 06:05:48.584733 kubelet[2951]: I0905 06:05:48.584727 2951 kubelet.go:352] "Adding apiserver pod source" Sep 5 06:05:48.584770 kubelet[2951]: I0905 06:05:48.584764 2951 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 5 06:05:48.589499 kubelet[2951]: I0905 06:05:48.589482 2951 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 5 06:05:48.590464 kubelet[2951]: I0905 06:05:48.590450 2951 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 5 06:05:48.591577 kubelet[2951]: I0905 06:05:48.591557 2951 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 5 06:05:48.592284 kubelet[2951]: I0905 06:05:48.591696 2951 server.go:1287] "Started kubelet" Sep 5 06:05:48.599665 kubelet[2951]: I0905 06:05:48.599642 2951 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 5 06:05:48.605027 kubelet[2951]: I0905 06:05:48.604991 2951 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Sep 5 06:05:48.605976 kubelet[2951]: I0905 06:05:48.605856 2951 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 5 06:05:48.606158 kubelet[2951]: I0905 06:05:48.606146 2951 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 5 06:05:48.606877 kubelet[2951]: I0905 06:05:48.606750 2951 
dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 5 06:05:48.611779 kubelet[2951]: I0905 06:05:48.608223 2951 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 5 06:05:48.612517 kubelet[2951]: I0905 06:05:48.608233 2951 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 5 06:05:48.612754 kubelet[2951]: E0905 06:05:48.608361 2951 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 5 06:05:48.612879 kubelet[2951]: I0905 06:05:48.610977 2951 server.go:479] "Adding debug handlers to kubelet server" Sep 5 06:05:48.614050 kubelet[2951]: I0905 06:05:48.614033 2951 reconciler.go:26] "Reconciler: start to sync state" Sep 5 06:05:48.617649 kubelet[2951]: I0905 06:05:48.617623 2951 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 5 06:05:48.624061 kubelet[2951]: I0905 06:05:48.624021 2951 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 5 06:05:48.628152 kubelet[2951]: I0905 06:05:48.625209 2951 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 5 06:05:48.628152 kubelet[2951]: I0905 06:05:48.625234 2951 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 5 06:05:48.628152 kubelet[2951]: I0905 06:05:48.625250 2951 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Sep 5 06:05:48.628152 kubelet[2951]: I0905 06:05:48.625255 2951 kubelet.go:2382] "Starting kubelet main sync loop" Sep 5 06:05:48.628152 kubelet[2951]: E0905 06:05:48.625286 2951 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 5 06:05:48.632591 kubelet[2951]: I0905 06:05:48.632295 2951 factory.go:221] Registration of the containerd container factory successfully Sep 5 06:05:48.632591 kubelet[2951]: I0905 06:05:48.632308 2951 factory.go:221] Registration of the systemd container factory successfully Sep 5 06:05:48.636477 kubelet[2951]: E0905 06:05:48.636450 2951 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 5 06:05:48.682036 kubelet[2951]: I0905 06:05:48.681700 2951 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 5 06:05:48.682036 kubelet[2951]: I0905 06:05:48.681712 2951 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 5 06:05:48.682036 kubelet[2951]: I0905 06:05:48.681726 2951 state_mem.go:36] "Initialized new in-memory state store" Sep 5 06:05:48.682036 kubelet[2951]: I0905 06:05:48.681843 2951 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 5 06:05:48.682036 kubelet[2951]: I0905 06:05:48.681851 2951 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 5 06:05:48.682036 kubelet[2951]: I0905 06:05:48.681865 2951 policy_none.go:49] "None policy: Start" Sep 5 06:05:48.682036 kubelet[2951]: I0905 06:05:48.681874 2951 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 5 06:05:48.682036 kubelet[2951]: I0905 06:05:48.681885 2951 state_mem.go:35] "Initializing new in-memory state store" Sep 5 06:05:48.682036 kubelet[2951]: I0905 06:05:48.681977 2951 state_mem.go:75] "Updated machine memory state" Sep 5 06:05:48.685569 kubelet[2951]: I0905 06:05:48.685546 2951 manager.go:519] "Failed to 
read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 5 06:05:48.685729 kubelet[2951]: I0905 06:05:48.685718 2951 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 5 06:05:48.685761 kubelet[2951]: I0905 06:05:48.685728 2951 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 5 06:05:48.685949 kubelet[2951]: I0905 06:05:48.685929 2951 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 5 06:05:48.686879 kubelet[2951]: E0905 06:05:48.686863 2951 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 5 06:05:48.729338 kubelet[2951]: I0905 06:05:48.727999 2951 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 5 06:05:48.729338 kubelet[2951]: I0905 06:05:48.728113 2951 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 5 06:05:48.729338 kubelet[2951]: I0905 06:05:48.728007 2951 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 5 06:05:48.732576 kubelet[2951]: E0905 06:05:48.732468 2951 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Sep 5 06:05:48.732897 kubelet[2951]: E0905 06:05:48.732871 2951 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost" Sep 5 06:05:48.734071 kubelet[2951]: E0905 06:05:48.734027 2951 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Sep 5 06:05:48.787822 kubelet[2951]: I0905 06:05:48.787321 2951 kubelet_node_status.go:75] "Attempting to 
register node" node="localhost" Sep 5 06:05:48.799945 kubelet[2951]: I0905 06:05:48.799902 2951 kubelet_node_status.go:124] "Node was previously registered" node="localhost" Sep 5 06:05:48.800053 kubelet[2951]: I0905 06:05:48.799986 2951 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Sep 5 06:05:48.815342 kubelet[2951]: I0905 06:05:48.815310 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c5c9ebd3eae07930df507ee500306d97-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"c5c9ebd3eae07930df507ee500306d97\") " pod="kube-system/kube-apiserver-localhost" Sep 5 06:05:48.815342 kubelet[2951]: I0905 06:05:48.815348 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c5c9ebd3eae07930df507ee500306d97-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"c5c9ebd3eae07930df507ee500306d97\") " pod="kube-system/kube-apiserver-localhost" Sep 5 06:05:48.815459 kubelet[2951]: I0905 06:05:48.815361 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 5 06:05:48.815459 kubelet[2951]: I0905 06:05:48.815371 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 5 06:05:48.815459 kubelet[2951]: I0905 06:05:48.815380 2951 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 5 06:05:48.815459 kubelet[2951]: I0905 06:05:48.815391 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 5 06:05:48.815459 kubelet[2951]: I0905 06:05:48.815401 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a9176403b596d0b29ae8ad12d635226d-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"a9176403b596d0b29ae8ad12d635226d\") " pod="kube-system/kube-scheduler-localhost" Sep 5 06:05:48.815544 kubelet[2951]: I0905 06:05:48.815418 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c5c9ebd3eae07930df507ee500306d97-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"c5c9ebd3eae07930df507ee500306d97\") " pod="kube-system/kube-apiserver-localhost" Sep 5 06:05:48.815544 kubelet[2951]: I0905 06:05:48.815429 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 5 06:05:49.589271 kubelet[2951]: I0905 06:05:49.589210 2951 apiserver.go:52] "Watching apiserver" 
Sep 5 06:05:49.614180 kubelet[2951]: I0905 06:05:49.613212 2951 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 5 06:05:49.665628 kubelet[2951]: I0905 06:05:49.665597 2951 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 5 06:05:49.666335 kubelet[2951]: I0905 06:05:49.665826 2951 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 5 06:05:49.675188 kubelet[2951]: E0905 06:05:49.674895 2951 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Sep 5 06:05:49.675188 kubelet[2951]: E0905 06:05:49.675080 2951 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Sep 5 06:05:49.698665 kubelet[2951]: I0905 06:05:49.698600 2951 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=2.6985844119999998 podStartE2EDuration="2.698584412s" podCreationTimestamp="2025-09-05 06:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-05 06:05:49.693516685 +0000 UTC m=+1.207460255" watchObservedRunningTime="2025-09-05 06:05:49.698584412 +0000 UTC m=+1.212527983" Sep 5 06:05:49.705486 kubelet[2951]: I0905 06:05:49.705419 2951 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=2.705404892 podStartE2EDuration="2.705404892s" podCreationTimestamp="2025-09-05 06:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-05 06:05:49.698838155 +0000 UTC m=+1.212781726" watchObservedRunningTime="2025-09-05 
06:05:49.705404892 +0000 UTC m=+1.219348465" Sep 5 06:05:54.721702 kubelet[2951]: I0905 06:05:54.721679 2951 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 5 06:05:54.722662 containerd[1653]: time="2025-09-05T06:05:54.722185588Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 5 06:05:54.722860 kubelet[2951]: I0905 06:05:54.722330 2951 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 5 06:05:55.569106 kubelet[2951]: I0905 06:05:55.568920 2951 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=8.568909599 podStartE2EDuration="8.568909599s" podCreationTimestamp="2025-09-05 06:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-05 06:05:49.70584427 +0000 UTC m=+1.219787847" watchObservedRunningTime="2025-09-05 06:05:55.568909599 +0000 UTC m=+7.082853176" Sep 5 06:05:55.578321 systemd[1]: Created slice kubepods-besteffort-pod9b6b4c12_69b0_4f52_a4f1_6d71083edf87.slice - libcontainer container kubepods-besteffort-pod9b6b4c12_69b0_4f52_a4f1_6d71083edf87.slice. 
Sep 5 06:05:55.660232 kubelet[2951]: I0905 06:05:55.660069 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/9b6b4c12-69b0-4f52-a4f1-6d71083edf87-kube-proxy\") pod \"kube-proxy-qpj6f\" (UID: \"9b6b4c12-69b0-4f52-a4f1-6d71083edf87\") " pod="kube-system/kube-proxy-qpj6f" Sep 5 06:05:55.660232 kubelet[2951]: I0905 06:05:55.660099 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/9b6b4c12-69b0-4f52-a4f1-6d71083edf87-xtables-lock\") pod \"kube-proxy-qpj6f\" (UID: \"9b6b4c12-69b0-4f52-a4f1-6d71083edf87\") " pod="kube-system/kube-proxy-qpj6f" Sep 5 06:05:55.660232 kubelet[2951]: I0905 06:05:55.660115 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9b6b4c12-69b0-4f52-a4f1-6d71083edf87-lib-modules\") pod \"kube-proxy-qpj6f\" (UID: \"9b6b4c12-69b0-4f52-a4f1-6d71083edf87\") " pod="kube-system/kube-proxy-qpj6f" Sep 5 06:05:55.660232 kubelet[2951]: I0905 06:05:55.660130 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgngr\" (UniqueName: \"kubernetes.io/projected/9b6b4c12-69b0-4f52-a4f1-6d71083edf87-kube-api-access-wgngr\") pod \"kube-proxy-qpj6f\" (UID: \"9b6b4c12-69b0-4f52-a4f1-6d71083edf87\") " pod="kube-system/kube-proxy-qpj6f" Sep 5 06:05:55.843513 systemd[1]: Created slice kubepods-besteffort-pod0f77afc6_f710_4d95_886d_1b599b219202.slice - libcontainer container kubepods-besteffort-pod0f77afc6_f710_4d95_886d_1b599b219202.slice. 
Sep 5 06:05:55.861341 kubelet[2951]: I0905 06:05:55.861310 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/0f77afc6-f710-4d95-886d-1b599b219202-var-lib-calico\") pod \"tigera-operator-755d956888-vvwk7\" (UID: \"0f77afc6-f710-4d95-886d-1b599b219202\") " pod="tigera-operator/tigera-operator-755d956888-vvwk7" Sep 5 06:05:55.861341 kubelet[2951]: I0905 06:05:55.861343 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpfzs\" (UniqueName: \"kubernetes.io/projected/0f77afc6-f710-4d95-886d-1b599b219202-kube-api-access-qpfzs\") pod \"tigera-operator-755d956888-vvwk7\" (UID: \"0f77afc6-f710-4d95-886d-1b599b219202\") " pod="tigera-operator/tigera-operator-755d956888-vvwk7" Sep 5 06:05:55.892946 containerd[1653]: time="2025-09-05T06:05:55.892913111Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-qpj6f,Uid:9b6b4c12-69b0-4f52-a4f1-6d71083edf87,Namespace:kube-system,Attempt:0,}" Sep 5 06:05:55.904290 containerd[1653]: time="2025-09-05T06:05:55.904253865Z" level=info msg="connecting to shim 2bfeadbf4a652d8bd4e051f9d2fe5c9ef70a32557a8a358557fc8dd6386fd29b" address="unix:///run/containerd/s/2f4314a034e26b5e22f2d786f8eca1cf0fb29689d81f11fe43a2edca837faea0" namespace=k8s.io protocol=ttrpc version=3 Sep 5 06:05:55.927318 systemd[1]: Started cri-containerd-2bfeadbf4a652d8bd4e051f9d2fe5c9ef70a32557a8a358557fc8dd6386fd29b.scope - libcontainer container 2bfeadbf4a652d8bd4e051f9d2fe5c9ef70a32557a8a358557fc8dd6386fd29b. 
Sep 5 06:05:55.944071 containerd[1653]: time="2025-09-05T06:05:55.944050907Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-qpj6f,Uid:9b6b4c12-69b0-4f52-a4f1-6d71083edf87,Namespace:kube-system,Attempt:0,} returns sandbox id \"2bfeadbf4a652d8bd4e051f9d2fe5c9ef70a32557a8a358557fc8dd6386fd29b\"" Sep 5 06:05:55.946179 containerd[1653]: time="2025-09-05T06:05:55.946145399Z" level=info msg="CreateContainer within sandbox \"2bfeadbf4a652d8bd4e051f9d2fe5c9ef70a32557a8a358557fc8dd6386fd29b\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 5 06:05:55.955343 containerd[1653]: time="2025-09-05T06:05:55.954023735Z" level=info msg="Container 9b36e729208e6555e6038d58a7e28687bc794c411b15560cfced56927e64f5f2: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:05:55.955936 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4168689816.mount: Deactivated successfully. Sep 5 06:05:55.957955 containerd[1653]: time="2025-09-05T06:05:55.957904881Z" level=info msg="CreateContainer within sandbox \"2bfeadbf4a652d8bd4e051f9d2fe5c9ef70a32557a8a358557fc8dd6386fd29b\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"9b36e729208e6555e6038d58a7e28687bc794c411b15560cfced56927e64f5f2\"" Sep 5 06:05:55.959103 containerd[1653]: time="2025-09-05T06:05:55.958379034Z" level=info msg="StartContainer for \"9b36e729208e6555e6038d58a7e28687bc794c411b15560cfced56927e64f5f2\"" Sep 5 06:05:55.959365 containerd[1653]: time="2025-09-05T06:05:55.959339485Z" level=info msg="connecting to shim 9b36e729208e6555e6038d58a7e28687bc794c411b15560cfced56927e64f5f2" address="unix:///run/containerd/s/2f4314a034e26b5e22f2d786f8eca1cf0fb29689d81f11fe43a2edca837faea0" protocol=ttrpc version=3 Sep 5 06:05:55.973303 systemd[1]: Started cri-containerd-9b36e729208e6555e6038d58a7e28687bc794c411b15560cfced56927e64f5f2.scope - libcontainer container 9b36e729208e6555e6038d58a7e28687bc794c411b15560cfced56927e64f5f2. 
Sep 5 06:05:56.002505 containerd[1653]: time="2025-09-05T06:05:56.002477987Z" level=info msg="StartContainer for \"9b36e729208e6555e6038d58a7e28687bc794c411b15560cfced56927e64f5f2\" returns successfully" Sep 5 06:05:56.146896 containerd[1653]: time="2025-09-05T06:05:56.146820812Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-vvwk7,Uid:0f77afc6-f710-4d95-886d-1b599b219202,Namespace:tigera-operator,Attempt:0,}" Sep 5 06:05:56.162314 containerd[1653]: time="2025-09-05T06:05:56.162277337Z" level=info msg="connecting to shim 3ebd8434bd2f3ab6aa5ec1044911ac50ff0eeffa5e04b03640999ca0817aeb54" address="unix:///run/containerd/s/7c3a7d19985eb267bc69f7c4bbd64a02084f0aa44187e04d14b791669f153de4" namespace=k8s.io protocol=ttrpc version=3 Sep 5 06:05:56.193327 systemd[1]: Started cri-containerd-3ebd8434bd2f3ab6aa5ec1044911ac50ff0eeffa5e04b03640999ca0817aeb54.scope - libcontainer container 3ebd8434bd2f3ab6aa5ec1044911ac50ff0eeffa5e04b03640999ca0817aeb54. Sep 5 06:05:56.239221 containerd[1653]: time="2025-09-05T06:05:56.239189828Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-vvwk7,Uid:0f77afc6-f710-4d95-886d-1b599b219202,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"3ebd8434bd2f3ab6aa5ec1044911ac50ff0eeffa5e04b03640999ca0817aeb54\"" Sep 5 06:05:56.241667 containerd[1653]: time="2025-09-05T06:05:56.241644143Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 5 06:05:56.770008 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3095692141.mount: Deactivated successfully. 
Sep 5 06:05:57.023456 kubelet[2951]: I0905 06:05:57.023304 2951 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-qpj6f" podStartSLOduration=2.02329339 podStartE2EDuration="2.02329339s" podCreationTimestamp="2025-09-05 06:05:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-05 06:05:56.683274815 +0000 UTC m=+8.197218384" watchObservedRunningTime="2025-09-05 06:05:57.02329339 +0000 UTC m=+8.537236961" Sep 5 06:05:58.081840 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount207428086.mount: Deactivated successfully. Sep 5 06:06:01.277077 containerd[1653]: time="2025-09-05T06:06:01.277043064Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:06:01.278152 containerd[1653]: time="2025-09-05T06:06:01.278131365Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609" Sep 5 06:06:01.278431 containerd[1653]: time="2025-09-05T06:06:01.278406306Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:06:01.279844 containerd[1653]: time="2025-09-05T06:06:01.279822205Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:06:01.280212 containerd[1653]: time="2025-09-05T06:06:01.280194132Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size 
\"25058604\" in 5.038528306s" Sep 5 06:06:01.280252 containerd[1653]: time="2025-09-05T06:06:01.280213708Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\"" Sep 5 06:06:01.282064 containerd[1653]: time="2025-09-05T06:06:01.282046475Z" level=info msg="CreateContainer within sandbox \"3ebd8434bd2f3ab6aa5ec1044911ac50ff0eeffa5e04b03640999ca0817aeb54\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 5 06:06:01.288872 containerd[1653]: time="2025-09-05T06:06:01.288841299Z" level=info msg="Container 05f0a2b8563d3214c0d49a8a1ebf134b3e073b4d06975a5794ece20be23fb7a0: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:06:01.292247 containerd[1653]: time="2025-09-05T06:06:01.292221359Z" level=info msg="CreateContainer within sandbox \"3ebd8434bd2f3ab6aa5ec1044911ac50ff0eeffa5e04b03640999ca0817aeb54\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"05f0a2b8563d3214c0d49a8a1ebf134b3e073b4d06975a5794ece20be23fb7a0\"" Sep 5 06:06:01.293316 containerd[1653]: time="2025-09-05T06:06:01.292591071Z" level=info msg="StartContainer for \"05f0a2b8563d3214c0d49a8a1ebf134b3e073b4d06975a5794ece20be23fb7a0\"" Sep 5 06:06:01.293316 containerd[1653]: time="2025-09-05T06:06:01.293269697Z" level=info msg="connecting to shim 05f0a2b8563d3214c0d49a8a1ebf134b3e073b4d06975a5794ece20be23fb7a0" address="unix:///run/containerd/s/7c3a7d19985eb267bc69f7c4bbd64a02084f0aa44187e04d14b791669f153de4" protocol=ttrpc version=3 Sep 5 06:06:01.320341 systemd[1]: Started cri-containerd-05f0a2b8563d3214c0d49a8a1ebf134b3e073b4d06975a5794ece20be23fb7a0.scope - libcontainer container 05f0a2b8563d3214c0d49a8a1ebf134b3e073b4d06975a5794ece20be23fb7a0. 
Sep 5 06:06:01.347568 containerd[1653]: time="2025-09-05T06:06:01.347548101Z" level=info msg="StartContainer for \"05f0a2b8563d3214c0d49a8a1ebf134b3e073b4d06975a5794ece20be23fb7a0\" returns successfully" Sep 5 06:06:01.701950 kubelet[2951]: I0905 06:06:01.701725 2951 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-vvwk7" podStartSLOduration=1.656279507 podStartE2EDuration="6.696907436s" podCreationTimestamp="2025-09-05 06:05:55 +0000 UTC" firstStartedPulling="2025-09-05 06:05:56.240152557 +0000 UTC m=+7.754096134" lastFinishedPulling="2025-09-05 06:06:01.280780496 +0000 UTC m=+12.794724063" observedRunningTime="2025-09-05 06:06:01.696329696 +0000 UTC m=+13.210273273" watchObservedRunningTime="2025-09-05 06:06:01.696907436 +0000 UTC m=+13.210851015" Sep 5 06:06:06.366421 sudo[1971]: pam_unix(sudo:session): session closed for user root Sep 5 06:06:06.372023 sshd[1970]: Connection closed by 139.178.89.65 port 42696 Sep 5 06:06:06.372378 sshd-session[1967]: pam_unix(sshd:session): session closed for user core Sep 5 06:06:06.375501 systemd[1]: sshd@6-139.178.70.103:22-139.178.89.65:42696.service: Deactivated successfully. Sep 5 06:06:06.378750 systemd[1]: session-9.scope: Deactivated successfully. Sep 5 06:06:06.379105 systemd[1]: session-9.scope: Consumed 2.975s CPU time, 154.7M memory peak. Sep 5 06:06:06.382127 systemd-logind[1625]: Session 9 logged out. Waiting for processes to exit. Sep 5 06:06:06.383198 systemd-logind[1625]: Removed session 9. Sep 5 06:06:09.386515 systemd[1]: Created slice kubepods-besteffort-pod3ddac055_a13d_4aed_8f44_74ec937ad3cd.slice - libcontainer container kubepods-besteffort-pod3ddac055_a13d_4aed_8f44_74ec937ad3cd.slice. 
Sep 5 06:06:09.446837 kubelet[2951]: I0905 06:06:09.446787 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ddac055-a13d-4aed-8f44-74ec937ad3cd-tigera-ca-bundle\") pod \"calico-typha-6447455799-pkljt\" (UID: \"3ddac055-a13d-4aed-8f44-74ec937ad3cd\") " pod="calico-system/calico-typha-6447455799-pkljt" Sep 5 06:06:09.446837 kubelet[2951]: I0905 06:06:09.446818 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/3ddac055-a13d-4aed-8f44-74ec937ad3cd-typha-certs\") pod \"calico-typha-6447455799-pkljt\" (UID: \"3ddac055-a13d-4aed-8f44-74ec937ad3cd\") " pod="calico-system/calico-typha-6447455799-pkljt" Sep 5 06:06:09.447539 kubelet[2951]: I0905 06:06:09.447488 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcjqx\" (UniqueName: \"kubernetes.io/projected/3ddac055-a13d-4aed-8f44-74ec937ad3cd-kube-api-access-wcjqx\") pod \"calico-typha-6447455799-pkljt\" (UID: \"3ddac055-a13d-4aed-8f44-74ec937ad3cd\") " pod="calico-system/calico-typha-6447455799-pkljt" Sep 5 06:06:09.679191 systemd[1]: Created slice kubepods-besteffort-podd37b046a_12eb_4d02_ad48_e6b0f4dd3235.slice - libcontainer container kubepods-besteffort-podd37b046a_12eb_4d02_ad48_e6b0f4dd3235.slice. 
Sep 5 06:06:09.710719 containerd[1653]: time="2025-09-05T06:06:09.710356186Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6447455799-pkljt,Uid:3ddac055-a13d-4aed-8f44-74ec937ad3cd,Namespace:calico-system,Attempt:0,}" Sep 5 06:06:09.739162 containerd[1653]: time="2025-09-05T06:06:09.739063414Z" level=info msg="connecting to shim 96ca1e0373105bd036e16b1bdacdc2860796bc49d8005e64da5e42a1f0c660c0" address="unix:///run/containerd/s/d7ca48f2f6badcc74ba1c3ee6d5540f134868ba2ab69f4c8e7960f3afb61c9d6" namespace=k8s.io protocol=ttrpc version=3 Sep 5 06:06:09.750638 kubelet[2951]: I0905 06:06:09.750535 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/d37b046a-12eb-4d02-ad48-e6b0f4dd3235-flexvol-driver-host\") pod \"calico-node-86ssv\" (UID: \"d37b046a-12eb-4d02-ad48-e6b0f4dd3235\") " pod="calico-system/calico-node-86ssv" Sep 5 06:06:09.750638 kubelet[2951]: I0905 06:06:09.750580 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfs7z\" (UniqueName: \"kubernetes.io/projected/d37b046a-12eb-4d02-ad48-e6b0f4dd3235-kube-api-access-sfs7z\") pod \"calico-node-86ssv\" (UID: \"d37b046a-12eb-4d02-ad48-e6b0f4dd3235\") " pod="calico-system/calico-node-86ssv" Sep 5 06:06:09.750638 kubelet[2951]: I0905 06:06:09.750599 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d37b046a-12eb-4d02-ad48-e6b0f4dd3235-tigera-ca-bundle\") pod \"calico-node-86ssv\" (UID: \"d37b046a-12eb-4d02-ad48-e6b0f4dd3235\") " pod="calico-system/calico-node-86ssv" Sep 5 06:06:09.751209 kubelet[2951]: I0905 06:06:09.750786 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: 
\"kubernetes.io/host-path/d37b046a-12eb-4d02-ad48-e6b0f4dd3235-var-lib-calico\") pod \"calico-node-86ssv\" (UID: \"d37b046a-12eb-4d02-ad48-e6b0f4dd3235\") " pod="calico-system/calico-node-86ssv" Sep 5 06:06:09.751209 kubelet[2951]: I0905 06:06:09.750809 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/d37b046a-12eb-4d02-ad48-e6b0f4dd3235-var-run-calico\") pod \"calico-node-86ssv\" (UID: \"d37b046a-12eb-4d02-ad48-e6b0f4dd3235\") " pod="calico-system/calico-node-86ssv" Sep 5 06:06:09.751209 kubelet[2951]: I0905 06:06:09.750821 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d37b046a-12eb-4d02-ad48-e6b0f4dd3235-xtables-lock\") pod \"calico-node-86ssv\" (UID: \"d37b046a-12eb-4d02-ad48-e6b0f4dd3235\") " pod="calico-system/calico-node-86ssv" Sep 5 06:06:09.751209 kubelet[2951]: I0905 06:06:09.750831 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/d37b046a-12eb-4d02-ad48-e6b0f4dd3235-node-certs\") pod \"calico-node-86ssv\" (UID: \"d37b046a-12eb-4d02-ad48-e6b0f4dd3235\") " pod="calico-system/calico-node-86ssv" Sep 5 06:06:09.752595 kubelet[2951]: I0905 06:06:09.751511 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/d37b046a-12eb-4d02-ad48-e6b0f4dd3235-cni-bin-dir\") pod \"calico-node-86ssv\" (UID: \"d37b046a-12eb-4d02-ad48-e6b0f4dd3235\") " pod="calico-system/calico-node-86ssv" Sep 5 06:06:09.752595 kubelet[2951]: I0905 06:06:09.752478 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/d37b046a-12eb-4d02-ad48-e6b0f4dd3235-cni-net-dir\") pod 
\"calico-node-86ssv\" (UID: \"d37b046a-12eb-4d02-ad48-e6b0f4dd3235\") " pod="calico-system/calico-node-86ssv" Sep 5 06:06:09.752595 kubelet[2951]: I0905 06:06:09.752569 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/d37b046a-12eb-4d02-ad48-e6b0f4dd3235-cni-log-dir\") pod \"calico-node-86ssv\" (UID: \"d37b046a-12eb-4d02-ad48-e6b0f4dd3235\") " pod="calico-system/calico-node-86ssv" Sep 5 06:06:09.752595 kubelet[2951]: I0905 06:06:09.752581 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/d37b046a-12eb-4d02-ad48-e6b0f4dd3235-policysync\") pod \"calico-node-86ssv\" (UID: \"d37b046a-12eb-4d02-ad48-e6b0f4dd3235\") " pod="calico-system/calico-node-86ssv" Sep 5 06:06:09.752826 kubelet[2951]: I0905 06:06:09.752730 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d37b046a-12eb-4d02-ad48-e6b0f4dd3235-lib-modules\") pod \"calico-node-86ssv\" (UID: \"d37b046a-12eb-4d02-ad48-e6b0f4dd3235\") " pod="calico-system/calico-node-86ssv" Sep 5 06:06:09.770352 systemd[1]: Started cri-containerd-96ca1e0373105bd036e16b1bdacdc2860796bc49d8005e64da5e42a1f0c660c0.scope - libcontainer container 96ca1e0373105bd036e16b1bdacdc2860796bc49d8005e64da5e42a1f0c660c0. 
Sep 5 06:06:09.823511 containerd[1653]: time="2025-09-05T06:06:09.823485368Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6447455799-pkljt,Uid:3ddac055-a13d-4aed-8f44-74ec937ad3cd,Namespace:calico-system,Attempt:0,} returns sandbox id \"96ca1e0373105bd036e16b1bdacdc2860796bc49d8005e64da5e42a1f0c660c0\"" Sep 5 06:06:09.828437 containerd[1653]: time="2025-09-05T06:06:09.828412759Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 5 06:06:09.913581 kubelet[2951]: E0905 06:06:09.913333 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:09.913581 kubelet[2951]: W0905 06:06:09.913421 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:09.931474 kubelet[2951]: E0905 06:06:09.931374 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:06:09.983291 containerd[1653]: time="2025-09-05T06:06:09.983237955Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-86ssv,Uid:d37b046a-12eb-4d02-ad48-e6b0f4dd3235,Namespace:calico-system,Attempt:0,}" Sep 5 06:06:10.055698 kubelet[2951]: E0905 06:06:10.055511 2951 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zmpmp" podUID="3aca1205-261a-414a-bbc5-de4a4cc8f072" Sep 5 06:06:10.128655 kubelet[2951]: E0905 06:06:10.128591 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.128655 kubelet[2951]: W0905 06:06:10.128609 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.128655 kubelet[2951]: E0905 06:06:10.128625 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:06:10.128950 kubelet[2951]: E0905 06:06:10.128916 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.128950 kubelet[2951]: W0905 06:06:10.128923 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.128950 kubelet[2951]: E0905 06:06:10.128930 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:06:10.129132 kubelet[2951]: E0905 06:06:10.129095 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.129132 kubelet[2951]: W0905 06:06:10.129102 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.129132 kubelet[2951]: E0905 06:06:10.129107 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:06:10.205824 kubelet[2951]: E0905 06:06:10.205740 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.205824 kubelet[2951]: W0905 06:06:10.205754 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.205824 kubelet[2951]: E0905 06:06:10.205771 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:06:10.206281 kubelet[2951]: E0905 06:06:10.206255 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.206281 kubelet[2951]: W0905 06:06:10.206262 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.206347 kubelet[2951]: E0905 06:06:10.206303 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:06:10.206347 kubelet[2951]: I0905 06:06:10.206320 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3aca1205-261a-414a-bbc5-de4a4cc8f072-kubelet-dir\") pod \"csi-node-driver-zmpmp\" (UID: \"3aca1205-261a-414a-bbc5-de4a4cc8f072\") " pod="calico-system/csi-node-driver-zmpmp" Sep 5 06:06:10.206706 kubelet[2951]: E0905 06:06:10.206570 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.206706 kubelet[2951]: W0905 06:06:10.206577 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.206706 kubelet[2951]: E0905 06:06:10.206590 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:06:10.206706 kubelet[2951]: E0905 06:06:10.206697 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.206706 kubelet[2951]: W0905 06:06:10.206702 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.207047 kubelet[2951]: E0905 06:06:10.206714 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:06:10.207047 kubelet[2951]: E0905 06:06:10.206869 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.207047 kubelet[2951]: W0905 06:06:10.206874 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.207047 kubelet[2951]: E0905 06:06:10.206879 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:06:10.207047 kubelet[2951]: E0905 06:06:10.207019 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.207047 kubelet[2951]: W0905 06:06:10.207024 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.207047 kubelet[2951]: E0905 06:06:10.207029 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:06:10.207161 kubelet[2951]: E0905 06:06:10.207104 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.207161 kubelet[2951]: W0905 06:06:10.207109 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.207161 kubelet[2951]: E0905 06:06:10.207113 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:06:10.207710 kubelet[2951]: E0905 06:06:10.207205 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.207710 kubelet[2951]: W0905 06:06:10.207209 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.207710 kubelet[2951]: E0905 06:06:10.207214 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:06:10.207710 kubelet[2951]: E0905 06:06:10.207310 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.207710 kubelet[2951]: W0905 06:06:10.207315 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.207710 kubelet[2951]: E0905 06:06:10.207320 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:06:10.207710 kubelet[2951]: E0905 06:06:10.207394 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.207710 kubelet[2951]: W0905 06:06:10.207399 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.207710 kubelet[2951]: E0905 06:06:10.207403 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:06:10.207710 kubelet[2951]: E0905 06:06:10.207476 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.207976 kubelet[2951]: W0905 06:06:10.207481 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.207976 kubelet[2951]: E0905 06:06:10.207486 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:06:10.207976 kubelet[2951]: E0905 06:06:10.207600 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.207976 kubelet[2951]: W0905 06:06:10.207604 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.207976 kubelet[2951]: E0905 06:06:10.207609 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:06:10.207976 kubelet[2951]: E0905 06:06:10.207733 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.207976 kubelet[2951]: W0905 06:06:10.207738 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.207976 kubelet[2951]: E0905 06:06:10.207743 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:06:10.207976 kubelet[2951]: E0905 06:06:10.207829 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.207976 kubelet[2951]: W0905 06:06:10.207833 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.208290 kubelet[2951]: E0905 06:06:10.207838 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:06:10.208290 kubelet[2951]: E0905 06:06:10.207929 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.208290 kubelet[2951]: W0905 06:06:10.207934 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.208290 kubelet[2951]: E0905 06:06:10.207952 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:06:10.208290 kubelet[2951]: E0905 06:06:10.208044 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.208290 kubelet[2951]: W0905 06:06:10.208048 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.208290 kubelet[2951]: E0905 06:06:10.208053 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:06:10.208290 kubelet[2951]: E0905 06:06:10.208136 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.208290 kubelet[2951]: W0905 06:06:10.208141 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.208290 kubelet[2951]: E0905 06:06:10.208145 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:06:10.208571 kubelet[2951]: E0905 06:06:10.208254 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.208571 kubelet[2951]: W0905 06:06:10.208271 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.208571 kubelet[2951]: E0905 06:06:10.208279 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:06:10.208571 kubelet[2951]: E0905 06:06:10.208374 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.208571 kubelet[2951]: W0905 06:06:10.208379 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.208571 kubelet[2951]: E0905 06:06:10.208384 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:06:10.208571 kubelet[2951]: E0905 06:06:10.208469 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.208571 kubelet[2951]: W0905 06:06:10.208474 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.208571 kubelet[2951]: E0905 06:06:10.208479 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:06:10.235001 containerd[1653]: time="2025-09-05T06:06:10.234937259Z" level=info msg="connecting to shim 2d6544f893c76d66a3f9e038f544a9f5fd0deb6694677c70bcd43246b1626caf" address="unix:///run/containerd/s/4574625e4e2e0501fbf203bea626e0bdd9bc550535588f9cb22406f388375f36" namespace=k8s.io protocol=ttrpc version=3 Sep 5 06:06:10.253345 systemd[1]: Started cri-containerd-2d6544f893c76d66a3f9e038f544a9f5fd0deb6694677c70bcd43246b1626caf.scope - libcontainer container 2d6544f893c76d66a3f9e038f544a9f5fd0deb6694677c70bcd43246b1626caf. 
Sep 5 06:06:10.294744 containerd[1653]: time="2025-09-05T06:06:10.294690002Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-86ssv,Uid:d37b046a-12eb-4d02-ad48-e6b0f4dd3235,Namespace:calico-system,Attempt:0,} returns sandbox id \"2d6544f893c76d66a3f9e038f544a9f5fd0deb6694677c70bcd43246b1626caf\"" Sep 5 06:06:10.307649 kubelet[2951]: E0905 06:06:10.307625 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.307776 kubelet[2951]: W0905 06:06:10.307641 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.307776 kubelet[2951]: E0905 06:06:10.307672 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:06:10.308019 kubelet[2951]: E0905 06:06:10.307997 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.308019 kubelet[2951]: W0905 06:06:10.308005 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.308019 kubelet[2951]: E0905 06:06:10.308015 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:06:10.308117 kubelet[2951]: I0905 06:06:10.308058 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3aca1205-261a-414a-bbc5-de4a4cc8f072-registration-dir\") pod \"csi-node-driver-zmpmp\" (UID: \"3aca1205-261a-414a-bbc5-de4a4cc8f072\") " pod="calico-system/csi-node-driver-zmpmp" Sep 5 06:06:10.309262 kubelet[2951]: E0905 06:06:10.308992 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.309262 kubelet[2951]: W0905 06:06:10.309007 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.309262 kubelet[2951]: E0905 06:06:10.309021 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:06:10.309262 kubelet[2951]: I0905 06:06:10.309192 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/3aca1205-261a-414a-bbc5-de4a4cc8f072-varrun\") pod \"csi-node-driver-zmpmp\" (UID: \"3aca1205-261a-414a-bbc5-de4a4cc8f072\") " pod="calico-system/csi-node-driver-zmpmp" Sep 5 06:06:10.309578 kubelet[2951]: E0905 06:06:10.309530 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.309578 kubelet[2951]: W0905 06:06:10.309540 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.309578 kubelet[2951]: E0905 06:06:10.309551 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:06:10.309578 kubelet[2951]: I0905 06:06:10.309563 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3aca1205-261a-414a-bbc5-de4a4cc8f072-socket-dir\") pod \"csi-node-driver-zmpmp\" (UID: \"3aca1205-261a-414a-bbc5-de4a4cc8f072\") " pod="calico-system/csi-node-driver-zmpmp" Sep 5 06:06:10.310820 kubelet[2951]: E0905 06:06:10.310778 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.310820 kubelet[2951]: W0905 06:06:10.310792 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.311081 kubelet[2951]: E0905 06:06:10.311064 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:06:10.311363 kubelet[2951]: E0905 06:06:10.311288 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.311527 kubelet[2951]: W0905 06:06:10.311456 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.311582 kubelet[2951]: I0905 06:06:10.311233 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5srhk\" (UniqueName: \"kubernetes.io/projected/3aca1205-261a-414a-bbc5-de4a4cc8f072-kube-api-access-5srhk\") pod \"csi-node-driver-zmpmp\" (UID: \"3aca1205-261a-414a-bbc5-de4a4cc8f072\") " pod="calico-system/csi-node-driver-zmpmp" Sep 5 06:06:10.311698 kubelet[2951]: E0905 06:06:10.311636 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:06:10.311907 kubelet[2951]: E0905 06:06:10.311859 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.311907 kubelet[2951]: W0905 06:06:10.311870 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.311907 kubelet[2951]: E0905 06:06:10.311898 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:06:10.312283 kubelet[2951]: E0905 06:06:10.312273 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.312368 kubelet[2951]: W0905 06:06:10.312358 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.312460 kubelet[2951]: E0905 06:06:10.312443 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:06:10.312635 kubelet[2951]: E0905 06:06:10.312627 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.312732 kubelet[2951]: W0905 06:06:10.312684 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.312732 kubelet[2951]: E0905 06:06:10.312708 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:06:10.312921 kubelet[2951]: E0905 06:06:10.312913 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.313016 kubelet[2951]: W0905 06:06:10.312965 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.313016 kubelet[2951]: E0905 06:06:10.312990 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:06:10.313277 kubelet[2951]: E0905 06:06:10.313215 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.313277 kubelet[2951]: W0905 06:06:10.313222 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.313277 kubelet[2951]: E0905 06:06:10.313243 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:06:10.313547 kubelet[2951]: E0905 06:06:10.313494 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.313547 kubelet[2951]: W0905 06:06:10.313504 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.313547 kubelet[2951]: E0905 06:06:10.313518 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:06:10.313707 kubelet[2951]: E0905 06:06:10.313694 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.313707 kubelet[2951]: W0905 06:06:10.313704 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.313802 kubelet[2951]: E0905 06:06:10.313718 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:06:10.313871 kubelet[2951]: E0905 06:06:10.313858 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.313871 kubelet[2951]: W0905 06:06:10.313865 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.313871 kubelet[2951]: E0905 06:06:10.313874 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:06:10.314027 kubelet[2951]: E0905 06:06:10.313966 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.314027 kubelet[2951]: W0905 06:06:10.313970 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.314027 kubelet[2951]: E0905 06:06:10.313976 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:06:10.314204 kubelet[2951]: E0905 06:06:10.314059 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.314204 kubelet[2951]: W0905 06:06:10.314064 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.314204 kubelet[2951]: E0905 06:06:10.314068 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:06:10.314204 kubelet[2951]: E0905 06:06:10.314153 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.314204 kubelet[2951]: W0905 06:06:10.314157 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.314204 kubelet[2951]: E0905 06:06:10.314162 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:06:10.412300 kubelet[2951]: E0905 06:06:10.412223 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.412300 kubelet[2951]: W0905 06:06:10.412235 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.412300 kubelet[2951]: E0905 06:06:10.412246 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:06:10.797458 kubelet[2951]: E0905 06:06:10.412402 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.797458 kubelet[2951]: W0905 06:06:10.412407 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.797458 kubelet[2951]: E0905 06:06:10.412417 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:06:10.797458 kubelet[2951]: E0905 06:06:10.412507 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.797458 kubelet[2951]: W0905 06:06:10.412512 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.797458 kubelet[2951]: E0905 06:06:10.412518 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:06:10.797458 kubelet[2951]: E0905 06:06:10.412624 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.797458 kubelet[2951]: W0905 06:06:10.412631 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.797458 kubelet[2951]: E0905 06:06:10.412638 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:06:10.797458 kubelet[2951]: E0905 06:06:10.412739 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.797633 kubelet[2951]: W0905 06:06:10.412744 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.797633 kubelet[2951]: E0905 06:06:10.412750 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:06:10.797633 kubelet[2951]: E0905 06:06:10.412827 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.797633 kubelet[2951]: W0905 06:06:10.412832 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.797633 kubelet[2951]: E0905 06:06:10.412838 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:06:10.797633 kubelet[2951]: E0905 06:06:10.413032 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.797633 kubelet[2951]: W0905 06:06:10.413037 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.797633 kubelet[2951]: E0905 06:06:10.413048 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:06:10.797633 kubelet[2951]: E0905 06:06:10.413154 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:06:10.797633 kubelet[2951]: W0905 06:06:10.413159 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:06:10.797786 kubelet[2951]: E0905 06:06:10.413178 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 5 06:06:10.797786 kubelet[2951]: E0905 06:06:10.413314 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 06:06:10.797786 kubelet[2951]: W0905 06:06:10.413319 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 06:06:10.797786 kubelet[2951]: E0905 06:06:10.413327 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[the same driver-call.go:262 / driver-call.go:149 / plugins.go:695 failure triplet repeats verbatim for the probe attempts timestamped 06:06:10.413434 through 06:06:10.414907]
Sep 5 06:06:10.798665 kubelet[2951]: E0905 06:06:10.798242 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 06:06:10.800813 kubelet[2951]: W0905 06:06:10.799868 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 06:06:10.800813 kubelet[2951]: E0905 06:06:10.799905 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 06:06:10.804618 kubelet[2951]: E0905 06:06:10.804566 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 06:06:10.804618 kubelet[2951]: W0905 06:06:10.804579 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 06:06:10.804618 kubelet[2951]: E0905 06:06:10.804592 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 06:06:11.625916 kubelet[2951]: E0905 06:06:11.625877 2951 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zmpmp" podUID="3aca1205-261a-414a-bbc5-de4a4cc8f072"
Sep 5 06:06:12.179317 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1910042241.mount: Deactivated successfully.
Sep 5 06:06:13.388832 containerd[1653]: time="2025-09-05T06:06:13.388666576Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 06:06:13.394298 containerd[1653]: time="2025-09-05T06:06:13.394101609Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389"
Sep 5 06:06:13.406098 containerd[1653]: time="2025-09-05T06:06:13.406067965Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 06:06:13.413823 containerd[1653]: time="2025-09-05T06:06:13.413341767Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 06:06:13.413823 containerd[1653]: time="2025-09-05T06:06:13.413707023Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 3.585271391s"
Sep 5 06:06:13.413823 containerd[1653]: time="2025-09-05T06:06:13.413721907Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\""
Sep 5 06:06:13.414590 containerd[1653]: time="2025-09-05T06:06:13.414575958Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\""
Sep 5 06:06:13.434212 containerd[1653]: time="2025-09-05T06:06:13.434001018Z" level=info msg="CreateContainer within sandbox \"96ca1e0373105bd036e16b1bdacdc2860796bc49d8005e64da5e42a1f0c660c0\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Sep 5 06:06:13.626049 kubelet[2951]: E0905 06:06:13.626005 2951 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zmpmp" podUID="3aca1205-261a-414a-bbc5-de4a4cc8f072"
Sep 5 06:06:13.993760 containerd[1653]: time="2025-09-05T06:06:13.993728231Z" level=info msg="Container f472fad7e53a38b94af315d887ee2bea4e6d5f37744d0b60caed7a7ccbc06e61: CDI devices from CRI Config.CDIDevices: []"
Sep 5 06:06:14.094209 containerd[1653]: time="2025-09-05T06:06:14.094126798Z" level=info msg="CreateContainer within sandbox \"96ca1e0373105bd036e16b1bdacdc2860796bc49d8005e64da5e42a1f0c660c0\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"f472fad7e53a38b94af315d887ee2bea4e6d5f37744d0b60caed7a7ccbc06e61\""
Sep 5 06:06:14.095014 containerd[1653]: time="2025-09-05T06:06:14.094818456Z" level=info msg="StartContainer for \"f472fad7e53a38b94af315d887ee2bea4e6d5f37744d0b60caed7a7ccbc06e61\""
Sep 5 06:06:14.107971 containerd[1653]: time="2025-09-05T06:06:14.097052722Z" level=info msg="connecting to shim f472fad7e53a38b94af315d887ee2bea4e6d5f37744d0b60caed7a7ccbc06e61" address="unix:///run/containerd/s/d7ca48f2f6badcc74ba1c3ee6d5540f134868ba2ab69f4c8e7960f3afb61c9d6" protocol=ttrpc version=3
Sep 5 06:06:14.128281 systemd[1]: Started cri-containerd-f472fad7e53a38b94af315d887ee2bea4e6d5f37744d0b60caed7a7ccbc06e61.scope - libcontainer container f472fad7e53a38b94af315d887ee2bea4e6d5f37744d0b60caed7a7ccbc06e61.
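The CreateContainer entries above embed the 64-hex-digit container id inside a quoted journal message. As a small sketch (not an official containerd tool), the id can be pulled out of such a line with POSIX sed; the sample text is taken from the "returns container id" entry logged above, with the unrelated parts of the message elided:

```shell
#!/bin/sh
# Sketch: extract the 64-hex container id from a containerd
# "returns container id" journal entry (sample text from the log above).
line='msg="CreateContainer within sandbox ... returns container id \"f472fad7e53a38b94af315d887ee2bea4e6d5f37744d0b60caed7a7ccbc06e61\""'
id=$(printf '%s\n' "$line" \
  | sed -n 's/.*returns container id [^0-9a-f]*\([0-9a-f]\{64\}\).*/\1/p')
echo "$id"
```

The same id then reappears in the shim socket path and in the systemd scope unit name, which is how the three subsystems' entries can be correlated.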
Sep 5 06:06:14.201075 containerd[1653]: time="2025-09-05T06:06:14.201047489Z" level=info msg="StartContainer for \"f472fad7e53a38b94af315d887ee2bea4e6d5f37744d0b60caed7a7ccbc06e61\" returns successfully"
Sep 5 06:06:14.736067 kubelet[2951]: E0905 06:06:14.736034 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 06:06:14.736067 kubelet[2951]: W0905 06:06:14.736057 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 06:06:14.736471 kubelet[2951]: E0905 06:06:14.736105 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[the same driver-call.go:262 / driver-call.go:149 / plugins.go:695 failure triplet repeats verbatim for the probe attempts timestamped 06:06:14.736264 through 06:06:14.746162]
Sep 5 06:06:14.774641 kubelet[2951]: E0905 06:06:14.758775 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 06:06:14.774641 kubelet[2951]: W0905 06:06:14.758794 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 06:06:14.774641 kubelet[2951]: E0905 06:06:14.758814 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 06:06:15.626193 kubelet[2951]: E0905 06:06:15.626144 2951 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zmpmp" podUID="3aca1205-261a-414a-bbc5-de4a4cc8f072"
Sep 5 06:06:15.636140 containerd[1653]: time="2025-09-05T06:06:15.636105668Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 06:06:15.648994 containerd[1653]: time="2025-09-05T06:06:15.648804846Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660"
Sep 5 06:06:15.664127 containerd[1653]: time="2025-09-05T06:06:15.664092175Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 06:06:15.678857 containerd[1653]: time="2025-09-05T06:06:15.678807267Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 06:06:15.679308 containerd[1653]: time="2025-09-05T06:06:15.679082584Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 2.264489955s"
Sep 5 06:06:15.679308 containerd[1653]: time="2025-09-05T06:06:15.679102750Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\""
Sep 5 06:06:15.681188 containerd[1653]: time="2025-09-05T06:06:15.680396460Z" level=info msg="CreateContainer within sandbox \"2d6544f893c76d66a3f9e038f544a9f5fd0deb6694677c70bcd43246b1626caf\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Sep 5 06:06:15.694801 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount444303652.mount: Deactivated successfully.
Sep 5 06:06:15.697318 containerd[1653]: time="2025-09-05T06:06:15.697292286Z" level=info msg="Container bcf8b8e5132060c53da597fca74ff4a5e8b28951b0505c92b20d2798e65b6bed: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:06:15.702595 containerd[1653]: time="2025-09-05T06:06:15.702566411Z" level=info msg="CreateContainer within sandbox \"2d6544f893c76d66a3f9e038f544a9f5fd0deb6694677c70bcd43246b1626caf\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"bcf8b8e5132060c53da597fca74ff4a5e8b28951b0505c92b20d2798e65b6bed\"" Sep 5 06:06:15.703058 containerd[1653]: time="2025-09-05T06:06:15.703040990Z" level=info msg="StartContainer for \"bcf8b8e5132060c53da597fca74ff4a5e8b28951b0505c92b20d2798e65b6bed\"" Sep 5 06:06:15.705280 containerd[1653]: time="2025-09-05T06:06:15.705248011Z" level=info msg="connecting to shim bcf8b8e5132060c53da597fca74ff4a5e8b28951b0505c92b20d2798e65b6bed" address="unix:///run/containerd/s/4574625e4e2e0501fbf203bea626e0bdd9bc550535588f9cb22406f388375f36" protocol=ttrpc version=3 Sep 5 06:06:15.720310 kubelet[2951]: I0905 06:06:15.720255 2951 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 5 06:06:15.733418 systemd[1]: Started cri-containerd-bcf8b8e5132060c53da597fca74ff4a5e8b28951b0505c92b20d2798e65b6bed.scope - libcontainer container bcf8b8e5132060c53da597fca74ff4a5e8b28951b0505c92b20d2798e65b6bed. 
Sep 5 06:06:15.745988 kubelet[2951]: E0905 06:06:15.745914 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 06:06:15.745988 kubelet[2951]: W0905 06:06:15.745935 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 06:06:15.745988 kubelet[2951]: E0905 06:06:15.745953 2951 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 06:06:15.775463 containerd[1653]: time="2025-09-05T06:06:15.775442442Z" level=info msg="StartContainer for \"bcf8b8e5132060c53da597fca74ff4a5e8b28951b0505c92b20d2798e65b6bed\" returns successfully"
Sep 5 06:06:15.778000 systemd[1]: cri-containerd-bcf8b8e5132060c53da597fca74ff4a5e8b28951b0505c92b20d2798e65b6bed.scope: Deactivated successfully.
Sep 5 06:06:15.786998 containerd[1653]: time="2025-09-05T06:06:15.786908775Z" level=info msg="received exit event container_id:\"bcf8b8e5132060c53da597fca74ff4a5e8b28951b0505c92b20d2798e65b6bed\" id:\"bcf8b8e5132060c53da597fca74ff4a5e8b28951b0505c92b20d2798e65b6bed\" pid:3613 exited_at:{seconds:1757052375 nanos:780865643}"
Sep 5 06:06:15.808637 containerd[1653]: time="2025-09-05T06:06:15.808609613Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bcf8b8e5132060c53da597fca74ff4a5e8b28951b0505c92b20d2798e65b6bed\" id:\"bcf8b8e5132060c53da597fca74ff4a5e8b28951b0505c92b20d2798e65b6bed\" pid:3613 exited_at:{seconds:1757052375 nanos:780865643}"
Sep 5 06:06:15.813949 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-bcf8b8e5132060c53da597fca74ff4a5e8b28951b0505c92b20d2798e65b6bed-rootfs.mount: Deactivated successfully.
Sep 5 06:06:16.721442 containerd[1653]: time="2025-09-05T06:06:16.721411571Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\""
Sep 5 06:06:16.741184 kubelet[2951]: I0905 06:06:16.740982 2951 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6447455799-pkljt" podStartSLOduration=4.154640211 podStartE2EDuration="7.740965313s" podCreationTimestamp="2025-09-05 06:06:09 +0000 UTC" firstStartedPulling="2025-09-05 06:06:09.827930769 +0000 UTC m=+21.341874340" lastFinishedPulling="2025-09-05 06:06:13.414255873 +0000 UTC m=+24.928199442" observedRunningTime="2025-09-05 06:06:14.747620131 +0000 UTC m=+26.261563712" watchObservedRunningTime="2025-09-05 06:06:16.740965313 +0000 UTC m=+28.254908886"
Sep 5 06:06:17.626177 kubelet[2951]: E0905 06:06:17.626002 2951 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zmpmp" podUID="3aca1205-261a-414a-bbc5-de4a4cc8f072"
Sep 5 06:06:19.625705 kubelet[2951]: E0905 06:06:19.625458 2951 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zmpmp" podUID="3aca1205-261a-414a-bbc5-de4a4cc8f072"
Sep 5 06:06:20.294441 containerd[1653]: time="2025-09-05T06:06:20.294254318Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 06:06:20.306073 containerd[1653]: time="2025-09-05T06:06:20.306046959Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613"
Sep 5 06:06:20.317010 containerd[1653]: time="2025-09-05T06:06:20.316951450Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 06:06:20.325710 containerd[1653]: time="2025-09-05T06:06:20.325673786Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 06:06:20.326526 containerd[1653]: time="2025-09-05T06:06:20.326508227Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 3.604991444s"
Sep 5 06:06:20.326571 containerd[1653]: time="2025-09-05T06:06:20.326530060Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\""
Sep 5 06:06:20.328355 containerd[1653]: time="2025-09-05T06:06:20.328306718Z" level=info msg="CreateContainer within sandbox \"2d6544f893c76d66a3f9e038f544a9f5fd0deb6694677c70bcd43246b1626caf\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Sep 5 06:06:20.647086 containerd[1653]: time="2025-09-05T06:06:20.646296159Z" level=info msg="Container 1b20649b83396476746bba7a2c7aab23126832bc5668d1ee604defc5ff592dfd: CDI devices from CRI Config.CDIDevices: []"
Sep 5 06:06:20.757476 containerd[1653]: time="2025-09-05T06:06:20.757451908Z" level=info msg="CreateContainer within sandbox \"2d6544f893c76d66a3f9e038f544a9f5fd0deb6694677c70bcd43246b1626caf\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"1b20649b83396476746bba7a2c7aab23126832bc5668d1ee604defc5ff592dfd\""
Sep 5 06:06:20.758271 containerd[1653]: time="2025-09-05T06:06:20.758242824Z" level=info msg="StartContainer for \"1b20649b83396476746bba7a2c7aab23126832bc5668d1ee604defc5ff592dfd\""
Sep 5 06:06:20.759023 containerd[1653]: time="2025-09-05T06:06:20.759008758Z" level=info msg="connecting to shim 1b20649b83396476746bba7a2c7aab23126832bc5668d1ee604defc5ff592dfd" address="unix:///run/containerd/s/4574625e4e2e0501fbf203bea626e0bdd9bc550535588f9cb22406f388375f36" protocol=ttrpc version=3
Sep 5 06:06:20.781338 systemd[1]: Started cri-containerd-1b20649b83396476746bba7a2c7aab23126832bc5668d1ee604defc5ff592dfd.scope - libcontainer container 1b20649b83396476746bba7a2c7aab23126832bc5668d1ee604defc5ff592dfd.
Sep 5 06:06:20.821840 containerd[1653]: time="2025-09-05T06:06:20.821744159Z" level=info msg="StartContainer for \"1b20649b83396476746bba7a2c7aab23126832bc5668d1ee604defc5ff592dfd\" returns successfully"
Sep 5 06:06:21.625527 kubelet[2951]: E0905 06:06:21.625479 2951 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zmpmp" podUID="3aca1205-261a-414a-bbc5-de4a4cc8f072"
Sep 5 06:06:23.625913 kubelet[2951]: E0905 06:06:23.625634 2951 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zmpmp" podUID="3aca1205-261a-414a-bbc5-de4a4cc8f072"
Sep 5 06:06:24.137800 systemd[1]: cri-containerd-1b20649b83396476746bba7a2c7aab23126832bc5668d1ee604defc5ff592dfd.scope: Deactivated successfully.
Sep 5 06:06:24.137997 systemd[1]: cri-containerd-1b20649b83396476746bba7a2c7aab23126832bc5668d1ee604defc5ff592dfd.scope: Consumed 352ms CPU time, 166.7M memory peak, 2.3M read from disk, 171.3M written to disk.
Sep 5 06:06:24.205962 kubelet[2951]: I0905 06:06:24.205897 2951 kubelet_node_status.go:501] "Fast updating node status as it just became ready"
Sep 5 06:06:24.298844 containerd[1653]: time="2025-09-05T06:06:24.298816091Z" level=info msg="received exit event container_id:\"1b20649b83396476746bba7a2c7aab23126832bc5668d1ee604defc5ff592dfd\" id:\"1b20649b83396476746bba7a2c7aab23126832bc5668d1ee604defc5ff592dfd\" pid:3703 exited_at:{seconds:1757052384 nanos:298673327}"
Sep 5 06:06:24.299091 containerd[1653]: time="2025-09-05T06:06:24.299075954Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1b20649b83396476746bba7a2c7aab23126832bc5668d1ee604defc5ff592dfd\" id:\"1b20649b83396476746bba7a2c7aab23126832bc5668d1ee604defc5ff592dfd\" pid:3703 exited_at:{seconds:1757052384 nanos:298673327}"
Sep 5 06:06:24.356625 systemd[1]: Created slice kubepods-burstable-podd6254f86_4e8b_4f74_af07_6b68e974d028.slice - libcontainer container kubepods-burstable-podd6254f86_4e8b_4f74_af07_6b68e974d028.slice.
Sep 5 06:06:24.365943 systemd[1]: Created slice kubepods-besteffort-pod0e471d34_64ab_423f_a035_550b2b064a5c.slice - libcontainer container kubepods-besteffort-pod0e471d34_64ab_423f_a035_550b2b064a5c.slice.
Sep 5 06:06:24.373911 systemd[1]: Created slice kubepods-besteffort-pod6cf24baf_cb0a_415b_9c6a_8c1d4a98d753.slice - libcontainer container kubepods-besteffort-pod6cf24baf_cb0a_415b_9c6a_8c1d4a98d753.slice.
Sep 5 06:06:24.380054 systemd[1]: Created slice kubepods-besteffort-pod098f5efb_c722_4a99_8289_c0a4a7d28760.slice - libcontainer container kubepods-besteffort-pod098f5efb_c722_4a99_8289_c0a4a7d28760.slice.
Sep 5 06:06:24.387740 systemd[1]: Created slice kubepods-besteffort-pod4946ba55_c57c_47d1_a863_4c7bd324702d.slice - libcontainer container kubepods-besteffort-pod4946ba55_c57c_47d1_a863_4c7bd324702d.slice.
Sep 5 06:06:24.394159 systemd[1]: Created slice kubepods-besteffort-poddb55fca1_860d_422e_9c2e_072372ad5501.slice - libcontainer container kubepods-besteffort-poddb55fca1_860d_422e_9c2e_072372ad5501.slice.
Sep 5 06:06:24.398275 systemd[1]: Created slice kubepods-burstable-pod3484328b_3fb2_41ea_b562_a71f1ba2782c.slice - libcontainer container kubepods-burstable-pod3484328b_3fb2_41ea_b562_a71f1ba2782c.slice.
Sep 5 06:06:24.405770 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-1b20649b83396476746bba7a2c7aab23126832bc5668d1ee604defc5ff592dfd-rootfs.mount: Deactivated successfully.
Sep 5 06:06:24.441444 kubelet[2951]: I0905 06:06:24.441408 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/098f5efb-c722-4a99-8289-c0a4a7d28760-whisker-ca-bundle\") pod \"whisker-7656979696-qwpvc\" (UID: \"098f5efb-c722-4a99-8289-c0a4a7d28760\") " pod="calico-system/whisker-7656979696-qwpvc"
Sep 5 06:06:24.441444 kubelet[2951]: I0905 06:06:24.441447 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e471d34-64ab-423f-a035-550b2b064a5c-tigera-ca-bundle\") pod \"calico-kube-controllers-5cc6595ff6-qlwqn\" (UID: \"0e471d34-64ab-423f-a035-550b2b064a5c\") " pod="calico-system/calico-kube-controllers-5cc6595ff6-qlwqn"
Sep 5 06:06:24.441629 kubelet[2951]: I0905 06:06:24.441471 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/6cf24baf-cb0a-415b-9c6a-8c1d4a98d753-goldmane-key-pair\") pod \"goldmane-54d579b49d-mzgr6\" (UID: \"6cf24baf-cb0a-415b-9c6a-8c1d4a98d753\") " pod="calico-system/goldmane-54d579b49d-mzgr6"
Sep 5 06:06:24.441629 kubelet[2951]: I0905 06:06:24.441484 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d6254f86-4e8b-4f74-af07-6b68e974d028-config-volume\") pod \"coredns-668d6bf9bc-rxtvx\" (UID: \"d6254f86-4e8b-4f74-af07-6b68e974d028\") " pod="kube-system/coredns-668d6bf9bc-rxtvx"
Sep 5 06:06:24.441629 kubelet[2951]: I0905 06:06:24.441497 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9mnk\" (UniqueName: \"kubernetes.io/projected/098f5efb-c722-4a99-8289-c0a4a7d28760-kube-api-access-g9mnk\") pod \"whisker-7656979696-qwpvc\" (UID: \"098f5efb-c722-4a99-8289-c0a4a7d28760\") " pod="calico-system/whisker-7656979696-qwpvc"
Sep 5 06:06:24.441629 kubelet[2951]: I0905 06:06:24.441507 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cf24baf-cb0a-415b-9c6a-8c1d4a98d753-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-mzgr6\" (UID: \"6cf24baf-cb0a-415b-9c6a-8c1d4a98d753\") " pod="calico-system/goldmane-54d579b49d-mzgr6"
Sep 5 06:06:24.441629 kubelet[2951]: I0905 06:06:24.441520 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwnn7\" (UniqueName: \"kubernetes.io/projected/6cf24baf-cb0a-415b-9c6a-8c1d4a98d753-kube-api-access-wwnn7\") pod \"goldmane-54d579b49d-mzgr6\" (UID: \"6cf24baf-cb0a-415b-9c6a-8c1d4a98d753\") " pod="calico-system/goldmane-54d579b49d-mzgr6"
Sep 5 06:06:24.442098 kubelet[2951]: I0905 06:06:24.441531 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/098f5efb-c722-4a99-8289-c0a4a7d28760-whisker-backend-key-pair\") pod \"whisker-7656979696-qwpvc\" (UID: \"098f5efb-c722-4a99-8289-c0a4a7d28760\") " pod="calico-system/whisker-7656979696-qwpvc"
Sep 5 06:06:24.442098 kubelet[2951]: I0905 06:06:24.441547 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmq79\" (UniqueName: \"kubernetes.io/projected/0e471d34-64ab-423f-a035-550b2b064a5c-kube-api-access-pmq79\") pod \"calico-kube-controllers-5cc6595ff6-qlwqn\" (UID: \"0e471d34-64ab-423f-a035-550b2b064a5c\") " pod="calico-system/calico-kube-controllers-5cc6595ff6-qlwqn"
Sep 5 06:06:24.442098 kubelet[2951]: I0905 06:06:24.441560 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4dzq\" (UniqueName: \"kubernetes.io/projected/d6254f86-4e8b-4f74-af07-6b68e974d028-kube-api-access-z4dzq\") pod \"coredns-668d6bf9bc-rxtvx\" (UID: \"d6254f86-4e8b-4f74-af07-6b68e974d028\") " pod="kube-system/coredns-668d6bf9bc-rxtvx"
Sep 5 06:06:24.442098 kubelet[2951]: I0905 06:06:24.441573 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cf24baf-cb0a-415b-9c6a-8c1d4a98d753-config\") pod \"goldmane-54d579b49d-mzgr6\" (UID: \"6cf24baf-cb0a-415b-9c6a-8c1d4a98d753\") " pod="calico-system/goldmane-54d579b49d-mzgr6"
Sep 5 06:06:24.541922 kubelet[2951]: I0905 06:06:24.541869 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggctx\" (UniqueName: \"kubernetes.io/projected/4946ba55-c57c-47d1-a863-4c7bd324702d-kube-api-access-ggctx\") pod \"calico-apiserver-7d9487c8bd-xllpf\" (UID: \"4946ba55-c57c-47d1-a863-4c7bd324702d\") " pod="calico-apiserver/calico-apiserver-7d9487c8bd-xllpf"
Sep 5 06:06:24.541922 kubelet[2951]: I0905 06:06:24.541915 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/db55fca1-860d-422e-9c2e-072372ad5501-calico-apiserver-certs\") pod \"calico-apiserver-7d9487c8bd-m6f9c\" (UID: \"db55fca1-860d-422e-9c2e-072372ad5501\") " pod="calico-apiserver/calico-apiserver-7d9487c8bd-m6f9c"
Sep 5 06:06:24.542046 kubelet[2951]: I0905 06:06:24.541946 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj9lf\" (UniqueName: \"kubernetes.io/projected/3484328b-3fb2-41ea-b562-a71f1ba2782c-kube-api-access-qj9lf\") pod \"coredns-668d6bf9bc-k95rr\" (UID: \"3484328b-3fb2-41ea-b562-a71f1ba2782c\") " pod="kube-system/coredns-668d6bf9bc-k95rr"
Sep 5 06:06:24.542046 kubelet[2951]: I0905 06:06:24.541965 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msg74\" (UniqueName: \"kubernetes.io/projected/db55fca1-860d-422e-9c2e-072372ad5501-kube-api-access-msg74\") pod \"calico-apiserver-7d9487c8bd-m6f9c\" (UID: \"db55fca1-860d-422e-9c2e-072372ad5501\") " pod="calico-apiserver/calico-apiserver-7d9487c8bd-m6f9c"
Sep 5 06:06:24.542046 kubelet[2951]: I0905 06:06:24.541980 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/4946ba55-c57c-47d1-a863-4c7bd324702d-calico-apiserver-certs\") pod \"calico-apiserver-7d9487c8bd-xllpf\" (UID: \"4946ba55-c57c-47d1-a863-4c7bd324702d\") " pod="calico-apiserver/calico-apiserver-7d9487c8bd-xllpf"
Sep 5 06:06:24.542046 kubelet[2951]: I0905 06:06:24.541997 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3484328b-3fb2-41ea-b562-a71f1ba2782c-config-volume\") pod \"coredns-668d6bf9bc-k95rr\" (UID: \"3484328b-3fb2-41ea-b562-a71f1ba2782c\") " pod="kube-system/coredns-668d6bf9bc-k95rr"
Sep 5 06:06:24.666608 containerd[1653]: time="2025-09-05T06:06:24.665222777Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-rxtvx,Uid:d6254f86-4e8b-4f74-af07-6b68e974d028,Namespace:kube-system,Attempt:0,}"
Sep 5 06:06:24.669498 containerd[1653]: time="2025-09-05T06:06:24.669482834Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5cc6595ff6-qlwqn,Uid:0e471d34-64ab-423f-a035-550b2b064a5c,Namespace:calico-system,Attempt:0,}"
Sep 5 06:06:24.681979 containerd[1653]: time="2025-09-05T06:06:24.681959191Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-mzgr6,Uid:6cf24baf-cb0a-415b-9c6a-8c1d4a98d753,Namespace:calico-system,Attempt:0,}"
Sep 5 06:06:24.686485 containerd[1653]: time="2025-09-05T06:06:24.686463268Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7656979696-qwpvc,Uid:098f5efb-c722-4a99-8289-c0a4a7d28760,Namespace:calico-system,Attempt:0,}"
Sep 5 06:06:24.707044 containerd[1653]: time="2025-09-05T06:06:24.707011170Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-k95rr,Uid:3484328b-3fb2-41ea-b562-a71f1ba2782c,Namespace:kube-system,Attempt:0,}"
Sep 5 06:06:24.707696 containerd[1653]: time="2025-09-05T06:06:24.707679779Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d9487c8bd-xllpf,Uid:4946ba55-c57c-47d1-a863-4c7bd324702d,Namespace:calico-apiserver,Attempt:0,}"
Sep 5 06:06:24.707780 containerd[1653]: time="2025-09-05T06:06:24.707755033Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d9487c8bd-m6f9c,Uid:db55fca1-860d-422e-9c2e-072372ad5501,Namespace:calico-apiserver,Attempt:0,}"
Sep 5 06:06:24.787856 containerd[1653]: time="2025-09-05T06:06:24.787823868Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\""
Sep 5 06:06:25.137324 containerd[1653]: time="2025-09-05T06:06:25.137287813Z" level=error msg="Failed to destroy network for sandbox \"35365906b7d3b60cafa3578c825064b0b891cc208b58c933c4044ef4c5cdce42\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 5 06:06:25.138668 containerd[1653]: time="2025-09-05T06:06:25.138620092Z" level=error msg="Failed to destroy network for sandbox \"ad55259420c9d8a2c29f60b9af0e04bd04d13340d80f70b6ebe55ad8b4c7b8ef\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 5 06:06:25.142687 containerd[1653]: time="2025-09-05T06:06:25.138937939Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-mzgr6,Uid:6cf24baf-cb0a-415b-9c6a-8c1d4a98d753,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"35365906b7d3b60cafa3578c825064b0b891cc208b58c933c4044ef4c5cdce42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 5 06:06:25.142687 containerd[1653]: time="2025-09-05T06:06:25.141741033Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7656979696-qwpvc,Uid:098f5efb-c722-4a99-8289-c0a4a7d28760,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad55259420c9d8a2c29f60b9af0e04bd04d13340d80f70b6ebe55ad8b4c7b8ef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 5 06:06:25.154184 containerd[1653]: time="2025-09-05T06:06:25.154117364Z" level=error msg="Failed to destroy network for sandbox \"dccdbf857ed70b52636d00b8f763a3038db66303856714552ea0cf03d00139bd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 5 06:06:25.155196 containerd[1653]: time="2025-09-05T06:06:25.154554706Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d9487c8bd-xllpf,Uid:4946ba55-c57c-47d1-a863-4c7bd324702d,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"dccdbf857ed70b52636d00b8f763a3038db66303856714552ea0cf03d00139bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 5 06:06:25.157312 containerd[1653]: time="2025-09-05T06:06:25.157283664Z" level=error msg="Failed to destroy network for sandbox \"a41d7db4f6b77b0c39c0ea8c991d170142d70735740ef20110a26b822e6f638f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 5 06:06:25.157549 containerd[1653]: time="2025-09-05T06:06:25.157535587Z" level=error msg="Failed to destroy network for sandbox \"f2663d0f1625065578efa567a85ec7ea376568a92f307f831587b0998974b145\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 5 06:06:25.157985 containerd[1653]: time="2025-09-05T06:06:25.157905716Z" level=error msg="Failed to destroy network for sandbox \"92aa612eb4416bc7fcc9c2fbdfb9446e68ae861943b9ba7f10e7efeb6c1579d3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 5 06:06:25.158193 containerd[1653]: time="2025-09-05T06:06:25.158141849Z" level=error msg="Failed to destroy network for sandbox \"a6f36e8118db30ceec4c3657a7b15ae8f6d650c91cdd8bfd5125f4c731a5c1c8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 5 06:06:25.159129 containerd[1653]: time="2025-09-05T06:06:25.159112358Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-k95rr,Uid:3484328b-3fb2-41ea-b562-a71f1ba2782c,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f2663d0f1625065578efa567a85ec7ea376568a92f307f831587b0998974b145\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 5 06:06:25.160439 containerd[1653]: time="2025-09-05T06:06:25.160417934Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d9487c8bd-m6f9c,Uid:db55fca1-860d-422e-9c2e-072372ad5501,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"92aa612eb4416bc7fcc9c2fbdfb9446e68ae861943b9ba7f10e7efeb6c1579d3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 5 06:06:25.161314 containerd[1653]: time="2025-09-05T06:06:25.161297815Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-rxtvx,Uid:d6254f86-4e8b-4f74-af07-6b68e974d028,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a41d7db4f6b77b0c39c0ea8c991d170142d70735740ef20110a26b822e6f638f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 5 06:06:25.161684 containerd[1653]: time="2025-09-05T06:06:25.161652027Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5cc6595ff6-qlwqn,Uid:0e471d34-64ab-423f-a035-550b2b064a5c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a6f36e8118db30ceec4c3657a7b15ae8f6d650c91cdd8bfd5125f4c731a5c1c8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 5 06:06:25.162151 kubelet[2951]: E0905 06:06:25.162120 2951 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dccdbf857ed70b52636d00b8f763a3038db66303856714552ea0cf03d00139bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 5 06:06:25.162821 kubelet[2951]: E0905 06:06:25.162127 2951 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"35365906b7d3b60cafa3578c825064b0b891cc208b58c933c4044ef4c5cdce42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 5 06:06:25.165938 kubelet[2951]: E0905 06:06:25.165791 2951 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dccdbf857ed70b52636d00b8f763a3038db66303856714552ea0cf03d00139bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7d9487c8bd-xllpf"
Sep 5 06:06:25.165938 kubelet[2951]: E0905 06:06:25.165831 2951 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dccdbf857ed70b52636d00b8f763a3038db66303856714552ea0cf03d00139bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7d9487c8bd-xllpf"
Sep 5 06:06:25.166117 kubelet[2951]: E0905 06:06:25.166086 2951 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"35365906b7d3b60cafa3578c825064b0b891cc208b58c933c4044ef4c5cdce42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-mzgr6"
Sep 5 06:06:25.166117 kubelet[2951]: E0905 06:06:25.166103 2951 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"35365906b7d3b60cafa3578c825064b0b891cc208b58c933c4044ef4c5cdce42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-mzgr6"
Sep 5 06:06:25.172284 kubelet[2951]: E0905 06:06:25.172054 2951 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-mzgr6_calico-system(6cf24baf-cb0a-415b-9c6a-8c1d4a98d753)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-mzgr6_calico-system(6cf24baf-cb0a-415b-9c6a-8c1d4a98d753)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"35365906b7d3b60cafa3578c825064b0b891cc208b58c933c4044ef4c5cdce42\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-mzgr6" podUID="6cf24baf-cb0a-415b-9c6a-8c1d4a98d753"
Sep 5 06:06:25.172640 kubelet[2951]: E0905 06:06:25.162386 2951 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"92aa612eb4416bc7fcc9c2fbdfb9446e68ae861943b9ba7f10e7efeb6c1579d3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 5 06:06:25.172640 kubelet[2951]: E0905 06:06:25.172488 2951 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"92aa612eb4416bc7fcc9c2fbdfb9446e68ae861943b9ba7f10e7efeb6c1579d3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7d9487c8bd-m6f9c"
Sep 5 06:06:25.172640 kubelet[2951]: E0905 06:06:25.172510 2951 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"92aa612eb4416bc7fcc9c2fbdfb9446e68ae861943b9ba7f10e7efeb6c1579d3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7d9487c8bd-m6f9c"
Sep 5 06:06:25.172727 kubelet[2951]: E0905 06:06:25.172540 2951 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7d9487c8bd-m6f9c_calico-apiserver(db55fca1-860d-422e-9c2e-072372ad5501)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7d9487c8bd-m6f9c_calico-apiserver(db55fca1-860d-422e-9c2e-072372ad5501)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"92aa612eb4416bc7fcc9c2fbdfb9446e68ae861943b9ba7f10e7efeb6c1579d3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7d9487c8bd-m6f9c" podUID="db55fca1-860d-422e-9c2e-072372ad5501"
Sep 5 06:06:25.172727 kubelet[2951]: E0905 06:06:25.162374 2951 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f2663d0f1625065578efa567a85ec7ea376568a92f307f831587b0998974b145\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 5 06:06:25.172727 kubelet[2951]: E0905 06:06:25.172568 2951 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f2663d0f1625065578efa567a85ec7ea376568a92f307f831587b0998974b145\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-k95rr"
Sep 5 06:06:25.172831 kubelet[2951]: E0905 06:06:25.172577 2951 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f2663d0f1625065578efa567a85ec7ea376568a92f307f831587b0998974b145\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-k95rr"
Sep 5 06:06:25.172831 kubelet[2951]: E0905 06:06:25.172592 2951 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-k95rr_kube-system(3484328b-3fb2-41ea-b562-a71f1ba2782c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-k95rr_kube-system(3484328b-3fb2-41ea-b562-a71f1ba2782c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f2663d0f1625065578efa567a85ec7ea376568a92f307f831587b0998974b145\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-k95rr" podUID="3484328b-3fb2-41ea-b562-a71f1ba2782c"
Sep 5 06:06:25.172831 kubelet[2951]: E0905 06:06:25.172610 2951 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7d9487c8bd-xllpf_calico-apiserver(4946ba55-c57c-47d1-a863-4c7bd324702d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7d9487c8bd-xllpf_calico-apiserver(4946ba55-c57c-47d1-a863-4c7bd324702d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"dccdbf857ed70b52636d00b8f763a3038db66303856714552ea0cf03d00139bd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7d9487c8bd-xllpf" podUID="4946ba55-c57c-47d1-a863-4c7bd324702d"
Sep 5 06:06:25.173818 kubelet[2951]: E0905 06:06:25.162397 2951 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a41d7db4f6b77b0c39c0ea8c991d170142d70735740ef20110a26b822e6f638f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 5 06:06:25.173818 kubelet[2951]: E0905 06:06:25.173614 2951 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a41d7db4f6b77b0c39c0ea8c991d170142d70735740ef20110a26b822e6f638f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-rxtvx"
Sep 5 06:06:25.173818 kubelet[2951]: E0905 06:06:25.173635 2951 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a41d7db4f6b77b0c39c0ea8c991d170142d70735740ef20110a26b822e6f638f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-rxtvx"
Sep 5 06:06:25.173939 kubelet[2951]: E0905 06:06:25.173668 2951 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-rxtvx_kube-system(d6254f86-4e8b-4f74-af07-6b68e974d028)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-rxtvx_kube-system(d6254f86-4e8b-4f74-af07-6b68e974d028)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a41d7db4f6b77b0c39c0ea8c991d170142d70735740ef20110a26b822e6f638f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-rxtvx" podUID="d6254f86-4e8b-4f74-af07-6b68e974d028"
Sep 5 06:06:25.173939 kubelet[2951]: E0905 06:06:25.162358 2951 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a6f36e8118db30ceec4c3657a7b15ae8f6d650c91cdd8bfd5125f4c731a5c1c8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 5 06:06:25.173939 kubelet[2951]: E0905 06:06:25.173703 2951 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a6f36e8118db30ceec4c3657a7b15ae8f6d650c91cdd8bfd5125f4c731a5c1c8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5cc6595ff6-qlwqn"
Sep 5 06:06:25.174093 kubelet[2951]: E0905 06:06:25.173711 2951 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a6f36e8118db30ceec4c3657a7b15ae8f6d650c91cdd8bfd5125f4c731a5c1c8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5cc6595ff6-qlwqn"
Sep 5 06:06:25.174093 kubelet[2951]: E0905 06:06:25.173731 2951 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5cc6595ff6-qlwqn_calico-system(0e471d34-64ab-423f-a035-550b2b064a5c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5cc6595ff6-qlwqn_calico-system(0e471d34-64ab-423f-a035-550b2b064a5c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a6f36e8118db30ceec4c3657a7b15ae8f6d650c91cdd8bfd5125f4c731a5c1c8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5cc6595ff6-qlwqn" podUID="0e471d34-64ab-423f-a035-550b2b064a5c"
Sep 5 06:06:25.174093 kubelet[2951]: E0905 06:06:25.162429 2951 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad55259420c9d8a2c29f60b9af0e04bd04d13340d80f70b6ebe55ad8b4c7b8ef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 5 06:06:25.174231 kubelet[2951]: E0905 06:06:25.173764 2951 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad55259420c9d8a2c29f60b9af0e04bd04d13340d80f70b6ebe55ad8b4c7b8ef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7656979696-qwpvc"
Sep 5 06:06:25.174231 kubelet[2951]: E0905 06:06:25.173777 2951 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad55259420c9d8a2c29f60b9af0e04bd04d13340d80f70b6ebe55ad8b4c7b8ef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7656979696-qwpvc"
Sep 5 06:06:25.174231 kubelet[2951]: E0905 06:06:25.173793 2951 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-7656979696-qwpvc_calico-system(098f5efb-c722-4a99-8289-c0a4a7d28760)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-7656979696-qwpvc_calico-system(098f5efb-c722-4a99-8289-c0a4a7d28760)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ad55259420c9d8a2c29f60b9af0e04bd04d13340d80f70b6ebe55ad8b4c7b8ef\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-7656979696-qwpvc" podUID="098f5efb-c722-4a99-8289-c0a4a7d28760"
Sep 5 06:06:25.548319 kubelet[2951]: I0905 06:06:25.548125 2951 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 5 06:06:25.634446 systemd[1]: Created slice kubepods-besteffort-pod3aca1205_261a_414a_bbc5_de4a4cc8f072.slice - libcontainer container kubepods-besteffort-pod3aca1205_261a_414a_bbc5_de4a4cc8f072.slice.
Sep 5 06:06:25.645975 containerd[1653]: time="2025-09-05T06:06:25.645926719Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zmpmp,Uid:3aca1205-261a-414a-bbc5-de4a4cc8f072,Namespace:calico-system,Attempt:0,}"
Sep 5 06:06:25.686952 containerd[1653]: time="2025-09-05T06:06:25.686918324Z" level=error msg="Failed to destroy network for sandbox \"9d1b60a86aa84d536cf7f2cd1d05a80ed27c71f2b94aa0de23ae10aa0892945a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 5 06:06:25.688615 systemd[1]: run-netns-cni\x2d806267ad\x2d614e\x2d1378\x2d50c6\x2d813618bbefab.mount: Deactivated successfully.
Sep 5 06:06:25.689921 containerd[1653]: time="2025-09-05T06:06:25.689890504Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zmpmp,Uid:3aca1205-261a-414a-bbc5-de4a4cc8f072,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9d1b60a86aa84d536cf7f2cd1d05a80ed27c71f2b94aa0de23ae10aa0892945a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 06:06:25.690232 kubelet[2951]: E0905 06:06:25.690041 2951 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9d1b60a86aa84d536cf7f2cd1d05a80ed27c71f2b94aa0de23ae10aa0892945a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 06:06:25.690232 kubelet[2951]: E0905 06:06:25.690081 2951 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9d1b60a86aa84d536cf7f2cd1d05a80ed27c71f2b94aa0de23ae10aa0892945a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-zmpmp" Sep 5 06:06:25.690232 kubelet[2951]: E0905 06:06:25.690098 2951 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9d1b60a86aa84d536cf7f2cd1d05a80ed27c71f2b94aa0de23ae10aa0892945a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-zmpmp" Sep 5 
06:06:25.690302 kubelet[2951]: E0905 06:06:25.690130 2951 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-zmpmp_calico-system(3aca1205-261a-414a-bbc5-de4a4cc8f072)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-zmpmp_calico-system(3aca1205-261a-414a-bbc5-de4a4cc8f072)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9d1b60a86aa84d536cf7f2cd1d05a80ed27c71f2b94aa0de23ae10aa0892945a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-zmpmp" podUID="3aca1205-261a-414a-bbc5-de4a4cc8f072" Sep 5 06:06:29.759647 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2479014998.mount: Deactivated successfully. Sep 5 06:06:30.348657 containerd[1653]: time="2025-09-05T06:06:30.348621363Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:06:30.363505 containerd[1653]: time="2025-09-05T06:06:30.360365756Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 5 06:06:30.399816 containerd[1653]: time="2025-09-05T06:06:30.399760095Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:06:30.429910 containerd[1653]: time="2025-09-05T06:06:30.429849767Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:06:30.432245 containerd[1653]: time="2025-09-05T06:06:30.432216749Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 5.642465082s" Sep 5 06:06:30.442032 containerd[1653]: time="2025-09-05T06:06:30.432248010Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 5 06:06:30.502693 containerd[1653]: time="2025-09-05T06:06:30.502664434Z" level=info msg="CreateContainer within sandbox \"2d6544f893c76d66a3f9e038f544a9f5fd0deb6694677c70bcd43246b1626caf\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 5 06:06:30.928381 containerd[1653]: time="2025-09-05T06:06:30.928339222Z" level=info msg="Container e0c9fe66cc2185b34ee04a2381b1ab17ba9c5f8a8a6b8002ec928431171b3458: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:06:30.929036 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1336101429.mount: Deactivated successfully. 
Sep 5 06:06:31.010736 containerd[1653]: time="2025-09-05T06:06:31.010680463Z" level=info msg="CreateContainer within sandbox \"2d6544f893c76d66a3f9e038f544a9f5fd0deb6694677c70bcd43246b1626caf\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"e0c9fe66cc2185b34ee04a2381b1ab17ba9c5f8a8a6b8002ec928431171b3458\"" Sep 5 06:06:31.011654 containerd[1653]: time="2025-09-05T06:06:31.011277110Z" level=info msg="StartContainer for \"e0c9fe66cc2185b34ee04a2381b1ab17ba9c5f8a8a6b8002ec928431171b3458\"" Sep 5 06:06:31.019410 containerd[1653]: time="2025-09-05T06:06:31.019384666Z" level=info msg="connecting to shim e0c9fe66cc2185b34ee04a2381b1ab17ba9c5f8a8a6b8002ec928431171b3458" address="unix:///run/containerd/s/4574625e4e2e0501fbf203bea626e0bdd9bc550535588f9cb22406f388375f36" protocol=ttrpc version=3 Sep 5 06:06:31.206324 systemd[1]: Started cri-containerd-e0c9fe66cc2185b34ee04a2381b1ab17ba9c5f8a8a6b8002ec928431171b3458.scope - libcontainer container e0c9fe66cc2185b34ee04a2381b1ab17ba9c5f8a8a6b8002ec928431171b3458. Sep 5 06:06:31.239979 containerd[1653]: time="2025-09-05T06:06:31.239953705Z" level=info msg="StartContainer for \"e0c9fe66cc2185b34ee04a2381b1ab17ba9c5f8a8a6b8002ec928431171b3458\" returns successfully" Sep 5 06:06:31.622771 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 5 06:06:31.631160 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Sep 5 06:06:32.047493 containerd[1653]: time="2025-09-05T06:06:32.047420402Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e0c9fe66cc2185b34ee04a2381b1ab17ba9c5f8a8a6b8002ec928431171b3458\" id:\"28dacd5d938e8b3bfa3eb26cfbb1b37bd5e56b2f919e23b802a603a2ca7906c6\" pid:4018 exit_status:1 exited_at:{seconds:1757052392 nanos:46062544}" Sep 5 06:06:32.159944 kubelet[2951]: I0905 06:06:32.159554 2951 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-86ssv" podStartSLOduration=3.022527438 podStartE2EDuration="23.159538275s" podCreationTimestamp="2025-09-05 06:06:09 +0000 UTC" firstStartedPulling="2025-09-05 06:06:10.295846871 +0000 UTC m=+21.809790440" lastFinishedPulling="2025-09-05 06:06:30.432857703 +0000 UTC m=+41.946801277" observedRunningTime="2025-09-05 06:06:31.893872093 +0000 UTC m=+43.407815664" watchObservedRunningTime="2025-09-05 06:06:32.159538275 +0000 UTC m=+43.673481854" Sep 5 06:06:32.287624 kubelet[2951]: I0905 06:06:32.287348 2951 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/098f5efb-c722-4a99-8289-c0a4a7d28760-whisker-ca-bundle\") pod \"098f5efb-c722-4a99-8289-c0a4a7d28760\" (UID: \"098f5efb-c722-4a99-8289-c0a4a7d28760\") " Sep 5 06:06:32.287624 kubelet[2951]: I0905 06:06:32.287373 2951 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9mnk\" (UniqueName: \"kubernetes.io/projected/098f5efb-c722-4a99-8289-c0a4a7d28760-kube-api-access-g9mnk\") pod \"098f5efb-c722-4a99-8289-c0a4a7d28760\" (UID: \"098f5efb-c722-4a99-8289-c0a4a7d28760\") " Sep 5 06:06:32.287624 kubelet[2951]: I0905 06:06:32.287402 2951 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/098f5efb-c722-4a99-8289-c0a4a7d28760-whisker-backend-key-pair\") pod \"098f5efb-c722-4a99-8289-c0a4a7d28760\" 
(UID: \"098f5efb-c722-4a99-8289-c0a4a7d28760\") " Sep 5 06:06:32.287750 kubelet[2951]: I0905 06:06:32.287696 2951 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/098f5efb-c722-4a99-8289-c0a4a7d28760-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "098f5efb-c722-4a99-8289-c0a4a7d28760" (UID: "098f5efb-c722-4a99-8289-c0a4a7d28760"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 5 06:06:32.293915 systemd[1]: var-lib-kubelet-pods-098f5efb\x2dc722\x2d4a99\x2d8289\x2dc0a4a7d28760-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dg9mnk.mount: Deactivated successfully. Sep 5 06:06:32.295132 kubelet[2951]: I0905 06:06:32.295103 2951 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/098f5efb-c722-4a99-8289-c0a4a7d28760-kube-api-access-g9mnk" (OuterVolumeSpecName: "kube-api-access-g9mnk") pod "098f5efb-c722-4a99-8289-c0a4a7d28760" (UID: "098f5efb-c722-4a99-8289-c0a4a7d28760"). InnerVolumeSpecName "kube-api-access-g9mnk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 5 06:06:32.297941 systemd[1]: var-lib-kubelet-pods-098f5efb\x2dc722\x2d4a99\x2d8289\x2dc0a4a7d28760-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 5 06:06:32.298305 kubelet[2951]: I0905 06:06:32.298277 2951 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/098f5efb-c722-4a99-8289-c0a4a7d28760-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "098f5efb-c722-4a99-8289-c0a4a7d28760" (UID: "098f5efb-c722-4a99-8289-c0a4a7d28760"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 5 06:06:32.388140 kubelet[2951]: I0905 06:06:32.388105 2951 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/098f5efb-c722-4a99-8289-c0a4a7d28760-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Sep 5 06:06:32.388140 kubelet[2951]: I0905 06:06:32.388134 2951 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g9mnk\" (UniqueName: \"kubernetes.io/projected/098f5efb-c722-4a99-8289-c0a4a7d28760-kube-api-access-g9mnk\") on node \"localhost\" DevicePath \"\"" Sep 5 06:06:32.388140 kubelet[2951]: I0905 06:06:32.388143 2951 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/098f5efb-c722-4a99-8289-c0a4a7d28760-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Sep 5 06:06:32.630876 systemd[1]: Removed slice kubepods-besteffort-pod098f5efb_c722_4a99_8289_c0a4a7d28760.slice - libcontainer container kubepods-besteffort-pod098f5efb_c722_4a99_8289_c0a4a7d28760.slice. Sep 5 06:06:32.875679 systemd[1]: Created slice kubepods-besteffort-pod66d50048_7a5a_43d3_8120_1690ec3a65e7.slice - libcontainer container kubepods-besteffort-pod66d50048_7a5a_43d3_8120_1690ec3a65e7.slice. 
Sep 5 06:06:32.908198 containerd[1653]: time="2025-09-05T06:06:32.908046380Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e0c9fe66cc2185b34ee04a2381b1ab17ba9c5f8a8a6b8002ec928431171b3458\" id:\"8ae946a1597bb1d612515df255b37c3562c97c28a11a260bb08165b730a80263\" pid:4062 exit_status:1 exited_at:{seconds:1757052392 nanos:907437735}" Sep 5 06:06:32.991467 kubelet[2951]: I0905 06:06:32.991420 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxcct\" (UniqueName: \"kubernetes.io/projected/66d50048-7a5a-43d3-8120-1690ec3a65e7-kube-api-access-rxcct\") pod \"whisker-7dd87f786c-r974q\" (UID: \"66d50048-7a5a-43d3-8120-1690ec3a65e7\") " pod="calico-system/whisker-7dd87f786c-r974q" Sep 5 06:06:32.991467 kubelet[2951]: I0905 06:06:32.991472 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66d50048-7a5a-43d3-8120-1690ec3a65e7-whisker-ca-bundle\") pod \"whisker-7dd87f786c-r974q\" (UID: \"66d50048-7a5a-43d3-8120-1690ec3a65e7\") " pod="calico-system/whisker-7dd87f786c-r974q" Sep 5 06:06:32.991643 kubelet[2951]: I0905 06:06:32.991513 2951 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/66d50048-7a5a-43d3-8120-1690ec3a65e7-whisker-backend-key-pair\") pod \"whisker-7dd87f786c-r974q\" (UID: \"66d50048-7a5a-43d3-8120-1690ec3a65e7\") " pod="calico-system/whisker-7dd87f786c-r974q" Sep 5 06:06:33.179513 containerd[1653]: time="2025-09-05T06:06:33.179368188Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7dd87f786c-r974q,Uid:66d50048-7a5a-43d3-8120-1690ec3a65e7,Namespace:calico-system,Attempt:0,}" Sep 5 06:06:33.713442 systemd-networkd[1316]: cali53d55d9c948: Link UP Sep 5 06:06:33.713600 systemd-networkd[1316]: cali53d55d9c948: Gained carrier Sep 5 
06:06:33.723473 containerd[1653]: 2025-09-05 06:06:33.209 [INFO][4079] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 5 06:06:33.723473 containerd[1653]: 2025-09-05 06:06:33.258 [INFO][4079] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--7dd87f786c--r974q-eth0 whisker-7dd87f786c- calico-system 66d50048-7a5a-43d3-8120-1690ec3a65e7 873 0 2025-09-05 06:06:32 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:7dd87f786c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-7dd87f786c-r974q eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali53d55d9c948 [] [] }} ContainerID="f6e94cd6acc89a2b7b0108a54d48f4efbf90ccedde9c1609b15e0b015e1102f3" Namespace="calico-system" Pod="whisker-7dd87f786c-r974q" WorkloadEndpoint="localhost-k8s-whisker--7dd87f786c--r974q-" Sep 5 06:06:33.723473 containerd[1653]: 2025-09-05 06:06:33.258 [INFO][4079] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f6e94cd6acc89a2b7b0108a54d48f4efbf90ccedde9c1609b15e0b015e1102f3" Namespace="calico-system" Pod="whisker-7dd87f786c-r974q" WorkloadEndpoint="localhost-k8s-whisker--7dd87f786c--r974q-eth0" Sep 5 06:06:33.723473 containerd[1653]: 2025-09-05 06:06:33.647 [INFO][4088] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f6e94cd6acc89a2b7b0108a54d48f4efbf90ccedde9c1609b15e0b015e1102f3" HandleID="k8s-pod-network.f6e94cd6acc89a2b7b0108a54d48f4efbf90ccedde9c1609b15e0b015e1102f3" Workload="localhost-k8s-whisker--7dd87f786c--r974q-eth0" Sep 5 06:06:33.723632 containerd[1653]: 2025-09-05 06:06:33.649 [INFO][4088] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f6e94cd6acc89a2b7b0108a54d48f4efbf90ccedde9c1609b15e0b015e1102f3" 
HandleID="k8s-pod-network.f6e94cd6acc89a2b7b0108a54d48f4efbf90ccedde9c1609b15e0b015e1102f3" Workload="localhost-k8s-whisker--7dd87f786c--r974q-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004fb90), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-7dd87f786c-r974q", "timestamp":"2025-09-05 06:06:33.64734338 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 5 06:06:33.723632 containerd[1653]: 2025-09-05 06:06:33.649 [INFO][4088] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 06:06:33.723632 containerd[1653]: 2025-09-05 06:06:33.649 [INFO][4088] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 5 06:06:33.723632 containerd[1653]: 2025-09-05 06:06:33.650 [INFO][4088] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 5 06:06:33.723632 containerd[1653]: 2025-09-05 06:06:33.678 [INFO][4088] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f6e94cd6acc89a2b7b0108a54d48f4efbf90ccedde9c1609b15e0b015e1102f3" host="localhost" Sep 5 06:06:33.723632 containerd[1653]: 2025-09-05 06:06:33.685 [INFO][4088] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 5 06:06:33.723632 containerd[1653]: 2025-09-05 06:06:33.687 [INFO][4088] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 5 06:06:33.723632 containerd[1653]: 2025-09-05 06:06:33.688 [INFO][4088] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 5 06:06:33.723632 containerd[1653]: 2025-09-05 06:06:33.689 [INFO][4088] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 5 06:06:33.723632 containerd[1653]: 2025-09-05 06:06:33.689 [INFO][4088] 
ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.f6e94cd6acc89a2b7b0108a54d48f4efbf90ccedde9c1609b15e0b015e1102f3" host="localhost" Sep 5 06:06:33.725127 containerd[1653]: 2025-09-05 06:06:33.690 [INFO][4088] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f6e94cd6acc89a2b7b0108a54d48f4efbf90ccedde9c1609b15e0b015e1102f3 Sep 5 06:06:33.725127 containerd[1653]: 2025-09-05 06:06:33.692 [INFO][4088] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.f6e94cd6acc89a2b7b0108a54d48f4efbf90ccedde9c1609b15e0b015e1102f3" host="localhost" Sep 5 06:06:33.725127 containerd[1653]: 2025-09-05 06:06:33.697 [INFO][4088] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.f6e94cd6acc89a2b7b0108a54d48f4efbf90ccedde9c1609b15e0b015e1102f3" host="localhost" Sep 5 06:06:33.725127 containerd[1653]: 2025-09-05 06:06:33.697 [INFO][4088] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.f6e94cd6acc89a2b7b0108a54d48f4efbf90ccedde9c1609b15e0b015e1102f3" host="localhost" Sep 5 06:06:33.725127 containerd[1653]: 2025-09-05 06:06:33.697 [INFO][4088] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 5 06:06:33.725127 containerd[1653]: 2025-09-05 06:06:33.697 [INFO][4088] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="f6e94cd6acc89a2b7b0108a54d48f4efbf90ccedde9c1609b15e0b015e1102f3" HandleID="k8s-pod-network.f6e94cd6acc89a2b7b0108a54d48f4efbf90ccedde9c1609b15e0b015e1102f3" Workload="localhost-k8s-whisker--7dd87f786c--r974q-eth0" Sep 5 06:06:33.725272 containerd[1653]: 2025-09-05 06:06:33.699 [INFO][4079] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f6e94cd6acc89a2b7b0108a54d48f4efbf90ccedde9c1609b15e0b015e1102f3" Namespace="calico-system" Pod="whisker-7dd87f786c-r974q" WorkloadEndpoint="localhost-k8s-whisker--7dd87f786c--r974q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--7dd87f786c--r974q-eth0", GenerateName:"whisker-7dd87f786c-", Namespace:"calico-system", SelfLink:"", UID:"66d50048-7a5a-43d3-8120-1690ec3a65e7", ResourceVersion:"873", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 6, 6, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7dd87f786c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-7dd87f786c-r974q", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali53d55d9c948", MAC:"", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 06:06:33.725272 containerd[1653]: 2025-09-05 06:06:33.699 [INFO][4079] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="f6e94cd6acc89a2b7b0108a54d48f4efbf90ccedde9c1609b15e0b015e1102f3" Namespace="calico-system" Pod="whisker-7dd87f786c-r974q" WorkloadEndpoint="localhost-k8s-whisker--7dd87f786c--r974q-eth0" Sep 5 06:06:33.726371 containerd[1653]: 2025-09-05 06:06:33.699 [INFO][4079] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali53d55d9c948 ContainerID="f6e94cd6acc89a2b7b0108a54d48f4efbf90ccedde9c1609b15e0b015e1102f3" Namespace="calico-system" Pod="whisker-7dd87f786c-r974q" WorkloadEndpoint="localhost-k8s-whisker--7dd87f786c--r974q-eth0" Sep 5 06:06:33.726371 containerd[1653]: 2025-09-05 06:06:33.711 [INFO][4079] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f6e94cd6acc89a2b7b0108a54d48f4efbf90ccedde9c1609b15e0b015e1102f3" Namespace="calico-system" Pod="whisker-7dd87f786c-r974q" WorkloadEndpoint="localhost-k8s-whisker--7dd87f786c--r974q-eth0" Sep 5 06:06:33.729235 containerd[1653]: 2025-09-05 06:06:33.712 [INFO][4079] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f6e94cd6acc89a2b7b0108a54d48f4efbf90ccedde9c1609b15e0b015e1102f3" Namespace="calico-system" Pod="whisker-7dd87f786c-r974q" WorkloadEndpoint="localhost-k8s-whisker--7dd87f786c--r974q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--7dd87f786c--r974q-eth0", GenerateName:"whisker-7dd87f786c-", Namespace:"calico-system", SelfLink:"", UID:"66d50048-7a5a-43d3-8120-1690ec3a65e7", ResourceVersion:"873", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 6, 6, 32, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7dd87f786c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"f6e94cd6acc89a2b7b0108a54d48f4efbf90ccedde9c1609b15e0b015e1102f3", Pod:"whisker-7dd87f786c-r974q", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali53d55d9c948", MAC:"32:50:74:17:86:e7", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 06:06:33.729325 containerd[1653]: 2025-09-05 06:06:33.720 [INFO][4079] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f6e94cd6acc89a2b7b0108a54d48f4efbf90ccedde9c1609b15e0b015e1102f3" Namespace="calico-system" Pod="whisker-7dd87f786c-r974q" WorkloadEndpoint="localhost-k8s-whisker--7dd87f786c--r974q-eth0" Sep 5 06:06:33.798912 containerd[1653]: time="2025-09-05T06:06:33.798632477Z" level=info msg="connecting to shim f6e94cd6acc89a2b7b0108a54d48f4efbf90ccedde9c1609b15e0b015e1102f3" address="unix:///run/containerd/s/df1fd4122aa1cf3417c837f3f6ab5b9923999b7b8c3749b5cc006f672602a779" namespace=k8s.io protocol=ttrpc version=3 Sep 5 06:06:33.817540 systemd[1]: Started cri-containerd-f6e94cd6acc89a2b7b0108a54d48f4efbf90ccedde9c1609b15e0b015e1102f3.scope - libcontainer container f6e94cd6acc89a2b7b0108a54d48f4efbf90ccedde9c1609b15e0b015e1102f3. 
Sep 5 06:06:33.838533 systemd-resolved[1554]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 5 06:06:33.881892 containerd[1653]: time="2025-09-05T06:06:33.881806963Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7dd87f786c-r974q,Uid:66d50048-7a5a-43d3-8120-1690ec3a65e7,Namespace:calico-system,Attempt:0,} returns sandbox id \"f6e94cd6acc89a2b7b0108a54d48f4efbf90ccedde9c1609b15e0b015e1102f3\"" Sep 5 06:06:33.887190 containerd[1653]: time="2025-09-05T06:06:33.887157578Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 5 06:06:33.912105 systemd-networkd[1316]: vxlan.calico: Link UP Sep 5 06:06:33.912325 systemd-networkd[1316]: vxlan.calico: Gained carrier Sep 5 06:06:34.627503 kubelet[2951]: I0905 06:06:34.627472 2951 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="098f5efb-c722-4a99-8289-c0a4a7d28760" path="/var/lib/kubelet/pods/098f5efb-c722-4a99-8289-c0a4a7d28760/volumes" Sep 5 06:06:35.241352 systemd-networkd[1316]: cali53d55d9c948: Gained IPv6LL Sep 5 06:06:35.282807 containerd[1653]: time="2025-09-05T06:06:35.282379708Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:06:35.282807 containerd[1653]: time="2025-09-05T06:06:35.282718957Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 5 06:06:35.282807 containerd[1653]: time="2025-09-05T06:06:35.282783441Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:06:35.283832 containerd[1653]: time="2025-09-05T06:06:35.283820141Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:06:35.284221 containerd[1653]: time="2025-09-05T06:06:35.284200658Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 1.397013928s" Sep 5 06:06:35.284221 containerd[1653]: time="2025-09-05T06:06:35.284218623Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 5 06:06:35.285451 containerd[1653]: time="2025-09-05T06:06:35.285436414Z" level=info msg="CreateContainer within sandbox \"f6e94cd6acc89a2b7b0108a54d48f4efbf90ccedde9c1609b15e0b015e1102f3\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 5 06:06:35.289058 containerd[1653]: time="2025-09-05T06:06:35.289038219Z" level=info msg="Container 55280b5ce97ffda98bc4cacbc9c9812c84729939a11a4d4868971ec4c91219be: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:06:35.292586 containerd[1653]: time="2025-09-05T06:06:35.292568135Z" level=info msg="CreateContainer within sandbox \"f6e94cd6acc89a2b7b0108a54d48f4efbf90ccedde9c1609b15e0b015e1102f3\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"55280b5ce97ffda98bc4cacbc9c9812c84729939a11a4d4868971ec4c91219be\"" Sep 5 06:06:35.293407 containerd[1653]: time="2025-09-05T06:06:35.293263973Z" level=info msg="StartContainer for \"55280b5ce97ffda98bc4cacbc9c9812c84729939a11a4d4868971ec4c91219be\"" Sep 5 06:06:35.293832 containerd[1653]: time="2025-09-05T06:06:35.293816804Z" level=info msg="connecting to shim 55280b5ce97ffda98bc4cacbc9c9812c84729939a11a4d4868971ec4c91219be" 
address="unix:///run/containerd/s/df1fd4122aa1cf3417c837f3f6ab5b9923999b7b8c3749b5cc006f672602a779" protocol=ttrpc version=3 Sep 5 06:06:35.314258 systemd[1]: Started cri-containerd-55280b5ce97ffda98bc4cacbc9c9812c84729939a11a4d4868971ec4c91219be.scope - libcontainer container 55280b5ce97ffda98bc4cacbc9c9812c84729939a11a4d4868971ec4c91219be. Sep 5 06:06:35.351010 containerd[1653]: time="2025-09-05T06:06:35.350944628Z" level=info msg="StartContainer for \"55280b5ce97ffda98bc4cacbc9c9812c84729939a11a4d4868971ec4c91219be\" returns successfully" Sep 5 06:06:35.352485 containerd[1653]: time="2025-09-05T06:06:35.352466123Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 5 06:06:35.817265 systemd-networkd[1316]: vxlan.calico: Gained IPv6LL Sep 5 06:06:36.628346 containerd[1653]: time="2025-09-05T06:06:36.628282570Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zmpmp,Uid:3aca1205-261a-414a-bbc5-de4a4cc8f072,Namespace:calico-system,Attempt:0,}" Sep 5 06:06:36.706554 systemd-networkd[1316]: cali932be7afd23: Link UP Sep 5 06:06:36.707657 systemd-networkd[1316]: cali932be7afd23: Gained carrier Sep 5 06:06:36.720605 containerd[1653]: 2025-09-05 06:06:36.656 [INFO][4388] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--zmpmp-eth0 csi-node-driver- calico-system 3aca1205-261a-414a-bbc5-de4a4cc8f072 690 0 2025-09-05 06:06:09 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-zmpmp eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali932be7afd23 [] [] }} 
ContainerID="704353a7bcfaa70b71e4776a4c7e039cfd9c64844865ab7d71d364431f5ea4b0" Namespace="calico-system" Pod="csi-node-driver-zmpmp" WorkloadEndpoint="localhost-k8s-csi--node--driver--zmpmp-" Sep 5 06:06:36.720605 containerd[1653]: 2025-09-05 06:06:36.657 [INFO][4388] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="704353a7bcfaa70b71e4776a4c7e039cfd9c64844865ab7d71d364431f5ea4b0" Namespace="calico-system" Pod="csi-node-driver-zmpmp" WorkloadEndpoint="localhost-k8s-csi--node--driver--zmpmp-eth0" Sep 5 06:06:36.720605 containerd[1653]: 2025-09-05 06:06:36.678 [INFO][4401] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="704353a7bcfaa70b71e4776a4c7e039cfd9c64844865ab7d71d364431f5ea4b0" HandleID="k8s-pod-network.704353a7bcfaa70b71e4776a4c7e039cfd9c64844865ab7d71d364431f5ea4b0" Workload="localhost-k8s-csi--node--driver--zmpmp-eth0" Sep 5 06:06:36.721037 containerd[1653]: 2025-09-05 06:06:36.679 [INFO][4401] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="704353a7bcfaa70b71e4776a4c7e039cfd9c64844865ab7d71d364431f5ea4b0" HandleID="k8s-pod-network.704353a7bcfaa70b71e4776a4c7e039cfd9c64844865ab7d71d364431f5ea4b0" Workload="localhost-k8s-csi--node--driver--zmpmp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f7b0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-zmpmp", "timestamp":"2025-09-05 06:06:36.678982905 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 5 06:06:36.721037 containerd[1653]: 2025-09-05 06:06:36.679 [INFO][4401] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 06:06:36.721037 containerd[1653]: 2025-09-05 06:06:36.679 [INFO][4401] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 5 06:06:36.721037 containerd[1653]: 2025-09-05 06:06:36.679 [INFO][4401] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 5 06:06:36.721037 containerd[1653]: 2025-09-05 06:06:36.683 [INFO][4401] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.704353a7bcfaa70b71e4776a4c7e039cfd9c64844865ab7d71d364431f5ea4b0" host="localhost" Sep 5 06:06:36.721037 containerd[1653]: 2025-09-05 06:06:36.686 [INFO][4401] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 5 06:06:36.721037 containerd[1653]: 2025-09-05 06:06:36.690 [INFO][4401] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 5 06:06:36.721037 containerd[1653]: 2025-09-05 06:06:36.691 [INFO][4401] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 5 06:06:36.721037 containerd[1653]: 2025-09-05 06:06:36.692 [INFO][4401] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 5 06:06:36.721037 containerd[1653]: 2025-09-05 06:06:36.692 [INFO][4401] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.704353a7bcfaa70b71e4776a4c7e039cfd9c64844865ab7d71d364431f5ea4b0" host="localhost" Sep 5 06:06:36.721743 containerd[1653]: 2025-09-05 06:06:36.694 [INFO][4401] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.704353a7bcfaa70b71e4776a4c7e039cfd9c64844865ab7d71d364431f5ea4b0 Sep 5 06:06:36.721743 containerd[1653]: 2025-09-05 06:06:36.697 [INFO][4401] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.704353a7bcfaa70b71e4776a4c7e039cfd9c64844865ab7d71d364431f5ea4b0" host="localhost" Sep 5 06:06:36.721743 containerd[1653]: 2025-09-05 06:06:36.700 [INFO][4401] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 
handle="k8s-pod-network.704353a7bcfaa70b71e4776a4c7e039cfd9c64844865ab7d71d364431f5ea4b0" host="localhost" Sep 5 06:06:36.721743 containerd[1653]: 2025-09-05 06:06:36.700 [INFO][4401] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.704353a7bcfaa70b71e4776a4c7e039cfd9c64844865ab7d71d364431f5ea4b0" host="localhost" Sep 5 06:06:36.721743 containerd[1653]: 2025-09-05 06:06:36.700 [INFO][4401] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 5 06:06:36.721743 containerd[1653]: 2025-09-05 06:06:36.700 [INFO][4401] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="704353a7bcfaa70b71e4776a4c7e039cfd9c64844865ab7d71d364431f5ea4b0" HandleID="k8s-pod-network.704353a7bcfaa70b71e4776a4c7e039cfd9c64844865ab7d71d364431f5ea4b0" Workload="localhost-k8s-csi--node--driver--zmpmp-eth0" Sep 5 06:06:36.721876 containerd[1653]: 2025-09-05 06:06:36.702 [INFO][4388] cni-plugin/k8s.go 418: Populated endpoint ContainerID="704353a7bcfaa70b71e4776a4c7e039cfd9c64844865ab7d71d364431f5ea4b0" Namespace="calico-system" Pod="csi-node-driver-zmpmp" WorkloadEndpoint="localhost-k8s-csi--node--driver--zmpmp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--zmpmp-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"3aca1205-261a-414a-bbc5-de4a4cc8f072", ResourceVersion:"690", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 6, 6, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-zmpmp", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali932be7afd23", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 06:06:36.721924 containerd[1653]: 2025-09-05 06:06:36.702 [INFO][4388] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="704353a7bcfaa70b71e4776a4c7e039cfd9c64844865ab7d71d364431f5ea4b0" Namespace="calico-system" Pod="csi-node-driver-zmpmp" WorkloadEndpoint="localhost-k8s-csi--node--driver--zmpmp-eth0" Sep 5 06:06:36.721924 containerd[1653]: 2025-09-05 06:06:36.702 [INFO][4388] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali932be7afd23 ContainerID="704353a7bcfaa70b71e4776a4c7e039cfd9c64844865ab7d71d364431f5ea4b0" Namespace="calico-system" Pod="csi-node-driver-zmpmp" WorkloadEndpoint="localhost-k8s-csi--node--driver--zmpmp-eth0" Sep 5 06:06:36.721924 containerd[1653]: 2025-09-05 06:06:36.708 [INFO][4388] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="704353a7bcfaa70b71e4776a4c7e039cfd9c64844865ab7d71d364431f5ea4b0" Namespace="calico-system" Pod="csi-node-driver-zmpmp" WorkloadEndpoint="localhost-k8s-csi--node--driver--zmpmp-eth0" Sep 5 06:06:36.721974 containerd[1653]: 2025-09-05 06:06:36.708 [INFO][4388] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="704353a7bcfaa70b71e4776a4c7e039cfd9c64844865ab7d71d364431f5ea4b0" 
Namespace="calico-system" Pod="csi-node-driver-zmpmp" WorkloadEndpoint="localhost-k8s-csi--node--driver--zmpmp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--zmpmp-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"3aca1205-261a-414a-bbc5-de4a4cc8f072", ResourceVersion:"690", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 6, 6, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"704353a7bcfaa70b71e4776a4c7e039cfd9c64844865ab7d71d364431f5ea4b0", Pod:"csi-node-driver-zmpmp", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali932be7afd23", MAC:"b2:3b:81:45:f3:53", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 06:06:36.722037 containerd[1653]: 2025-09-05 06:06:36.715 [INFO][4388] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="704353a7bcfaa70b71e4776a4c7e039cfd9c64844865ab7d71d364431f5ea4b0" Namespace="calico-system" Pod="csi-node-driver-zmpmp" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--zmpmp-eth0" Sep 5 06:06:36.736896 containerd[1653]: time="2025-09-05T06:06:36.736379315Z" level=info msg="connecting to shim 704353a7bcfaa70b71e4776a4c7e039cfd9c64844865ab7d71d364431f5ea4b0" address="unix:///run/containerd/s/41c52fd31dbe5704e8dc51b7d5e70a836d6e8574500f57c7b74f787c99932031" namespace=k8s.io protocol=ttrpc version=3 Sep 5 06:06:36.767294 systemd[1]: Started cri-containerd-704353a7bcfaa70b71e4776a4c7e039cfd9c64844865ab7d71d364431f5ea4b0.scope - libcontainer container 704353a7bcfaa70b71e4776a4c7e039cfd9c64844865ab7d71d364431f5ea4b0. Sep 5 06:06:36.776196 systemd-resolved[1554]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 5 06:06:36.791753 containerd[1653]: time="2025-09-05T06:06:36.791679878Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zmpmp,Uid:3aca1205-261a-414a-bbc5-de4a4cc8f072,Namespace:calico-system,Attempt:0,} returns sandbox id \"704353a7bcfaa70b71e4776a4c7e039cfd9c64844865ab7d71d364431f5ea4b0\"" Sep 5 06:06:37.498661 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4209103573.mount: Deactivated successfully. 
Sep 5 06:06:37.508724 containerd[1653]: time="2025-09-05T06:06:37.508694781Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:06:37.509186 containerd[1653]: time="2025-09-05T06:06:37.509143411Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 5 06:06:37.510933 containerd[1653]: time="2025-09-05T06:06:37.510902747Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:06:37.513186 containerd[1653]: time="2025-09-05T06:06:37.512899460Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:06:37.513736 containerd[1653]: time="2025-09-05T06:06:37.513486081Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 2.161000534s" Sep 5 06:06:37.513736 containerd[1653]: time="2025-09-05T06:06:37.513506437Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 5 06:06:37.514968 containerd[1653]: time="2025-09-05T06:06:37.514954823Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 5 06:06:37.539953 containerd[1653]: time="2025-09-05T06:06:37.539925238Z" level=info msg="CreateContainer within sandbox 
\"f6e94cd6acc89a2b7b0108a54d48f4efbf90ccedde9c1609b15e0b015e1102f3\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 5 06:06:37.550739 containerd[1653]: time="2025-09-05T06:06:37.550274560Z" level=info msg="Container c8569c369eff6b2dcb76aa95085b87d99081750c3b881a81b261f4061fd7da3b: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:06:37.554562 containerd[1653]: time="2025-09-05T06:06:37.554538476Z" level=info msg="CreateContainer within sandbox \"f6e94cd6acc89a2b7b0108a54d48f4efbf90ccedde9c1609b15e0b015e1102f3\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"c8569c369eff6b2dcb76aa95085b87d99081750c3b881a81b261f4061fd7da3b\"" Sep 5 06:06:37.555810 containerd[1653]: time="2025-09-05T06:06:37.555794112Z" level=info msg="StartContainer for \"c8569c369eff6b2dcb76aa95085b87d99081750c3b881a81b261f4061fd7da3b\"" Sep 5 06:06:37.556547 containerd[1653]: time="2025-09-05T06:06:37.556530459Z" level=info msg="connecting to shim c8569c369eff6b2dcb76aa95085b87d99081750c3b881a81b261f4061fd7da3b" address="unix:///run/containerd/s/df1fd4122aa1cf3417c837f3f6ab5b9923999b7b8c3749b5cc006f672602a779" protocol=ttrpc version=3 Sep 5 06:06:37.576742 systemd[1]: Started cri-containerd-c8569c369eff6b2dcb76aa95085b87d99081750c3b881a81b261f4061fd7da3b.scope - libcontainer container c8569c369eff6b2dcb76aa95085b87d99081750c3b881a81b261f4061fd7da3b. 
Sep 5 06:06:37.639721 containerd[1653]: time="2025-09-05T06:06:37.639650710Z" level=info msg="StartContainer for \"c8569c369eff6b2dcb76aa95085b87d99081750c3b881a81b261f4061fd7da3b\" returns successfully" Sep 5 06:06:37.643047 containerd[1653]: time="2025-09-05T06:06:37.641610798Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d9487c8bd-m6f9c,Uid:db55fca1-860d-422e-9c2e-072372ad5501,Namespace:calico-apiserver,Attempt:0,}" Sep 5 06:06:37.779961 systemd-networkd[1316]: cali6881fffaa5c: Link UP Sep 5 06:06:37.781779 systemd-networkd[1316]: cali6881fffaa5c: Gained carrier Sep 5 06:06:37.794914 containerd[1653]: 2025-09-05 06:06:37.705 [INFO][4500] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--7d9487c8bd--m6f9c-eth0 calico-apiserver-7d9487c8bd- calico-apiserver db55fca1-860d-422e-9c2e-072372ad5501 795 0 2025-09-05 06:06:07 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7d9487c8bd projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-7d9487c8bd-m6f9c eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali6881fffaa5c [] [] }} ContainerID="01637da4a55cb070edbf56a36812214641ff281aa8d65aa755abfbd432c63716" Namespace="calico-apiserver" Pod="calico-apiserver-7d9487c8bd-m6f9c" WorkloadEndpoint="localhost-k8s-calico--apiserver--7d9487c8bd--m6f9c-" Sep 5 06:06:37.794914 containerd[1653]: 2025-09-05 06:06:37.705 [INFO][4500] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="01637da4a55cb070edbf56a36812214641ff281aa8d65aa755abfbd432c63716" Namespace="calico-apiserver" Pod="calico-apiserver-7d9487c8bd-m6f9c" WorkloadEndpoint="localhost-k8s-calico--apiserver--7d9487c8bd--m6f9c-eth0" Sep 5 06:06:37.794914 
containerd[1653]: 2025-09-05 06:06:37.735 [INFO][4512] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="01637da4a55cb070edbf56a36812214641ff281aa8d65aa755abfbd432c63716" HandleID="k8s-pod-network.01637da4a55cb070edbf56a36812214641ff281aa8d65aa755abfbd432c63716" Workload="localhost-k8s-calico--apiserver--7d9487c8bd--m6f9c-eth0" Sep 5 06:06:37.795118 containerd[1653]: 2025-09-05 06:06:37.735 [INFO][4512] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="01637da4a55cb070edbf56a36812214641ff281aa8d65aa755abfbd432c63716" HandleID="k8s-pod-network.01637da4a55cb070edbf56a36812214641ff281aa8d65aa755abfbd432c63716" Workload="localhost-k8s-calico--apiserver--7d9487c8bd--m6f9c-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f610), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-7d9487c8bd-m6f9c", "timestamp":"2025-09-05 06:06:37.735688492 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 5 06:06:37.795118 containerd[1653]: 2025-09-05 06:06:37.736 [INFO][4512] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 06:06:37.795118 containerd[1653]: 2025-09-05 06:06:37.736 [INFO][4512] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 5 06:06:37.795118 containerd[1653]: 2025-09-05 06:06:37.736 [INFO][4512] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 5 06:06:37.795118 containerd[1653]: 2025-09-05 06:06:37.741 [INFO][4512] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.01637da4a55cb070edbf56a36812214641ff281aa8d65aa755abfbd432c63716" host="localhost" Sep 5 06:06:37.795118 containerd[1653]: 2025-09-05 06:06:37.750 [INFO][4512] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 5 06:06:37.795118 containerd[1653]: 2025-09-05 06:06:37.754 [INFO][4512] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 5 06:06:37.795118 containerd[1653]: 2025-09-05 06:06:37.756 [INFO][4512] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 5 06:06:37.795118 containerd[1653]: 2025-09-05 06:06:37.759 [INFO][4512] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 5 06:06:37.795118 containerd[1653]: 2025-09-05 06:06:37.759 [INFO][4512] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.01637da4a55cb070edbf56a36812214641ff281aa8d65aa755abfbd432c63716" host="localhost" Sep 5 06:06:37.795948 containerd[1653]: 2025-09-05 06:06:37.761 [INFO][4512] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.01637da4a55cb070edbf56a36812214641ff281aa8d65aa755abfbd432c63716 Sep 5 06:06:37.795948 containerd[1653]: 2025-09-05 06:06:37.765 [INFO][4512] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.01637da4a55cb070edbf56a36812214641ff281aa8d65aa755abfbd432c63716" host="localhost" Sep 5 06:06:37.795948 containerd[1653]: 2025-09-05 06:06:37.769 [INFO][4512] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 
handle="k8s-pod-network.01637da4a55cb070edbf56a36812214641ff281aa8d65aa755abfbd432c63716" host="localhost" Sep 5 06:06:37.795948 containerd[1653]: 2025-09-05 06:06:37.769 [INFO][4512] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.01637da4a55cb070edbf56a36812214641ff281aa8d65aa755abfbd432c63716" host="localhost" Sep 5 06:06:37.795948 containerd[1653]: 2025-09-05 06:06:37.769 [INFO][4512] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 5 06:06:37.795948 containerd[1653]: 2025-09-05 06:06:37.769 [INFO][4512] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="01637da4a55cb070edbf56a36812214641ff281aa8d65aa755abfbd432c63716" HandleID="k8s-pod-network.01637da4a55cb070edbf56a36812214641ff281aa8d65aa755abfbd432c63716" Workload="localhost-k8s-calico--apiserver--7d9487c8bd--m6f9c-eth0" Sep 5 06:06:37.796493 containerd[1653]: 2025-09-05 06:06:37.772 [INFO][4500] cni-plugin/k8s.go 418: Populated endpoint ContainerID="01637da4a55cb070edbf56a36812214641ff281aa8d65aa755abfbd432c63716" Namespace="calico-apiserver" Pod="calico-apiserver-7d9487c8bd-m6f9c" WorkloadEndpoint="localhost-k8s-calico--apiserver--7d9487c8bd--m6f9c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7d9487c8bd--m6f9c-eth0", GenerateName:"calico-apiserver-7d9487c8bd-", Namespace:"calico-apiserver", SelfLink:"", UID:"db55fca1-860d-422e-9c2e-072372ad5501", ResourceVersion:"795", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 6, 6, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7d9487c8bd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-7d9487c8bd-m6f9c", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali6881fffaa5c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 06:06:37.796554 containerd[1653]: 2025-09-05 06:06:37.773 [INFO][4500] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="01637da4a55cb070edbf56a36812214641ff281aa8d65aa755abfbd432c63716" Namespace="calico-apiserver" Pod="calico-apiserver-7d9487c8bd-m6f9c" WorkloadEndpoint="localhost-k8s-calico--apiserver--7d9487c8bd--m6f9c-eth0" Sep 5 06:06:37.796554 containerd[1653]: 2025-09-05 06:06:37.773 [INFO][4500] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6881fffaa5c ContainerID="01637da4a55cb070edbf56a36812214641ff281aa8d65aa755abfbd432c63716" Namespace="calico-apiserver" Pod="calico-apiserver-7d9487c8bd-m6f9c" WorkloadEndpoint="localhost-k8s-calico--apiserver--7d9487c8bd--m6f9c-eth0" Sep 5 06:06:37.796554 containerd[1653]: 2025-09-05 06:06:37.783 [INFO][4500] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="01637da4a55cb070edbf56a36812214641ff281aa8d65aa755abfbd432c63716" Namespace="calico-apiserver" Pod="calico-apiserver-7d9487c8bd-m6f9c" WorkloadEndpoint="localhost-k8s-calico--apiserver--7d9487c8bd--m6f9c-eth0" Sep 5 06:06:37.796614 containerd[1653]: 2025-09-05 06:06:37.785 [INFO][4500] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID 
to endpoint ContainerID="01637da4a55cb070edbf56a36812214641ff281aa8d65aa755abfbd432c63716" Namespace="calico-apiserver" Pod="calico-apiserver-7d9487c8bd-m6f9c" WorkloadEndpoint="localhost-k8s-calico--apiserver--7d9487c8bd--m6f9c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7d9487c8bd--m6f9c-eth0", GenerateName:"calico-apiserver-7d9487c8bd-", Namespace:"calico-apiserver", SelfLink:"", UID:"db55fca1-860d-422e-9c2e-072372ad5501", ResourceVersion:"795", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 6, 6, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7d9487c8bd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"01637da4a55cb070edbf56a36812214641ff281aa8d65aa755abfbd432c63716", Pod:"calico-apiserver-7d9487c8bd-m6f9c", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali6881fffaa5c", MAC:"d6:16:57:6a:fc:0a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 06:06:37.796653 containerd[1653]: 2025-09-05 06:06:37.792 [INFO][4500] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="01637da4a55cb070edbf56a36812214641ff281aa8d65aa755abfbd432c63716" Namespace="calico-apiserver" Pod="calico-apiserver-7d9487c8bd-m6f9c" WorkloadEndpoint="localhost-k8s-calico--apiserver--7d9487c8bd--m6f9c-eth0" Sep 5 06:06:37.812680 containerd[1653]: time="2025-09-05T06:06:37.812614337Z" level=info msg="connecting to shim 01637da4a55cb070edbf56a36812214641ff281aa8d65aa755abfbd432c63716" address="unix:///run/containerd/s/477e1b09673041479fe2edf3afdb758454ec6d8e1af2f2241d60930776db0d82" namespace=k8s.io protocol=ttrpc version=3 Sep 5 06:06:37.845291 systemd[1]: Started cri-containerd-01637da4a55cb070edbf56a36812214641ff281aa8d65aa755abfbd432c63716.scope - libcontainer container 01637da4a55cb070edbf56a36812214641ff281aa8d65aa755abfbd432c63716. Sep 5 06:06:37.857525 systemd-resolved[1554]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 5 06:06:37.897535 containerd[1653]: time="2025-09-05T06:06:37.897419457Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d9487c8bd-m6f9c,Uid:db55fca1-860d-422e-9c2e-072372ad5501,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"01637da4a55cb070edbf56a36812214641ff281aa8d65aa755abfbd432c63716\"" Sep 5 06:06:38.249303 systemd-networkd[1316]: cali932be7afd23: Gained IPv6LL Sep 5 06:06:38.627594 containerd[1653]: time="2025-09-05T06:06:38.627558597Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-k95rr,Uid:3484328b-3fb2-41ea-b562-a71f1ba2782c,Namespace:kube-system,Attempt:0,}" Sep 5 06:06:38.627724 containerd[1653]: time="2025-09-05T06:06:38.627707521Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5cc6595ff6-qlwqn,Uid:0e471d34-64ab-423f-a035-550b2b064a5c,Namespace:calico-system,Attempt:0,}" Sep 5 06:06:38.628037 containerd[1653]: time="2025-09-05T06:06:38.627925926Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-668d6bf9bc-rxtvx,Uid:d6254f86-4e8b-4f74-af07-6b68e974d028,Namespace:kube-system,Attempt:0,}" Sep 5 06:06:38.798357 systemd-networkd[1316]: cali470e55824cb: Link UP Sep 5 06:06:38.798633 systemd-networkd[1316]: cali470e55824cb: Gained carrier Sep 5 06:06:38.828820 containerd[1653]: 2025-09-05 06:06:38.686 [INFO][4585] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--rxtvx-eth0 coredns-668d6bf9bc- kube-system d6254f86-4e8b-4f74-af07-6b68e974d028 792 0 2025-09-05 06:05:55 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-rxtvx eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali470e55824cb [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="bad1011f8e4bee203e338d00a777a8148a85b78b203703ff10b834caee55927a" Namespace="kube-system" Pod="coredns-668d6bf9bc-rxtvx" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--rxtvx-" Sep 5 06:06:38.828820 containerd[1653]: 2025-09-05 06:06:38.687 [INFO][4585] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bad1011f8e4bee203e338d00a777a8148a85b78b203703ff10b834caee55927a" Namespace="kube-system" Pod="coredns-668d6bf9bc-rxtvx" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--rxtvx-eth0" Sep 5 06:06:38.828820 containerd[1653]: 2025-09-05 06:06:38.720 [INFO][4616] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bad1011f8e4bee203e338d00a777a8148a85b78b203703ff10b834caee55927a" HandleID="k8s-pod-network.bad1011f8e4bee203e338d00a777a8148a85b78b203703ff10b834caee55927a" Workload="localhost-k8s-coredns--668d6bf9bc--rxtvx-eth0" Sep 5 06:06:38.829391 containerd[1653]: 2025-09-05 06:06:38.720 [INFO][4616] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="bad1011f8e4bee203e338d00a777a8148a85b78b203703ff10b834caee55927a" HandleID="k8s-pod-network.bad1011f8e4bee203e338d00a777a8148a85b78b203703ff10b834caee55927a" Workload="localhost-k8s-coredns--668d6bf9bc--rxtvx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5230), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-rxtvx", "timestamp":"2025-09-05 06:06:38.720212082 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 5 06:06:38.829391 containerd[1653]: 2025-09-05 06:06:38.720 [INFO][4616] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 06:06:38.829391 containerd[1653]: 2025-09-05 06:06:38.720 [INFO][4616] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 5 06:06:38.829391 containerd[1653]: 2025-09-05 06:06:38.720 [INFO][4616] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 5 06:06:38.829391 containerd[1653]: 2025-09-05 06:06:38.726 [INFO][4616] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.bad1011f8e4bee203e338d00a777a8148a85b78b203703ff10b834caee55927a" host="localhost" Sep 5 06:06:38.829391 containerd[1653]: 2025-09-05 06:06:38.741 [INFO][4616] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 5 06:06:38.829391 containerd[1653]: 2025-09-05 06:06:38.745 [INFO][4616] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 5 06:06:38.829391 containerd[1653]: 2025-09-05 06:06:38.748 [INFO][4616] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 5 06:06:38.829391 containerd[1653]: 2025-09-05 06:06:38.754 [INFO][4616] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" 
Sep 5 06:06:38.829391 containerd[1653]: 2025-09-05 06:06:38.754 [INFO][4616] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.bad1011f8e4bee203e338d00a777a8148a85b78b203703ff10b834caee55927a" host="localhost" Sep 5 06:06:38.856495 containerd[1653]: 2025-09-05 06:06:38.755 [INFO][4616] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.bad1011f8e4bee203e338d00a777a8148a85b78b203703ff10b834caee55927a Sep 5 06:06:38.856495 containerd[1653]: 2025-09-05 06:06:38.766 [INFO][4616] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.bad1011f8e4bee203e338d00a777a8148a85b78b203703ff10b834caee55927a" host="localhost" Sep 5 06:06:38.856495 containerd[1653]: 2025-09-05 06:06:38.771 [INFO][4616] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.bad1011f8e4bee203e338d00a777a8148a85b78b203703ff10b834caee55927a" host="localhost" Sep 5 06:06:38.856495 containerd[1653]: 2025-09-05 06:06:38.772 [INFO][4616] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.bad1011f8e4bee203e338d00a777a8148a85b78b203703ff10b834caee55927a" host="localhost" Sep 5 06:06:38.856495 containerd[1653]: 2025-09-05 06:06:38.772 [INFO][4616] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 5 06:06:38.856495 containerd[1653]: 2025-09-05 06:06:38.772 [INFO][4616] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="bad1011f8e4bee203e338d00a777a8148a85b78b203703ff10b834caee55927a" HandleID="k8s-pod-network.bad1011f8e4bee203e338d00a777a8148a85b78b203703ff10b834caee55927a" Workload="localhost-k8s-coredns--668d6bf9bc--rxtvx-eth0" Sep 5 06:06:38.866714 containerd[1653]: 2025-09-05 06:06:38.781 [INFO][4585] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bad1011f8e4bee203e338d00a777a8148a85b78b203703ff10b834caee55927a" Namespace="kube-system" Pod="coredns-668d6bf9bc-rxtvx" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--rxtvx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--rxtvx-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"d6254f86-4e8b-4f74-af07-6b68e974d028", ResourceVersion:"792", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 6, 5, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-rxtvx", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali470e55824cb", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 06:06:38.867516 containerd[1653]: 2025-09-05 06:06:38.791 [INFO][4585] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="bad1011f8e4bee203e338d00a777a8148a85b78b203703ff10b834caee55927a" Namespace="kube-system" Pod="coredns-668d6bf9bc-rxtvx" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--rxtvx-eth0" Sep 5 06:06:38.867516 containerd[1653]: 2025-09-05 06:06:38.791 [INFO][4585] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali470e55824cb ContainerID="bad1011f8e4bee203e338d00a777a8148a85b78b203703ff10b834caee55927a" Namespace="kube-system" Pod="coredns-668d6bf9bc-rxtvx" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--rxtvx-eth0" Sep 5 06:06:38.867516 containerd[1653]: 2025-09-05 06:06:38.798 [INFO][4585] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bad1011f8e4bee203e338d00a777a8148a85b78b203703ff10b834caee55927a" Namespace="kube-system" Pod="coredns-668d6bf9bc-rxtvx" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--rxtvx-eth0" Sep 5 06:06:38.873787 containerd[1653]: 2025-09-05 06:06:38.798 [INFO][4585] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="bad1011f8e4bee203e338d00a777a8148a85b78b203703ff10b834caee55927a" Namespace="kube-system" Pod="coredns-668d6bf9bc-rxtvx" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--rxtvx-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--rxtvx-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"d6254f86-4e8b-4f74-af07-6b68e974d028", ResourceVersion:"792", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 6, 5, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"bad1011f8e4bee203e338d00a777a8148a85b78b203703ff10b834caee55927a", Pod:"coredns-668d6bf9bc-rxtvx", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali470e55824cb", MAC:"e6:d6:e2:eb:22:a4", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 06:06:38.873787 containerd[1653]: 2025-09-05 06:06:38.825 [INFO][4585] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="bad1011f8e4bee203e338d00a777a8148a85b78b203703ff10b834caee55927a" Namespace="kube-system" Pod="coredns-668d6bf9bc-rxtvx" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--rxtvx-eth0" Sep 5 06:06:38.879246 systemd-networkd[1316]: calid304c133cea: Link UP Sep 5 06:06:38.879989 systemd-networkd[1316]: calid304c133cea: Gained carrier Sep 5 06:06:38.891926 containerd[1653]: 2025-09-05 06:06:38.679 [INFO][4573] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--5cc6595ff6--qlwqn-eth0 calico-kube-controllers-5cc6595ff6- calico-system 0e471d34-64ab-423f-a035-550b2b064a5c 800 0 2025-09-05 06:06:10 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5cc6595ff6 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-5cc6595ff6-qlwqn eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calid304c133cea [] [] }} ContainerID="2277e0b6753f4326fc8a26498bad577612b971b82436f52b12cfffbdcfa8ffa3" Namespace="calico-system" Pod="calico-kube-controllers-5cc6595ff6-qlwqn" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5cc6595ff6--qlwqn-" Sep 5 06:06:38.891926 containerd[1653]: 2025-09-05 06:06:38.680 [INFO][4573] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2277e0b6753f4326fc8a26498bad577612b971b82436f52b12cfffbdcfa8ffa3" Namespace="calico-system" Pod="calico-kube-controllers-5cc6595ff6-qlwqn" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5cc6595ff6--qlwqn-eth0" Sep 5 06:06:38.891926 containerd[1653]: 2025-09-05 06:06:38.720 [INFO][4611] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2277e0b6753f4326fc8a26498bad577612b971b82436f52b12cfffbdcfa8ffa3" 
HandleID="k8s-pod-network.2277e0b6753f4326fc8a26498bad577612b971b82436f52b12cfffbdcfa8ffa3" Workload="localhost-k8s-calico--kube--controllers--5cc6595ff6--qlwqn-eth0" Sep 5 06:06:38.891926 containerd[1653]: 2025-09-05 06:06:38.720 [INFO][4611] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2277e0b6753f4326fc8a26498bad577612b971b82436f52b12cfffbdcfa8ffa3" HandleID="k8s-pod-network.2277e0b6753f4326fc8a26498bad577612b971b82436f52b12cfffbdcfa8ffa3" Workload="localhost-k8s-calico--kube--controllers--5cc6595ff6--qlwqn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f890), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-5cc6595ff6-qlwqn", "timestamp":"2025-09-05 06:06:38.720722586 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 5 06:06:38.891926 containerd[1653]: 2025-09-05 06:06:38.720 [INFO][4611] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 06:06:38.891926 containerd[1653]: 2025-09-05 06:06:38.773 [INFO][4611] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 5 06:06:38.891926 containerd[1653]: 2025-09-05 06:06:38.773 [INFO][4611] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 5 06:06:38.891926 containerd[1653]: 2025-09-05 06:06:38.827 [INFO][4611] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2277e0b6753f4326fc8a26498bad577612b971b82436f52b12cfffbdcfa8ffa3" host="localhost" Sep 5 06:06:38.891926 containerd[1653]: 2025-09-05 06:06:38.842 [INFO][4611] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 5 06:06:38.891926 containerd[1653]: 2025-09-05 06:06:38.844 [INFO][4611] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 5 06:06:38.891926 containerd[1653]: 2025-09-05 06:06:38.845 [INFO][4611] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 5 06:06:38.891926 containerd[1653]: 2025-09-05 06:06:38.847 [INFO][4611] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 5 06:06:38.891926 containerd[1653]: 2025-09-05 06:06:38.848 [INFO][4611] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.2277e0b6753f4326fc8a26498bad577612b971b82436f52b12cfffbdcfa8ffa3" host="localhost" Sep 5 06:06:38.891926 containerd[1653]: 2025-09-05 06:06:38.849 [INFO][4611] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.2277e0b6753f4326fc8a26498bad577612b971b82436f52b12cfffbdcfa8ffa3 Sep 5 06:06:38.891926 containerd[1653]: 2025-09-05 06:06:38.855 [INFO][4611] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.2277e0b6753f4326fc8a26498bad577612b971b82436f52b12cfffbdcfa8ffa3" host="localhost" Sep 5 06:06:38.891926 containerd[1653]: 2025-09-05 06:06:38.872 [INFO][4611] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 
handle="k8s-pod-network.2277e0b6753f4326fc8a26498bad577612b971b82436f52b12cfffbdcfa8ffa3" host="localhost" Sep 5 06:06:38.891926 containerd[1653]: 2025-09-05 06:06:38.873 [INFO][4611] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.2277e0b6753f4326fc8a26498bad577612b971b82436f52b12cfffbdcfa8ffa3" host="localhost" Sep 5 06:06:38.891926 containerd[1653]: 2025-09-05 06:06:38.873 [INFO][4611] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 5 06:06:38.891926 containerd[1653]: 2025-09-05 06:06:38.873 [INFO][4611] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="2277e0b6753f4326fc8a26498bad577612b971b82436f52b12cfffbdcfa8ffa3" HandleID="k8s-pod-network.2277e0b6753f4326fc8a26498bad577612b971b82436f52b12cfffbdcfa8ffa3" Workload="localhost-k8s-calico--kube--controllers--5cc6595ff6--qlwqn-eth0" Sep 5 06:06:38.892364 containerd[1653]: 2025-09-05 06:06:38.875 [INFO][4573] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2277e0b6753f4326fc8a26498bad577612b971b82436f52b12cfffbdcfa8ffa3" Namespace="calico-system" Pod="calico-kube-controllers-5cc6595ff6-qlwqn" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5cc6595ff6--qlwqn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--5cc6595ff6--qlwqn-eth0", GenerateName:"calico-kube-controllers-5cc6595ff6-", Namespace:"calico-system", SelfLink:"", UID:"0e471d34-64ab-423f-a035-550b2b064a5c", ResourceVersion:"800", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 6, 6, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5cc6595ff6", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-5cc6595ff6-qlwqn", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calid304c133cea", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 06:06:38.892364 containerd[1653]: 2025-09-05 06:06:38.875 [INFO][4573] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="2277e0b6753f4326fc8a26498bad577612b971b82436f52b12cfffbdcfa8ffa3" Namespace="calico-system" Pod="calico-kube-controllers-5cc6595ff6-qlwqn" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5cc6595ff6--qlwqn-eth0" Sep 5 06:06:38.892364 containerd[1653]: 2025-09-05 06:06:38.875 [INFO][4573] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid304c133cea ContainerID="2277e0b6753f4326fc8a26498bad577612b971b82436f52b12cfffbdcfa8ffa3" Namespace="calico-system" Pod="calico-kube-controllers-5cc6595ff6-qlwqn" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5cc6595ff6--qlwqn-eth0" Sep 5 06:06:38.892364 containerd[1653]: 2025-09-05 06:06:38.880 [INFO][4573] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2277e0b6753f4326fc8a26498bad577612b971b82436f52b12cfffbdcfa8ffa3" Namespace="calico-system" Pod="calico-kube-controllers-5cc6595ff6-qlwqn" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5cc6595ff6--qlwqn-eth0" Sep 5 06:06:38.892364 containerd[1653]: 2025-09-05 
06:06:38.880 [INFO][4573] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2277e0b6753f4326fc8a26498bad577612b971b82436f52b12cfffbdcfa8ffa3" Namespace="calico-system" Pod="calico-kube-controllers-5cc6595ff6-qlwqn" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5cc6595ff6--qlwqn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--5cc6595ff6--qlwqn-eth0", GenerateName:"calico-kube-controllers-5cc6595ff6-", Namespace:"calico-system", SelfLink:"", UID:"0e471d34-64ab-423f-a035-550b2b064a5c", ResourceVersion:"800", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 6, 6, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5cc6595ff6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"2277e0b6753f4326fc8a26498bad577612b971b82436f52b12cfffbdcfa8ffa3", Pod:"calico-kube-controllers-5cc6595ff6-qlwqn", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calid304c133cea", MAC:"02:92:d3:ac:2b:f5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 06:06:38.892364 containerd[1653]: 2025-09-05 
06:06:38.890 [INFO][4573] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2277e0b6753f4326fc8a26498bad577612b971b82436f52b12cfffbdcfa8ffa3" Namespace="calico-system" Pod="calico-kube-controllers-5cc6595ff6-qlwqn" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5cc6595ff6--qlwqn-eth0" Sep 5 06:06:38.934149 kubelet[2951]: I0905 06:06:38.922870 2951 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-7dd87f786c-r974q" podStartSLOduration=3.262965713 podStartE2EDuration="6.89442337s" podCreationTimestamp="2025-09-05 06:06:32 +0000 UTC" firstStartedPulling="2025-09-05 06:06:33.883203726 +0000 UTC m=+45.397147294" lastFinishedPulling="2025-09-05 06:06:37.514661382 +0000 UTC m=+49.028604951" observedRunningTime="2025-09-05 06:06:37.888429326 +0000 UTC m=+49.402372895" watchObservedRunningTime="2025-09-05 06:06:38.89442337 +0000 UTC m=+50.408366941" Sep 5 06:06:38.974385 containerd[1653]: time="2025-09-05T06:06:38.974320741Z" level=info msg="connecting to shim bad1011f8e4bee203e338d00a777a8148a85b78b203703ff10b834caee55927a" address="unix:///run/containerd/s/7dd7f6afde7fa98428e4ece87d7d87609ec8b147826f4453eefb97e3fcc4c7a9" namespace=k8s.io protocol=ttrpc version=3 Sep 5 06:06:39.001707 containerd[1653]: time="2025-09-05T06:06:39.001316422Z" level=info msg="connecting to shim 2277e0b6753f4326fc8a26498bad577612b971b82436f52b12cfffbdcfa8ffa3" address="unix:///run/containerd/s/0982d2ea762ba91457a14e624a501bc3e9f784e3e0a9de3f894460db00d2fd2f" namespace=k8s.io protocol=ttrpc version=3 Sep 5 06:06:39.001386 systemd-networkd[1316]: cali43d1c128d85: Link UP Sep 5 06:06:39.001524 systemd-networkd[1316]: cali43d1c128d85: Gained carrier Sep 5 06:06:39.023432 systemd[1]: Started cri-containerd-bad1011f8e4bee203e338d00a777a8148a85b78b203703ff10b834caee55927a.scope - libcontainer container bad1011f8e4bee203e338d00a777a8148a85b78b203703ff10b834caee55927a. 
Sep 5 06:06:39.035315 containerd[1653]: 2025-09-05 06:06:38.690 [INFO][4580] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--k95rr-eth0 coredns-668d6bf9bc- kube-system 3484328b-3fb2-41ea-b562-a71f1ba2782c 801 0 2025-09-05 06:05:55 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-k95rr eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali43d1c128d85 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="c4c53df1204ef9c64920a973e50daf515c8e4003a3d34f98930425f51c3dce07" Namespace="kube-system" Pod="coredns-668d6bf9bc-k95rr" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--k95rr-" Sep 5 06:06:39.035315 containerd[1653]: 2025-09-05 06:06:38.690 [INFO][4580] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c4c53df1204ef9c64920a973e50daf515c8e4003a3d34f98930425f51c3dce07" Namespace="kube-system" Pod="coredns-668d6bf9bc-k95rr" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--k95rr-eth0" Sep 5 06:06:39.035315 containerd[1653]: 2025-09-05 06:06:38.764 [INFO][4621] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c4c53df1204ef9c64920a973e50daf515c8e4003a3d34f98930425f51c3dce07" HandleID="k8s-pod-network.c4c53df1204ef9c64920a973e50daf515c8e4003a3d34f98930425f51c3dce07" Workload="localhost-k8s-coredns--668d6bf9bc--k95rr-eth0" Sep 5 06:06:39.035315 containerd[1653]: 2025-09-05 06:06:38.764 [INFO][4621] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c4c53df1204ef9c64920a973e50daf515c8e4003a3d34f98930425f51c3dce07" HandleID="k8s-pod-network.c4c53df1204ef9c64920a973e50daf515c8e4003a3d34f98930425f51c3dce07" Workload="localhost-k8s-coredns--668d6bf9bc--k95rr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, 
HandleID:(*string)(0xc0002d58e0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-k95rr", "timestamp":"2025-09-05 06:06:38.764250949 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 5 06:06:39.035315 containerd[1653]: 2025-09-05 06:06:38.764 [INFO][4621] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 06:06:39.035315 containerd[1653]: 2025-09-05 06:06:38.873 [INFO][4621] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 5 06:06:39.035315 containerd[1653]: 2025-09-05 06:06:38.874 [INFO][4621] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 5 06:06:39.035315 containerd[1653]: 2025-09-05 06:06:38.927 [INFO][4621] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c4c53df1204ef9c64920a973e50daf515c8e4003a3d34f98930425f51c3dce07" host="localhost" Sep 5 06:06:39.035315 containerd[1653]: 2025-09-05 06:06:38.944 [INFO][4621] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 5 06:06:39.035315 containerd[1653]: 2025-09-05 06:06:38.948 [INFO][4621] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 5 06:06:39.035315 containerd[1653]: 2025-09-05 06:06:38.961 [INFO][4621] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 5 06:06:39.035315 containerd[1653]: 2025-09-05 06:06:38.965 [INFO][4621] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 5 06:06:39.035315 containerd[1653]: 2025-09-05 06:06:38.965 [INFO][4621] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.c4c53df1204ef9c64920a973e50daf515c8e4003a3d34f98930425f51c3dce07" host="localhost" Sep 
5 06:06:39.035315 containerd[1653]: 2025-09-05 06:06:38.968 [INFO][4621] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c4c53df1204ef9c64920a973e50daf515c8e4003a3d34f98930425f51c3dce07 Sep 5 06:06:39.035315 containerd[1653]: 2025-09-05 06:06:38.973 [INFO][4621] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.c4c53df1204ef9c64920a973e50daf515c8e4003a3d34f98930425f51c3dce07" host="localhost" Sep 5 06:06:39.035315 containerd[1653]: 2025-09-05 06:06:38.988 [INFO][4621] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.c4c53df1204ef9c64920a973e50daf515c8e4003a3d34f98930425f51c3dce07" host="localhost" Sep 5 06:06:39.035315 containerd[1653]: 2025-09-05 06:06:38.988 [INFO][4621] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.c4c53df1204ef9c64920a973e50daf515c8e4003a3d34f98930425f51c3dce07" host="localhost" Sep 5 06:06:39.035315 containerd[1653]: 2025-09-05 06:06:38.988 [INFO][4621] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 5 06:06:39.035315 containerd[1653]: 2025-09-05 06:06:38.988 [INFO][4621] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="c4c53df1204ef9c64920a973e50daf515c8e4003a3d34f98930425f51c3dce07" HandleID="k8s-pod-network.c4c53df1204ef9c64920a973e50daf515c8e4003a3d34f98930425f51c3dce07" Workload="localhost-k8s-coredns--668d6bf9bc--k95rr-eth0" Sep 5 06:06:39.065288 containerd[1653]: 2025-09-05 06:06:38.989 [INFO][4580] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c4c53df1204ef9c64920a973e50daf515c8e4003a3d34f98930425f51c3dce07" Namespace="kube-system" Pod="coredns-668d6bf9bc-k95rr" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--k95rr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--k95rr-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"3484328b-3fb2-41ea-b562-a71f1ba2782c", ResourceVersion:"801", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 6, 5, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-k95rr", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali43d1c128d85", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 06:06:39.065288 containerd[1653]: 2025-09-05 06:06:38.989 [INFO][4580] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="c4c53df1204ef9c64920a973e50daf515c8e4003a3d34f98930425f51c3dce07" Namespace="kube-system" Pod="coredns-668d6bf9bc-k95rr" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--k95rr-eth0" Sep 5 06:06:39.065288 containerd[1653]: 2025-09-05 06:06:38.989 [INFO][4580] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali43d1c128d85 ContainerID="c4c53df1204ef9c64920a973e50daf515c8e4003a3d34f98930425f51c3dce07" Namespace="kube-system" Pod="coredns-668d6bf9bc-k95rr" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--k95rr-eth0" Sep 5 06:06:39.065288 containerd[1653]: 2025-09-05 06:06:39.006 [INFO][4580] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c4c53df1204ef9c64920a973e50daf515c8e4003a3d34f98930425f51c3dce07" Namespace="kube-system" Pod="coredns-668d6bf9bc-k95rr" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--k95rr-eth0" Sep 5 06:06:39.065288 containerd[1653]: 2025-09-05 06:06:39.006 [INFO][4580] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c4c53df1204ef9c64920a973e50daf515c8e4003a3d34f98930425f51c3dce07" Namespace="kube-system" Pod="coredns-668d6bf9bc-k95rr" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--k95rr-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--k95rr-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"3484328b-3fb2-41ea-b562-a71f1ba2782c", ResourceVersion:"801", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 6, 5, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"c4c53df1204ef9c64920a973e50daf515c8e4003a3d34f98930425f51c3dce07", Pod:"coredns-668d6bf9bc-k95rr", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali43d1c128d85", MAC:"ae:29:c0:40:a2:41", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 06:06:39.065288 containerd[1653]: 2025-09-05 06:06:39.029 [INFO][4580] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="c4c53df1204ef9c64920a973e50daf515c8e4003a3d34f98930425f51c3dce07" Namespace="kube-system" Pod="coredns-668d6bf9bc-k95rr" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--k95rr-eth0" Sep 5 06:06:39.036317 systemd[1]: Started cri-containerd-2277e0b6753f4326fc8a26498bad577612b971b82436f52b12cfffbdcfa8ffa3.scope - libcontainer container 2277e0b6753f4326fc8a26498bad577612b971b82436f52b12cfffbdcfa8ffa3. Sep 5 06:06:39.043325 systemd-resolved[1554]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 5 06:06:39.061152 systemd-resolved[1554]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 5 06:06:39.094985 containerd[1653]: time="2025-09-05T06:06:39.094957192Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-rxtvx,Uid:d6254f86-4e8b-4f74-af07-6b68e974d028,Namespace:kube-system,Attempt:0,} returns sandbox id \"bad1011f8e4bee203e338d00a777a8148a85b78b203703ff10b834caee55927a\"" Sep 5 06:06:39.097773 containerd[1653]: time="2025-09-05T06:06:39.097752558Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5cc6595ff6-qlwqn,Uid:0e471d34-64ab-423f-a035-550b2b064a5c,Namespace:calico-system,Attempt:0,} returns sandbox id \"2277e0b6753f4326fc8a26498bad577612b971b82436f52b12cfffbdcfa8ffa3\"" Sep 5 06:06:39.110951 containerd[1653]: time="2025-09-05T06:06:39.110720312Z" level=info msg="connecting to shim c4c53df1204ef9c64920a973e50daf515c8e4003a3d34f98930425f51c3dce07" address="unix:///run/containerd/s/e20b0f3700185e703224346c71c52497426a2ab0d76ac33be52a929b1918d4fa" namespace=k8s.io protocol=ttrpc version=3 Sep 5 06:06:39.119711 containerd[1653]: time="2025-09-05T06:06:39.119528638Z" level=info msg="CreateContainer within sandbox \"bad1011f8e4bee203e338d00a777a8148a85b78b203703ff10b834caee55927a\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 5 06:06:39.153601 containerd[1653]: 
time="2025-09-05T06:06:39.153541742Z" level=info msg="Container 7e0a2bb0d3f34946154b49307b97ab5f68fe7f130fee13ba2342900c3c181a57: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:06:39.155365 systemd[1]: Started cri-containerd-c4c53df1204ef9c64920a973e50daf515c8e4003a3d34f98930425f51c3dce07.scope - libcontainer container c4c53df1204ef9c64920a973e50daf515c8e4003a3d34f98930425f51c3dce07. Sep 5 06:06:39.169385 containerd[1653]: time="2025-09-05T06:06:39.169292655Z" level=info msg="CreateContainer within sandbox \"bad1011f8e4bee203e338d00a777a8148a85b78b203703ff10b834caee55927a\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"7e0a2bb0d3f34946154b49307b97ab5f68fe7f130fee13ba2342900c3c181a57\"" Sep 5 06:06:39.170356 containerd[1653]: time="2025-09-05T06:06:39.170336179Z" level=info msg="StartContainer for \"7e0a2bb0d3f34946154b49307b97ab5f68fe7f130fee13ba2342900c3c181a57\"" Sep 5 06:06:39.171116 containerd[1653]: time="2025-09-05T06:06:39.170790786Z" level=info msg="connecting to shim 7e0a2bb0d3f34946154b49307b97ab5f68fe7f130fee13ba2342900c3c181a57" address="unix:///run/containerd/s/7dd7f6afde7fa98428e4ece87d7d87609ec8b147826f4453eefb97e3fcc4c7a9" protocol=ttrpc version=3 Sep 5 06:06:39.177153 systemd-resolved[1554]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 5 06:06:39.201394 systemd[1]: Started cri-containerd-7e0a2bb0d3f34946154b49307b97ab5f68fe7f130fee13ba2342900c3c181a57.scope - libcontainer container 7e0a2bb0d3f34946154b49307b97ab5f68fe7f130fee13ba2342900c3c181a57. 
Sep 5 06:06:39.233019 containerd[1653]: time="2025-09-05T06:06:39.232936923Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-k95rr,Uid:3484328b-3fb2-41ea-b562-a71f1ba2782c,Namespace:kube-system,Attempt:0,} returns sandbox id \"c4c53df1204ef9c64920a973e50daf515c8e4003a3d34f98930425f51c3dce07\"" Sep 5 06:06:39.235529 containerd[1653]: time="2025-09-05T06:06:39.235490594Z" level=info msg="CreateContainer within sandbox \"c4c53df1204ef9c64920a973e50daf515c8e4003a3d34f98930425f51c3dce07\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 5 06:06:39.245514 containerd[1653]: time="2025-09-05T06:06:39.245473077Z" level=info msg="Container ecd4d055b9cb0cb4bb889026391b096df18f5308e958bb4d595c8c023cd17a27: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:06:39.253933 containerd[1653]: time="2025-09-05T06:06:39.253722477Z" level=info msg="CreateContainer within sandbox \"c4c53df1204ef9c64920a973e50daf515c8e4003a3d34f98930425f51c3dce07\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"ecd4d055b9cb0cb4bb889026391b096df18f5308e958bb4d595c8c023cd17a27\"" Sep 5 06:06:39.257188 containerd[1653]: time="2025-09-05T06:06:39.256260335Z" level=info msg="StartContainer for \"7e0a2bb0d3f34946154b49307b97ab5f68fe7f130fee13ba2342900c3c181a57\" returns successfully" Sep 5 06:06:39.258895 containerd[1653]: time="2025-09-05T06:06:39.258860346Z" level=info msg="StartContainer for \"ecd4d055b9cb0cb4bb889026391b096df18f5308e958bb4d595c8c023cd17a27\"" Sep 5 06:06:39.267954 containerd[1653]: time="2025-09-05T06:06:39.267327908Z" level=info msg="connecting to shim ecd4d055b9cb0cb4bb889026391b096df18f5308e958bb4d595c8c023cd17a27" address="unix:///run/containerd/s/e20b0f3700185e703224346c71c52497426a2ab0d76ac33be52a929b1918d4fa" protocol=ttrpc version=3 Sep 5 06:06:39.324450 systemd[1]: Started cri-containerd-ecd4d055b9cb0cb4bb889026391b096df18f5308e958bb4d595c8c023cd17a27.scope - libcontainer container 
ecd4d055b9cb0cb4bb889026391b096df18f5308e958bb4d595c8c023cd17a27. Sep 5 06:06:39.338099 containerd[1653]: time="2025-09-05T06:06:39.337892366Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:06:39.338381 systemd-networkd[1316]: cali6881fffaa5c: Gained IPv6LL Sep 5 06:06:39.341840 containerd[1653]: time="2025-09-05T06:06:39.341815808Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:06:39.348945 containerd[1653]: time="2025-09-05T06:06:39.347991518Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 5 06:06:39.351330 containerd[1653]: time="2025-09-05T06:06:39.351162052Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:06:39.352060 containerd[1653]: time="2025-09-05T06:06:39.352045105Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 1.837004461s" Sep 5 06:06:39.352215 containerd[1653]: time="2025-09-05T06:06:39.352196753Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 5 06:06:39.374496 containerd[1653]: time="2025-09-05T06:06:39.374468034Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 5 06:06:39.387005 containerd[1653]: time="2025-09-05T06:06:39.386778884Z" level=info 
msg="CreateContainer within sandbox \"704353a7bcfaa70b71e4776a4c7e039cfd9c64844865ab7d71d364431f5ea4b0\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 5 06:06:39.396231 containerd[1653]: time="2025-09-05T06:06:39.396198356Z" level=info msg="StartContainer for \"ecd4d055b9cb0cb4bb889026391b096df18f5308e958bb4d595c8c023cd17a27\" returns successfully" Sep 5 06:06:39.404785 containerd[1653]: time="2025-09-05T06:06:39.404637051Z" level=info msg="Container 7f6c68e26391588487c0b26ae3860299e3760ee7f88cba662c262373253d4c8c: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:06:39.418768 containerd[1653]: time="2025-09-05T06:06:39.418736245Z" level=info msg="CreateContainer within sandbox \"704353a7bcfaa70b71e4776a4c7e039cfd9c64844865ab7d71d364431f5ea4b0\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"7f6c68e26391588487c0b26ae3860299e3760ee7f88cba662c262373253d4c8c\"" Sep 5 06:06:39.419648 containerd[1653]: time="2025-09-05T06:06:39.419630584Z" level=info msg="StartContainer for \"7f6c68e26391588487c0b26ae3860299e3760ee7f88cba662c262373253d4c8c\"" Sep 5 06:06:39.421724 containerd[1653]: time="2025-09-05T06:06:39.421696767Z" level=info msg="connecting to shim 7f6c68e26391588487c0b26ae3860299e3760ee7f88cba662c262373253d4c8c" address="unix:///run/containerd/s/41c52fd31dbe5704e8dc51b7d5e70a836d6e8574500f57c7b74f787c99932031" protocol=ttrpc version=3 Sep 5 06:06:39.444315 systemd[1]: Started cri-containerd-7f6c68e26391588487c0b26ae3860299e3760ee7f88cba662c262373253d4c8c.scope - libcontainer container 7f6c68e26391588487c0b26ae3860299e3760ee7f88cba662c262373253d4c8c. 
Sep 5 06:06:39.481378 containerd[1653]: time="2025-09-05T06:06:39.481339182Z" level=info msg="StartContainer for \"7f6c68e26391588487c0b26ae3860299e3760ee7f88cba662c262373253d4c8c\" returns successfully" Sep 5 06:06:39.626035 containerd[1653]: time="2025-09-05T06:06:39.626014702Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d9487c8bd-xllpf,Uid:4946ba55-c57c-47d1-a863-4c7bd324702d,Namespace:calico-apiserver,Attempt:0,}" Sep 5 06:06:40.004520 systemd-networkd[1316]: cali646006ee9f5: Link UP Sep 5 06:06:40.005102 systemd-networkd[1316]: cali646006ee9f5: Gained carrier Sep 5 06:06:40.025768 containerd[1653]: 2025-09-05 06:06:39.891 [INFO][4901] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--7d9487c8bd--xllpf-eth0 calico-apiserver-7d9487c8bd- calico-apiserver 4946ba55-c57c-47d1-a863-4c7bd324702d 803 0 2025-09-05 06:06:07 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7d9487c8bd projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-7d9487c8bd-xllpf eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali646006ee9f5 [] [] }} ContainerID="983c76c4c3c515be3bf1dbce240bddfaa0f5bfff832ef769ee4da55b07948e6b" Namespace="calico-apiserver" Pod="calico-apiserver-7d9487c8bd-xllpf" WorkloadEndpoint="localhost-k8s-calico--apiserver--7d9487c8bd--xllpf-" Sep 5 06:06:40.025768 containerd[1653]: 2025-09-05 06:06:39.891 [INFO][4901] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="983c76c4c3c515be3bf1dbce240bddfaa0f5bfff832ef769ee4da55b07948e6b" Namespace="calico-apiserver" Pod="calico-apiserver-7d9487c8bd-xllpf" WorkloadEndpoint="localhost-k8s-calico--apiserver--7d9487c8bd--xllpf-eth0" Sep 5 06:06:40.025768 
containerd[1653]: 2025-09-05 06:06:39.908 [INFO][4913] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="983c76c4c3c515be3bf1dbce240bddfaa0f5bfff832ef769ee4da55b07948e6b" HandleID="k8s-pod-network.983c76c4c3c515be3bf1dbce240bddfaa0f5bfff832ef769ee4da55b07948e6b" Workload="localhost-k8s-calico--apiserver--7d9487c8bd--xllpf-eth0" Sep 5 06:06:40.025768 containerd[1653]: 2025-09-05 06:06:39.908 [INFO][4913] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="983c76c4c3c515be3bf1dbce240bddfaa0f5bfff832ef769ee4da55b07948e6b" HandleID="k8s-pod-network.983c76c4c3c515be3bf1dbce240bddfaa0f5bfff832ef769ee4da55b07948e6b" Workload="localhost-k8s-calico--apiserver--7d9487c8bd--xllpf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f060), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-7d9487c8bd-xllpf", "timestamp":"2025-09-05 06:06:39.908335058 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 5 06:06:40.025768 containerd[1653]: 2025-09-05 06:06:39.908 [INFO][4913] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 06:06:40.025768 containerd[1653]: 2025-09-05 06:06:39.908 [INFO][4913] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 5 06:06:40.025768 containerd[1653]: 2025-09-05 06:06:39.908 [INFO][4913] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 5 06:06:40.025768 containerd[1653]: 2025-09-05 06:06:39.931 [INFO][4913] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.983c76c4c3c515be3bf1dbce240bddfaa0f5bfff832ef769ee4da55b07948e6b" host="localhost" Sep 5 06:06:40.025768 containerd[1653]: 2025-09-05 06:06:39.965 [INFO][4913] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 5 06:06:40.025768 containerd[1653]: 2025-09-05 06:06:39.970 [INFO][4913] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 5 06:06:40.025768 containerd[1653]: 2025-09-05 06:06:39.972 [INFO][4913] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 5 06:06:40.025768 containerd[1653]: 2025-09-05 06:06:39.982 [INFO][4913] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 5 06:06:40.025768 containerd[1653]: 2025-09-05 06:06:39.983 [INFO][4913] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.983c76c4c3c515be3bf1dbce240bddfaa0f5bfff832ef769ee4da55b07948e6b" host="localhost" Sep 5 06:06:40.025768 containerd[1653]: 2025-09-05 06:06:39.984 [INFO][4913] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.983c76c4c3c515be3bf1dbce240bddfaa0f5bfff832ef769ee4da55b07948e6b Sep 5 06:06:40.025768 containerd[1653]: 2025-09-05 06:06:39.988 [INFO][4913] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.983c76c4c3c515be3bf1dbce240bddfaa0f5bfff832ef769ee4da55b07948e6b" host="localhost" Sep 5 06:06:40.025768 containerd[1653]: 2025-09-05 06:06:40.000 [INFO][4913] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 
handle="k8s-pod-network.983c76c4c3c515be3bf1dbce240bddfaa0f5bfff832ef769ee4da55b07948e6b" host="localhost" Sep 5 06:06:40.025768 containerd[1653]: 2025-09-05 06:06:40.000 [INFO][4913] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.983c76c4c3c515be3bf1dbce240bddfaa0f5bfff832ef769ee4da55b07948e6b" host="localhost" Sep 5 06:06:40.025768 containerd[1653]: 2025-09-05 06:06:40.000 [INFO][4913] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 5 06:06:40.025768 containerd[1653]: 2025-09-05 06:06:40.000 [INFO][4913] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="983c76c4c3c515be3bf1dbce240bddfaa0f5bfff832ef769ee4da55b07948e6b" HandleID="k8s-pod-network.983c76c4c3c515be3bf1dbce240bddfaa0f5bfff832ef769ee4da55b07948e6b" Workload="localhost-k8s-calico--apiserver--7d9487c8bd--xllpf-eth0" Sep 5 06:06:40.067237 containerd[1653]: 2025-09-05 06:06:40.002 [INFO][4901] cni-plugin/k8s.go 418: Populated endpoint ContainerID="983c76c4c3c515be3bf1dbce240bddfaa0f5bfff832ef769ee4da55b07948e6b" Namespace="calico-apiserver" Pod="calico-apiserver-7d9487c8bd-xllpf" WorkloadEndpoint="localhost-k8s-calico--apiserver--7d9487c8bd--xllpf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7d9487c8bd--xllpf-eth0", GenerateName:"calico-apiserver-7d9487c8bd-", Namespace:"calico-apiserver", SelfLink:"", UID:"4946ba55-c57c-47d1-a863-4c7bd324702d", ResourceVersion:"803", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 6, 6, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7d9487c8bd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-7d9487c8bd-xllpf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali646006ee9f5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 06:06:40.067237 containerd[1653]: 2025-09-05 06:06:40.002 [INFO][4901] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="983c76c4c3c515be3bf1dbce240bddfaa0f5bfff832ef769ee4da55b07948e6b" Namespace="calico-apiserver" Pod="calico-apiserver-7d9487c8bd-xllpf" WorkloadEndpoint="localhost-k8s-calico--apiserver--7d9487c8bd--xllpf-eth0" Sep 5 06:06:40.067237 containerd[1653]: 2025-09-05 06:06:40.002 [INFO][4901] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali646006ee9f5 ContainerID="983c76c4c3c515be3bf1dbce240bddfaa0f5bfff832ef769ee4da55b07948e6b" Namespace="calico-apiserver" Pod="calico-apiserver-7d9487c8bd-xllpf" WorkloadEndpoint="localhost-k8s-calico--apiserver--7d9487c8bd--xllpf-eth0" Sep 5 06:06:40.067237 containerd[1653]: 2025-09-05 06:06:40.005 [INFO][4901] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="983c76c4c3c515be3bf1dbce240bddfaa0f5bfff832ef769ee4da55b07948e6b" Namespace="calico-apiserver" Pod="calico-apiserver-7d9487c8bd-xllpf" WorkloadEndpoint="localhost-k8s-calico--apiserver--7d9487c8bd--xllpf-eth0" Sep 5 06:06:40.067237 containerd[1653]: 2025-09-05 06:06:40.005 [INFO][4901] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID 
to endpoint ContainerID="983c76c4c3c515be3bf1dbce240bddfaa0f5bfff832ef769ee4da55b07948e6b" Namespace="calico-apiserver" Pod="calico-apiserver-7d9487c8bd-xllpf" WorkloadEndpoint="localhost-k8s-calico--apiserver--7d9487c8bd--xllpf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7d9487c8bd--xllpf-eth0", GenerateName:"calico-apiserver-7d9487c8bd-", Namespace:"calico-apiserver", SelfLink:"", UID:"4946ba55-c57c-47d1-a863-4c7bd324702d", ResourceVersion:"803", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 6, 6, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7d9487c8bd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"983c76c4c3c515be3bf1dbce240bddfaa0f5bfff832ef769ee4da55b07948e6b", Pod:"calico-apiserver-7d9487c8bd-xllpf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali646006ee9f5", MAC:"d2:e6:46:7d:e0:a8", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 06:06:40.067237 containerd[1653]: 2025-09-05 06:06:40.023 [INFO][4901] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="983c76c4c3c515be3bf1dbce240bddfaa0f5bfff832ef769ee4da55b07948e6b" Namespace="calico-apiserver" Pod="calico-apiserver-7d9487c8bd-xllpf" WorkloadEndpoint="localhost-k8s-calico--apiserver--7d9487c8bd--xllpf-eth0" Sep 5 06:06:40.169580 systemd-networkd[1316]: cali470e55824cb: Gained IPv6LL Sep 5 06:06:40.170573 containerd[1653]: time="2025-09-05T06:06:40.170511836Z" level=info msg="connecting to shim 983c76c4c3c515be3bf1dbce240bddfaa0f5bfff832ef769ee4da55b07948e6b" address="unix:///run/containerd/s/06a53f7b1e79bad62337c8e25515496a504b3f09442b62282c7f782a01fa87e6" namespace=k8s.io protocol=ttrpc version=3 Sep 5 06:06:40.197276 systemd[1]: Started cri-containerd-983c76c4c3c515be3bf1dbce240bddfaa0f5bfff832ef769ee4da55b07948e6b.scope - libcontainer container 983c76c4c3c515be3bf1dbce240bddfaa0f5bfff832ef769ee4da55b07948e6b. Sep 5 06:06:40.213075 systemd-resolved[1554]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 5 06:06:40.249511 kubelet[2951]: I0905 06:06:40.249460 2951 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-k95rr" podStartSLOduration=45.249447211 podStartE2EDuration="45.249447211s" podCreationTimestamp="2025-09-05 06:05:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-05 06:06:40.19971243 +0000 UTC m=+51.713656008" watchObservedRunningTime="2025-09-05 06:06:40.249447211 +0000 UTC m=+51.763390789" Sep 5 06:06:40.266059 containerd[1653]: time="2025-09-05T06:06:40.265906480Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d9487c8bd-xllpf,Uid:4946ba55-c57c-47d1-a863-4c7bd324702d,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"983c76c4c3c515be3bf1dbce240bddfaa0f5bfff832ef769ee4da55b07948e6b\"" Sep 5 06:06:40.273317 kubelet[2951]: I0905 06:06:40.273270 2951 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="kube-system/coredns-668d6bf9bc-rxtvx" podStartSLOduration=45.273254542 podStartE2EDuration="45.273254542s" podCreationTimestamp="2025-09-05 06:05:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-05 06:06:40.252342604 +0000 UTC m=+51.766286183" watchObservedRunningTime="2025-09-05 06:06:40.273254542 +0000 UTC m=+51.787198115" Sep 5 06:06:40.425412 systemd-networkd[1316]: cali43d1c128d85: Gained IPv6LL Sep 5 06:06:40.635990 containerd[1653]: time="2025-09-05T06:06:40.635957956Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-mzgr6,Uid:6cf24baf-cb0a-415b-9c6a-8c1d4a98d753,Namespace:calico-system,Attempt:0,}" Sep 5 06:06:40.746877 systemd-networkd[1316]: calid304c133cea: Gained IPv6LL Sep 5 06:06:40.755391 systemd-networkd[1316]: cali3b7d90189b4: Link UP Sep 5 06:06:40.756929 systemd-networkd[1316]: cali3b7d90189b4: Gained carrier Sep 5 06:06:40.770536 containerd[1653]: 2025-09-05 06:06:40.699 [INFO][4979] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--54d579b49d--mzgr6-eth0 goldmane-54d579b49d- calico-system 6cf24baf-cb0a-415b-9c6a-8c1d4a98d753 798 0 2025-09-05 06:06:09 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-54d579b49d-mzgr6 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali3b7d90189b4 [] [] }} ContainerID="ebb748076829af5dbe0bd778e91ecf58e1e76f6c30d4086a39c1815e459607dd" Namespace="calico-system" Pod="goldmane-54d579b49d-mzgr6" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--mzgr6-" Sep 5 06:06:40.770536 containerd[1653]: 2025-09-05 06:06:40.699 [INFO][4979] cni-plugin/k8s.go 74: Extracted identifiers 
for CmdAddK8s ContainerID="ebb748076829af5dbe0bd778e91ecf58e1e76f6c30d4086a39c1815e459607dd" Namespace="calico-system" Pod="goldmane-54d579b49d-mzgr6" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--mzgr6-eth0" Sep 5 06:06:40.770536 containerd[1653]: 2025-09-05 06:06:40.719 [INFO][4992] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ebb748076829af5dbe0bd778e91ecf58e1e76f6c30d4086a39c1815e459607dd" HandleID="k8s-pod-network.ebb748076829af5dbe0bd778e91ecf58e1e76f6c30d4086a39c1815e459607dd" Workload="localhost-k8s-goldmane--54d579b49d--mzgr6-eth0" Sep 5 06:06:40.770536 containerd[1653]: 2025-09-05 06:06:40.720 [INFO][4992] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ebb748076829af5dbe0bd778e91ecf58e1e76f6c30d4086a39c1815e459607dd" HandleID="k8s-pod-network.ebb748076829af5dbe0bd778e91ecf58e1e76f6c30d4086a39c1815e459607dd" Workload="localhost-k8s-goldmane--54d579b49d--mzgr6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cd960), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-54d579b49d-mzgr6", "timestamp":"2025-09-05 06:06:40.719957917 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 5 06:06:40.770536 containerd[1653]: 2025-09-05 06:06:40.720 [INFO][4992] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 06:06:40.770536 containerd[1653]: 2025-09-05 06:06:40.720 [INFO][4992] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 5 06:06:40.770536 containerd[1653]: 2025-09-05 06:06:40.720 [INFO][4992] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 5 06:06:40.770536 containerd[1653]: 2025-09-05 06:06:40.726 [INFO][4992] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ebb748076829af5dbe0bd778e91ecf58e1e76f6c30d4086a39c1815e459607dd" host="localhost" Sep 5 06:06:40.770536 containerd[1653]: 2025-09-05 06:06:40.729 [INFO][4992] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 5 06:06:40.770536 containerd[1653]: 2025-09-05 06:06:40.731 [INFO][4992] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 5 06:06:40.770536 containerd[1653]: 2025-09-05 06:06:40.733 [INFO][4992] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 5 06:06:40.770536 containerd[1653]: 2025-09-05 06:06:40.734 [INFO][4992] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 5 06:06:40.770536 containerd[1653]: 2025-09-05 06:06:40.734 [INFO][4992] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.ebb748076829af5dbe0bd778e91ecf58e1e76f6c30d4086a39c1815e459607dd" host="localhost" Sep 5 06:06:40.770536 containerd[1653]: 2025-09-05 06:06:40.735 [INFO][4992] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.ebb748076829af5dbe0bd778e91ecf58e1e76f6c30d4086a39c1815e459607dd Sep 5 06:06:40.770536 containerd[1653]: 2025-09-05 06:06:40.738 [INFO][4992] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.ebb748076829af5dbe0bd778e91ecf58e1e76f6c30d4086a39c1815e459607dd" host="localhost" Sep 5 06:06:40.770536 containerd[1653]: 2025-09-05 06:06:40.742 [INFO][4992] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 
handle="k8s-pod-network.ebb748076829af5dbe0bd778e91ecf58e1e76f6c30d4086a39c1815e459607dd" host="localhost" Sep 5 06:06:40.770536 containerd[1653]: 2025-09-05 06:06:40.742 [INFO][4992] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.ebb748076829af5dbe0bd778e91ecf58e1e76f6c30d4086a39c1815e459607dd" host="localhost" Sep 5 06:06:40.770536 containerd[1653]: 2025-09-05 06:06:40.743 [INFO][4992] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 5 06:06:40.770536 containerd[1653]: 2025-09-05 06:06:40.743 [INFO][4992] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="ebb748076829af5dbe0bd778e91ecf58e1e76f6c30d4086a39c1815e459607dd" HandleID="k8s-pod-network.ebb748076829af5dbe0bd778e91ecf58e1e76f6c30d4086a39c1815e459607dd" Workload="localhost-k8s-goldmane--54d579b49d--mzgr6-eth0" Sep 5 06:06:40.771383 containerd[1653]: 2025-09-05 06:06:40.748 [INFO][4979] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ebb748076829af5dbe0bd778e91ecf58e1e76f6c30d4086a39c1815e459607dd" Namespace="calico-system" Pod="goldmane-54d579b49d-mzgr6" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--mzgr6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--mzgr6-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"6cf24baf-cb0a-415b-9c6a-8c1d4a98d753", ResourceVersion:"798", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 6, 6, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-54d579b49d-mzgr6", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali3b7d90189b4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 06:06:40.771383 containerd[1653]: 2025-09-05 06:06:40.748 [INFO][4979] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="ebb748076829af5dbe0bd778e91ecf58e1e76f6c30d4086a39c1815e459607dd" Namespace="calico-system" Pod="goldmane-54d579b49d-mzgr6" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--mzgr6-eth0" Sep 5 06:06:40.771383 containerd[1653]: 2025-09-05 06:06:40.748 [INFO][4979] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3b7d90189b4 ContainerID="ebb748076829af5dbe0bd778e91ecf58e1e76f6c30d4086a39c1815e459607dd" Namespace="calico-system" Pod="goldmane-54d579b49d-mzgr6" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--mzgr6-eth0" Sep 5 06:06:40.771383 containerd[1653]: 2025-09-05 06:06:40.757 [INFO][4979] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ebb748076829af5dbe0bd778e91ecf58e1e76f6c30d4086a39c1815e459607dd" Namespace="calico-system" Pod="goldmane-54d579b49d-mzgr6" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--mzgr6-eth0" Sep 5 06:06:40.771383 containerd[1653]: 2025-09-05 06:06:40.757 [INFO][4979] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ebb748076829af5dbe0bd778e91ecf58e1e76f6c30d4086a39c1815e459607dd" Namespace="calico-system" Pod="goldmane-54d579b49d-mzgr6" 
WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--mzgr6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--mzgr6-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"6cf24baf-cb0a-415b-9c6a-8c1d4a98d753", ResourceVersion:"798", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 6, 6, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"ebb748076829af5dbe0bd778e91ecf58e1e76f6c30d4086a39c1815e459607dd", Pod:"goldmane-54d579b49d-mzgr6", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali3b7d90189b4", MAC:"62:53:61:bf:ab:7f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 06:06:40.771383 containerd[1653]: 2025-09-05 06:06:40.767 [INFO][4979] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ebb748076829af5dbe0bd778e91ecf58e1e76f6c30d4086a39c1815e459607dd" Namespace="calico-system" Pod="goldmane-54d579b49d-mzgr6" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--mzgr6-eth0" Sep 5 06:06:40.788078 containerd[1653]: time="2025-09-05T06:06:40.787762008Z" level=info msg="connecting to shim 
ebb748076829af5dbe0bd778e91ecf58e1e76f6c30d4086a39c1815e459607dd" address="unix:///run/containerd/s/73108186c18a9e3b4b3c476f729c250e71fbbfc4a06a91bb01b090e6149ecac3" namespace=k8s.io protocol=ttrpc version=3 Sep 5 06:06:40.810380 systemd[1]: Started cri-containerd-ebb748076829af5dbe0bd778e91ecf58e1e76f6c30d4086a39c1815e459607dd.scope - libcontainer container ebb748076829af5dbe0bd778e91ecf58e1e76f6c30d4086a39c1815e459607dd. Sep 5 06:06:40.819786 systemd-resolved[1554]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 5 06:06:40.847513 containerd[1653]: time="2025-09-05T06:06:40.847474164Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-mzgr6,Uid:6cf24baf-cb0a-415b-9c6a-8c1d4a98d753,Namespace:calico-system,Attempt:0,} returns sandbox id \"ebb748076829af5dbe0bd778e91ecf58e1e76f6c30d4086a39c1815e459607dd\"" Sep 5 06:06:41.385369 systemd-networkd[1316]: cali646006ee9f5: Gained IPv6LL Sep 5 06:06:42.153295 systemd-networkd[1316]: cali3b7d90189b4: Gained IPv6LL Sep 5 06:06:45.272982 containerd[1653]: time="2025-09-05T06:06:45.272943400Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:06:45.277641 containerd[1653]: time="2025-09-05T06:06:45.275410437Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 5 06:06:45.280511 containerd[1653]: time="2025-09-05T06:06:45.280422790Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:06:45.286484 containerd[1653]: time="2025-09-05T06:06:45.286468042Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" 
value:\"managed\"}" Sep 5 06:06:45.286787 containerd[1653]: time="2025-09-05T06:06:45.286694286Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 5.910712748s" Sep 5 06:06:45.286787 containerd[1653]: time="2025-09-05T06:06:45.286716518Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 5 06:06:45.287384 containerd[1653]: time="2025-09-05T06:06:45.287369726Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 5 06:06:45.297804 containerd[1653]: time="2025-09-05T06:06:45.297568427Z" level=info msg="CreateContainer within sandbox \"01637da4a55cb070edbf56a36812214641ff281aa8d65aa755abfbd432c63716\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 5 06:06:45.320298 containerd[1653]: time="2025-09-05T06:06:45.320274050Z" level=info msg="Container 0c8762a788f1bdc38ec56c90c5e798d4bf03b6c8a0b9d882d183db024aaf0550: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:06:45.349221 containerd[1653]: time="2025-09-05T06:06:45.349188570Z" level=info msg="CreateContainer within sandbox \"01637da4a55cb070edbf56a36812214641ff281aa8d65aa755abfbd432c63716\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"0c8762a788f1bdc38ec56c90c5e798d4bf03b6c8a0b9d882d183db024aaf0550\"" Sep 5 06:06:45.349823 containerd[1653]: time="2025-09-05T06:06:45.349590203Z" level=info msg="StartContainer for \"0c8762a788f1bdc38ec56c90c5e798d4bf03b6c8a0b9d882d183db024aaf0550\"" Sep 5 06:06:45.350416 containerd[1653]: time="2025-09-05T06:06:45.350400467Z" level=info msg="connecting to shim 
0c8762a788f1bdc38ec56c90c5e798d4bf03b6c8a0b9d882d183db024aaf0550" address="unix:///run/containerd/s/477e1b09673041479fe2edf3afdb758454ec6d8e1af2f2241d60930776db0d82" protocol=ttrpc version=3 Sep 5 06:06:45.419317 systemd[1]: Started cri-containerd-0c8762a788f1bdc38ec56c90c5e798d4bf03b6c8a0b9d882d183db024aaf0550.scope - libcontainer container 0c8762a788f1bdc38ec56c90c5e798d4bf03b6c8a0b9d882d183db024aaf0550. Sep 5 06:06:45.463065 containerd[1653]: time="2025-09-05T06:06:45.463027510Z" level=info msg="StartContainer for \"0c8762a788f1bdc38ec56c90c5e798d4bf03b6c8a0b9d882d183db024aaf0550\" returns successfully" Sep 5 06:06:46.604839 kubelet[2951]: I0905 06:06:46.604789 2951 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7d9487c8bd-m6f9c" podStartSLOduration=32.217220907 podStartE2EDuration="39.604772629s" podCreationTimestamp="2025-09-05 06:06:07 +0000 UTC" firstStartedPulling="2025-09-05 06:06:37.899667656 +0000 UTC m=+49.413611224" lastFinishedPulling="2025-09-05 06:06:45.287219377 +0000 UTC m=+56.801162946" observedRunningTime="2025-09-05 06:06:46.423922275 +0000 UTC m=+57.937865850" watchObservedRunningTime="2025-09-05 06:06:46.604772629 +0000 UTC m=+58.118716211" Sep 5 06:06:49.010150 containerd[1653]: time="2025-09-05T06:06:49.009765046Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:06:49.014446 containerd[1653]: time="2025-09-05T06:06:49.014256227Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 5 06:06:49.016941 containerd[1653]: time="2025-09-05T06:06:49.016919053Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:06:49.019296 containerd[1653]: 
time="2025-09-05T06:06:49.019274638Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:06:49.020475 containerd[1653]: time="2025-09-05T06:06:49.020452571Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 3.73306655s" Sep 5 06:06:49.020475 containerd[1653]: time="2025-09-05T06:06:49.020473793Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 5 06:06:49.056988 containerd[1653]: time="2025-09-05T06:06:49.056799169Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 5 06:06:49.108816 containerd[1653]: time="2025-09-05T06:06:49.107700025Z" level=info msg="CreateContainer within sandbox \"2277e0b6753f4326fc8a26498bad577612b971b82436f52b12cfffbdcfa8ffa3\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 5 06:06:49.140885 containerd[1653]: time="2025-09-05T06:06:49.140860621Z" level=info msg="Container 090113e890c708814e3834d0d92ddf22eaeda110d8d2e21873e5041b50aa3a66: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:06:49.165710 containerd[1653]: time="2025-09-05T06:06:49.165686240Z" level=info msg="CreateContainer within sandbox \"2277e0b6753f4326fc8a26498bad577612b971b82436f52b12cfffbdcfa8ffa3\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"090113e890c708814e3834d0d92ddf22eaeda110d8d2e21873e5041b50aa3a66\"" Sep 5 
06:06:49.166290 containerd[1653]: time="2025-09-05T06:06:49.166268283Z" level=info msg="StartContainer for \"090113e890c708814e3834d0d92ddf22eaeda110d8d2e21873e5041b50aa3a66\"" Sep 5 06:06:49.168215 containerd[1653]: time="2025-09-05T06:06:49.168102801Z" level=info msg="connecting to shim 090113e890c708814e3834d0d92ddf22eaeda110d8d2e21873e5041b50aa3a66" address="unix:///run/containerd/s/0982d2ea762ba91457a14e624a501bc3e9f784e3e0a9de3f894460db00d2fd2f" protocol=ttrpc version=3 Sep 5 06:06:49.291322 systemd[1]: Started cri-containerd-090113e890c708814e3834d0d92ddf22eaeda110d8d2e21873e5041b50aa3a66.scope - libcontainer container 090113e890c708814e3834d0d92ddf22eaeda110d8d2e21873e5041b50aa3a66. Sep 5 06:06:49.355197 containerd[1653]: time="2025-09-05T06:06:49.355109195Z" level=info msg="StartContainer for \"090113e890c708814e3834d0d92ddf22eaeda110d8d2e21873e5041b50aa3a66\" returns successfully" Sep 5 06:06:50.510789 containerd[1653]: time="2025-09-05T06:06:50.510751716Z" level=info msg="TaskExit event in podsandbox handler container_id:\"090113e890c708814e3834d0d92ddf22eaeda110d8d2e21873e5041b50aa3a66\" id:\"0591dfb26f2a832a98bd889ba3e1556e82cf872b66878da7b4c84dcc8503b9ee\" pid:5177 exited_at:{seconds:1757052410 nanos:437459996}" Sep 5 06:06:50.743311 kubelet[2951]: I0905 06:06:50.740002 2951 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-5cc6595ff6-qlwqn" podStartSLOduration=30.798800877 podStartE2EDuration="40.722925249s" podCreationTimestamp="2025-09-05 06:06:10 +0000 UTC" firstStartedPulling="2025-09-05 06:06:39.113030324 +0000 UTC m=+50.626973895" lastFinishedPulling="2025-09-05 06:06:49.037154696 +0000 UTC m=+60.551098267" observedRunningTime="2025-09-05 06:06:50.378992097 +0000 UTC m=+61.892935675" watchObservedRunningTime="2025-09-05 06:06:50.722925249 +0000 UTC m=+62.236868822" Sep 5 06:06:50.964472 containerd[1653]: time="2025-09-05T06:06:50.964432450Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:06:50.964997 containerd[1653]: time="2025-09-05T06:06:50.964975402Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 5 06:06:50.965684 containerd[1653]: time="2025-09-05T06:06:50.965641219Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:06:50.967026 containerd[1653]: time="2025-09-05T06:06:50.966998177Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:06:50.967561 containerd[1653]: time="2025-09-05T06:06:50.967541838Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 1.910703099s" Sep 5 06:06:50.968687 containerd[1653]: time="2025-09-05T06:06:50.967561555Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 5 06:06:50.978245 containerd[1653]: time="2025-09-05T06:06:50.978051476Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 5 06:06:50.986451 containerd[1653]: time="2025-09-05T06:06:50.986419548Z" level=info msg="CreateContainer within sandbox \"704353a7bcfaa70b71e4776a4c7e039cfd9c64844865ab7d71d364431f5ea4b0\" for container 
&ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 5 06:06:50.994370 containerd[1653]: time="2025-09-05T06:06:50.994333155Z" level=info msg="Container a0d0b531b1fb483a3ffea2b75605a46e85a611ebe9a182aa853169f4db3bdb4b: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:06:51.056185 containerd[1653]: time="2025-09-05T06:06:51.056097905Z" level=info msg="CreateContainer within sandbox \"704353a7bcfaa70b71e4776a4c7e039cfd9c64844865ab7d71d364431f5ea4b0\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"a0d0b531b1fb483a3ffea2b75605a46e85a611ebe9a182aa853169f4db3bdb4b\"" Sep 5 06:06:51.197217 containerd[1653]: time="2025-09-05T06:06:51.196786699Z" level=info msg="StartContainer for \"a0d0b531b1fb483a3ffea2b75605a46e85a611ebe9a182aa853169f4db3bdb4b\"" Sep 5 06:06:51.203854 containerd[1653]: time="2025-09-05T06:06:51.203828675Z" level=info msg="connecting to shim a0d0b531b1fb483a3ffea2b75605a46e85a611ebe9a182aa853169f4db3bdb4b" address="unix:///run/containerd/s/41c52fd31dbe5704e8dc51b7d5e70a836d6e8574500f57c7b74f787c99932031" protocol=ttrpc version=3 Sep 5 06:06:51.221344 systemd[1]: Started cri-containerd-a0d0b531b1fb483a3ffea2b75605a46e85a611ebe9a182aa853169f4db3bdb4b.scope - libcontainer container a0d0b531b1fb483a3ffea2b75605a46e85a611ebe9a182aa853169f4db3bdb4b. 
Sep 5 06:06:51.301706 containerd[1653]: time="2025-09-05T06:06:51.301641042Z" level=info msg="StartContainer for \"a0d0b531b1fb483a3ffea2b75605a46e85a611ebe9a182aa853169f4db3bdb4b\" returns successfully" Sep 5 06:06:51.482699 containerd[1653]: time="2025-09-05T06:06:51.482244574Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:06:51.483793 containerd[1653]: time="2025-09-05T06:06:51.483768222Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 5 06:06:51.486306 containerd[1653]: time="2025-09-05T06:06:51.486017445Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 507.932699ms" Sep 5 06:06:51.486306 containerd[1653]: time="2025-09-05T06:06:51.486040485Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 5 06:06:51.490158 containerd[1653]: time="2025-09-05T06:06:51.490118940Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 5 06:06:51.492659 containerd[1653]: time="2025-09-05T06:06:51.492629593Z" level=info msg="CreateContainer within sandbox \"983c76c4c3c515be3bf1dbce240bddfaa0f5bfff832ef769ee4da55b07948e6b\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 5 06:06:51.507182 containerd[1653]: time="2025-09-05T06:06:51.506765284Z" level=info msg="Container ef74339747ddcbe40315616e6f25edf594057afe5bf1acd37221bcdf7f713433: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:06:51.520904 systemd[1]: 
var-lib-containerd-tmpmounts-containerd\x2dmount1464219544.mount: Deactivated successfully. Sep 5 06:06:51.532700 containerd[1653]: time="2025-09-05T06:06:51.532667926Z" level=info msg="CreateContainer within sandbox \"983c76c4c3c515be3bf1dbce240bddfaa0f5bfff832ef769ee4da55b07948e6b\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"ef74339747ddcbe40315616e6f25edf594057afe5bf1acd37221bcdf7f713433\"" Sep 5 06:06:51.533971 containerd[1653]: time="2025-09-05T06:06:51.533516761Z" level=info msg="StartContainer for \"ef74339747ddcbe40315616e6f25edf594057afe5bf1acd37221bcdf7f713433\"" Sep 5 06:06:51.534656 containerd[1653]: time="2025-09-05T06:06:51.534629959Z" level=info msg="connecting to shim ef74339747ddcbe40315616e6f25edf594057afe5bf1acd37221bcdf7f713433" address="unix:///run/containerd/s/06a53f7b1e79bad62337c8e25515496a504b3f09442b62282c7f782a01fa87e6" protocol=ttrpc version=3 Sep 5 06:06:51.542567 kubelet[2951]: I0905 06:06:51.542399 2951 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-zmpmp" podStartSLOduration=28.357214538 podStartE2EDuration="42.542383597s" podCreationTimestamp="2025-09-05 06:06:09 +0000 UTC" firstStartedPulling="2025-09-05 06:06:36.792500392 +0000 UTC m=+48.306443961" lastFinishedPulling="2025-09-05 06:06:50.977669451 +0000 UTC m=+62.491613020" observedRunningTime="2025-09-05 06:06:51.537443005 +0000 UTC m=+63.051386593" watchObservedRunningTime="2025-09-05 06:06:51.542383597 +0000 UTC m=+63.056327175" Sep 5 06:06:51.557276 systemd[1]: Started cri-containerd-ef74339747ddcbe40315616e6f25edf594057afe5bf1acd37221bcdf7f713433.scope - libcontainer container ef74339747ddcbe40315616e6f25edf594057afe5bf1acd37221bcdf7f713433. 
Sep 5 06:06:51.603802 containerd[1653]: time="2025-09-05T06:06:51.603007132Z" level=info msg="StartContainer for \"ef74339747ddcbe40315616e6f25edf594057afe5bf1acd37221bcdf7f713433\" returns successfully" Sep 5 06:06:51.897489 kubelet[2951]: I0905 06:06:51.897432 2951 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 5 06:06:51.900494 kubelet[2951]: I0905 06:06:51.900435 2951 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 5 06:06:52.534670 kubelet[2951]: I0905 06:06:52.534517 2951 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7d9487c8bd-xllpf" podStartSLOduration=34.319523503 podStartE2EDuration="45.534504008s" podCreationTimestamp="2025-09-05 06:06:07 +0000 UTC" firstStartedPulling="2025-09-05 06:06:40.274229513 +0000 UTC m=+51.788173085" lastFinishedPulling="2025-09-05 06:06:51.489210023 +0000 UTC m=+63.003153590" observedRunningTime="2025-09-05 06:06:52.534011131 +0000 UTC m=+64.047954714" watchObservedRunningTime="2025-09-05 06:06:52.534504008 +0000 UTC m=+64.048447591" Sep 5 06:06:53.593187 kubelet[2951]: I0905 06:06:53.593042 2951 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 5 06:06:56.651815 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount910588604.mount: Deactivated successfully. 
Sep 5 06:06:57.697568 containerd[1653]: time="2025-09-05T06:06:57.689766513Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:06:57.739384 containerd[1653]: time="2025-09-05T06:06:57.694263944Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 5 06:06:57.762801 containerd[1653]: time="2025-09-05T06:06:57.762759080Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:06:57.778293 containerd[1653]: time="2025-09-05T06:06:57.778248984Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:06:57.780646 containerd[1653]: time="2025-09-05T06:06:57.780560046Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 6.290418826s" Sep 5 06:06:57.780646 containerd[1653]: time="2025-09-05T06:06:57.780582942Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 5 06:06:57.919908 containerd[1653]: time="2025-09-05T06:06:57.919200788Z" level=info msg="CreateContainer within sandbox \"ebb748076829af5dbe0bd778e91ecf58e1e76f6c30d4086a39c1815e459607dd\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 5 06:06:57.944011 containerd[1653]: time="2025-09-05T06:06:57.943920542Z" level=info 
msg="Container 26e018c6431f573c85bb2852b91eca1bdd5014e61dc61177d0b2d071895ad414: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:06:57.958712 containerd[1653]: time="2025-09-05T06:06:57.958620173Z" level=info msg="CreateContainer within sandbox \"ebb748076829af5dbe0bd778e91ecf58e1e76f6c30d4086a39c1815e459607dd\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"26e018c6431f573c85bb2852b91eca1bdd5014e61dc61177d0b2d071895ad414\"" Sep 5 06:06:57.961197 containerd[1653]: time="2025-09-05T06:06:57.960898466Z" level=info msg="StartContainer for \"26e018c6431f573c85bb2852b91eca1bdd5014e61dc61177d0b2d071895ad414\"" Sep 5 06:06:57.962409 containerd[1653]: time="2025-09-05T06:06:57.962391255Z" level=info msg="connecting to shim 26e018c6431f573c85bb2852b91eca1bdd5014e61dc61177d0b2d071895ad414" address="unix:///run/containerd/s/73108186c18a9e3b4b3c476f729c250e71fbbfc4a06a91bb01b090e6149ecac3" protocol=ttrpc version=3 Sep 5 06:06:57.992327 systemd[1]: Started cri-containerd-26e018c6431f573c85bb2852b91eca1bdd5014e61dc61177d0b2d071895ad414.scope - libcontainer container 26e018c6431f573c85bb2852b91eca1bdd5014e61dc61177d0b2d071895ad414. 
Sep 5 06:06:58.085997 containerd[1653]: time="2025-09-05T06:06:58.085957476Z" level=info msg="StartContainer for \"26e018c6431f573c85bb2852b91eca1bdd5014e61dc61177d0b2d071895ad414\" returns successfully" Sep 5 06:06:59.792911 kubelet[2951]: I0905 06:06:59.775663 2951 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 5 06:06:59.842181 kubelet[2951]: I0905 06:06:59.787492 2951 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-mzgr6" podStartSLOduration=33.654989742 podStartE2EDuration="50.666175665s" podCreationTimestamp="2025-09-05 06:06:09 +0000 UTC" firstStartedPulling="2025-09-05 06:06:40.848280165 +0000 UTC m=+52.362223739" lastFinishedPulling="2025-09-05 06:06:57.859466091 +0000 UTC m=+69.373409662" observedRunningTime="2025-09-05 06:06:59.558684396 +0000 UTC m=+71.072627983" watchObservedRunningTime="2025-09-05 06:06:59.666175665 +0000 UTC m=+71.180119235" Sep 5 06:07:00.517820 containerd[1653]: time="2025-09-05T06:07:00.517773785Z" level=info msg="TaskExit event in podsandbox handler container_id:\"26e018c6431f573c85bb2852b91eca1bdd5014e61dc61177d0b2d071895ad414\" id:\"40dd207554396c5304da0f91e2442b42c4cd92ec26f7ef22620f47e6708bb591\" pid:5332 exited_at:{seconds:1757052420 nanos:512024041}" Sep 5 06:07:03.703199 containerd[1653]: time="2025-09-05T06:07:03.703149402Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e0c9fe66cc2185b34ee04a2381b1ab17ba9c5f8a8a6b8002ec928431171b3458\" id:\"ae9dc429a598a95a0c1e98ca6e44caee79a413e4d35c46140863e56950e92b4a\" pid:5358 exited_at:{seconds:1757052423 nanos:702962264}" Sep 5 06:07:16.525204 containerd[1653]: time="2025-09-05T06:07:16.525156646Z" level=info msg="TaskExit event in podsandbox handler container_id:\"090113e890c708814e3834d0d92ddf22eaeda110d8d2e21873e5041b50aa3a66\" id:\"a002094aaf9a8d110588db05af03d86754fe5665535fa47aa45389e315cf7da9\" pid:5394 exited_at:{seconds:1757052436 nanos:519350822}" Sep 5 
06:07:20.555966 containerd[1653]: time="2025-09-05T06:07:20.555932959Z" level=info msg="TaskExit event in podsandbox handler container_id:\"090113e890c708814e3834d0d92ddf22eaeda110d8d2e21873e5041b50aa3a66\" id:\"0f0f838aa02a8a034830ac8cac6a7001265dd18a0c2d9c365c31753ccdb5c0da\" pid:5415 exited_at:{seconds:1757052440 nanos:555650882}" Sep 5 06:07:21.327602 systemd[1]: Started sshd@7-139.178.70.103:22-139.178.89.65:52188.service - OpenSSH per-connection server daemon (139.178.89.65:52188). Sep 5 06:07:21.528098 sshd[5446]: Accepted publickey for core from 139.178.89.65 port 52188 ssh2: RSA SHA256:lOVrTMusnG1HPVwly0GqcQwnv433awSMb53jBwYWH18 Sep 5 06:07:21.531067 sshd-session[5446]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:07:21.538343 systemd-logind[1625]: New session 10 of user core. Sep 5 06:07:21.544283 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 5 06:07:22.167298 sshd[5449]: Connection closed by 139.178.89.65 port 52188 Sep 5 06:07:22.167518 sshd-session[5446]: pam_unix(sshd:session): session closed for user core Sep 5 06:07:22.175361 systemd-logind[1625]: Session 10 logged out. Waiting for processes to exit. Sep 5 06:07:22.175739 systemd[1]: sshd@7-139.178.70.103:22-139.178.89.65:52188.service: Deactivated successfully. Sep 5 06:07:22.177669 systemd[1]: session-10.scope: Deactivated successfully. Sep 5 06:07:22.179523 systemd-logind[1625]: Removed session 10. Sep 5 06:07:27.182007 systemd[1]: Started sshd@8-139.178.70.103:22-139.178.89.65:52200.service - OpenSSH per-connection server daemon (139.178.89.65:52200). Sep 5 06:07:27.301097 sshd[5466]: Accepted publickey for core from 139.178.89.65 port 52200 ssh2: RSA SHA256:lOVrTMusnG1HPVwly0GqcQwnv433awSMb53jBwYWH18 Sep 5 06:07:27.302813 sshd-session[5466]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:07:27.307459 systemd-logind[1625]: New session 11 of user core. 
Sep 5 06:07:27.314315 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 5 06:07:27.631677 sshd[5469]: Connection closed by 139.178.89.65 port 52200 Sep 5 06:07:27.632059 sshd-session[5466]: pam_unix(sshd:session): session closed for user core Sep 5 06:07:27.635420 systemd[1]: sshd@8-139.178.70.103:22-139.178.89.65:52200.service: Deactivated successfully. Sep 5 06:07:27.636956 systemd[1]: session-11.scope: Deactivated successfully. Sep 5 06:07:27.638611 systemd-logind[1625]: Session 11 logged out. Waiting for processes to exit. Sep 5 06:07:27.642342 systemd-logind[1625]: Removed session 11. Sep 5 06:07:28.126813 containerd[1653]: time="2025-09-05T06:07:28.126774351Z" level=info msg="TaskExit event in podsandbox handler container_id:\"26e018c6431f573c85bb2852b91eca1bdd5014e61dc61177d0b2d071895ad414\" id:\"895d2f19275177b8c58d8c7900df8ff04fbcd2db6c12db3f01c8e8e4081313fb\" pid:5489 exited_at:{seconds:1757052448 nanos:126510288}" Sep 5 06:07:30.986005 containerd[1653]: time="2025-09-05T06:07:30.985977541Z" level=info msg="TaskExit event in podsandbox handler container_id:\"26e018c6431f573c85bb2852b91eca1bdd5014e61dc61177d0b2d071895ad414\" id:\"bb87e81c78eadd69ef1d185328a2140c0a14f3fdcda091221a53fc0a3514419c\" pid:5515 exited_at:{seconds:1757052450 nanos:985309586}" Sep 5 06:07:32.640319 systemd[1]: Started sshd@9-139.178.70.103:22-139.178.89.65:45428.service - OpenSSH per-connection server daemon (139.178.89.65:45428). Sep 5 06:07:33.382224 sshd[5527]: Accepted publickey for core from 139.178.89.65 port 45428 ssh2: RSA SHA256:lOVrTMusnG1HPVwly0GqcQwnv433awSMb53jBwYWH18 Sep 5 06:07:33.397327 sshd-session[5527]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:07:33.405753 systemd-logind[1625]: New session 12 of user core. Sep 5 06:07:33.419309 systemd[1]: Started session-12.scope - Session 12 of User core. 
Sep 5 06:07:34.010888 sshd[5557]: Connection closed by 139.178.89.65 port 45428 Sep 5 06:07:34.013627 sshd-session[5527]: pam_unix(sshd:session): session closed for user core Sep 5 06:07:34.018709 systemd[1]: sshd@9-139.178.70.103:22-139.178.89.65:45428.service: Deactivated successfully. Sep 5 06:07:34.021435 systemd[1]: session-12.scope: Deactivated successfully. Sep 5 06:07:34.040455 systemd-logind[1625]: Session 12 logged out. Waiting for processes to exit. Sep 5 06:07:34.041564 systemd[1]: Started sshd@10-139.178.70.103:22-139.178.89.65:45444.service - OpenSSH per-connection server daemon (139.178.89.65:45444). Sep 5 06:07:34.043739 systemd-logind[1625]: Removed session 12. Sep 5 06:07:34.298229 sshd[5570]: Accepted publickey for core from 139.178.89.65 port 45444 ssh2: RSA SHA256:lOVrTMusnG1HPVwly0GqcQwnv433awSMb53jBwYWH18 Sep 5 06:07:34.298209 sshd-session[5570]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:07:34.308840 systemd-logind[1625]: New session 13 of user core. Sep 5 06:07:34.311294 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 5 06:07:34.612290 containerd[1653]: time="2025-09-05T06:07:34.612242044Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e0c9fe66cc2185b34ee04a2381b1ab17ba9c5f8a8a6b8002ec928431171b3458\" id:\"f079123ac366470c6133608af515e78f9a7b402dce4bea5174a1621f3c38bd6a\" pid:5545 exited_at:{seconds:1757052454 nanos:605389003}" Sep 5 06:07:34.635306 sshd[5573]: Connection closed by 139.178.89.65 port 45444 Sep 5 06:07:34.644111 sshd-session[5570]: pam_unix(sshd:session): session closed for user core Sep 5 06:07:34.655227 systemd[1]: Started sshd@11-139.178.70.103:22-139.178.89.65:45446.service - OpenSSH per-connection server daemon (139.178.89.65:45446). Sep 5 06:07:34.663638 systemd[1]: sshd@10-139.178.70.103:22-139.178.89.65:45444.service: Deactivated successfully. Sep 5 06:07:34.667893 systemd[1]: session-13.scope: Deactivated successfully. 
Sep 5 06:07:34.673047 systemd-logind[1625]: Session 13 logged out. Waiting for processes to exit. Sep 5 06:07:34.677722 systemd-logind[1625]: Removed session 13. Sep 5 06:07:34.753938 sshd[5580]: Accepted publickey for core from 139.178.89.65 port 45446 ssh2: RSA SHA256:lOVrTMusnG1HPVwly0GqcQwnv433awSMb53jBwYWH18 Sep 5 06:07:34.756332 sshd-session[5580]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:07:34.762220 systemd-logind[1625]: New session 14 of user core. Sep 5 06:07:34.768424 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 5 06:07:34.978713 sshd[5586]: Connection closed by 139.178.89.65 port 45446 Sep 5 06:07:34.984156 systemd[1]: sshd@11-139.178.70.103:22-139.178.89.65:45446.service: Deactivated successfully. Sep 5 06:07:34.979670 sshd-session[5580]: pam_unix(sshd:session): session closed for user core Sep 5 06:07:34.985502 systemd[1]: session-14.scope: Deactivated successfully. Sep 5 06:07:34.986076 systemd-logind[1625]: Session 14 logged out. Waiting for processes to exit. Sep 5 06:07:34.987091 systemd-logind[1625]: Removed session 14. Sep 5 06:07:39.992615 systemd[1]: Started sshd@12-139.178.70.103:22-139.178.89.65:59802.service - OpenSSH per-connection server daemon (139.178.89.65:59802). Sep 5 06:07:40.067929 sshd[5607]: Accepted publickey for core from 139.178.89.65 port 59802 ssh2: RSA SHA256:lOVrTMusnG1HPVwly0GqcQwnv433awSMb53jBwYWH18 Sep 5 06:07:40.068936 sshd-session[5607]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:07:40.072297 systemd-logind[1625]: New session 15 of user core. Sep 5 06:07:40.074274 systemd[1]: Started session-15.scope - Session 15 of User core. 
Sep 5 06:07:40.240538 sshd[5610]: Connection closed by 139.178.89.65 port 59802 Sep 5 06:07:40.240684 sshd-session[5607]: pam_unix(sshd:session): session closed for user core Sep 5 06:07:40.243984 systemd[1]: sshd@12-139.178.70.103:22-139.178.89.65:59802.service: Deactivated successfully. Sep 5 06:07:40.247989 systemd[1]: session-15.scope: Deactivated successfully. Sep 5 06:07:40.250381 systemd-logind[1625]: Session 15 logged out. Waiting for processes to exit. Sep 5 06:07:40.251450 systemd-logind[1625]: Removed session 15. Sep 5 06:07:45.250351 systemd[1]: Started sshd@13-139.178.70.103:22-139.178.89.65:59814.service - OpenSSH per-connection server daemon (139.178.89.65:59814). Sep 5 06:07:45.398050 sshd[5622]: Accepted publickey for core from 139.178.89.65 port 59814 ssh2: RSA SHA256:lOVrTMusnG1HPVwly0GqcQwnv433awSMb53jBwYWH18 Sep 5 06:07:45.399942 sshd-session[5622]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:07:45.405379 systemd-logind[1625]: New session 16 of user core. Sep 5 06:07:45.410357 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 5 06:07:45.847737 sshd[5625]: Connection closed by 139.178.89.65 port 59814 Sep 5 06:07:45.846859 sshd-session[5622]: pam_unix(sshd:session): session closed for user core Sep 5 06:07:45.850922 systemd[1]: sshd@13-139.178.70.103:22-139.178.89.65:59814.service: Deactivated successfully. Sep 5 06:07:45.853070 systemd[1]: session-16.scope: Deactivated successfully. Sep 5 06:07:45.854027 systemd-logind[1625]: Session 16 logged out. Waiting for processes to exit. Sep 5 06:07:45.856150 systemd-logind[1625]: Removed session 16. Sep 5 06:07:50.857126 systemd[1]: Started sshd@14-139.178.70.103:22-139.178.89.65:43876.service - OpenSSH per-connection server daemon (139.178.89.65:43876). 
Sep 5 06:07:50.974459 containerd[1653]: time="2025-09-05T06:07:50.974423516Z" level=info msg="TaskExit event in podsandbox handler container_id:\"090113e890c708814e3834d0d92ddf22eaeda110d8d2e21873e5041b50aa3a66\" id:\"dd7a3df478a82ede22896fc836d82d4853c5b8e1ab4bf722b3a5ad604eb8b4d0\" pid:5650 exited_at:{seconds:1757052470 nanos:973414108}"
Sep 5 06:07:51.111257 sshd[5658]: Accepted publickey for core from 139.178.89.65 port 43876 ssh2: RSA SHA256:lOVrTMusnG1HPVwly0GqcQwnv433awSMb53jBwYWH18
Sep 5 06:07:51.122041 sshd-session[5658]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 5 06:07:51.125505 systemd-logind[1625]: New session 17 of user core.
Sep 5 06:07:51.130282 systemd[1]: Started session-17.scope - Session 17 of User core.
Sep 5 06:07:54.092020 sshd[5664]: Connection closed by 139.178.89.65 port 43876
Sep 5 06:07:54.095024 sshd-session[5658]: pam_unix(sshd:session): session closed for user core
Sep 5 06:07:54.105441 systemd[1]: sshd@14-139.178.70.103:22-139.178.89.65:43876.service: Deactivated successfully.
Sep 5 06:07:54.108162 systemd[1]: session-17.scope: Deactivated successfully.
Sep 5 06:07:54.110127 systemd-logind[1625]: Session 17 logged out. Waiting for processes to exit.
Sep 5 06:07:54.113343 systemd-logind[1625]: Removed session 17.
Sep 5 06:07:59.109848 systemd[1]: Started sshd@15-139.178.70.103:22-139.178.89.65:43880.service - OpenSSH per-connection server daemon (139.178.89.65:43880).
Sep 5 06:07:59.304581 sshd[5688]: Accepted publickey for core from 139.178.89.65 port 43880 ssh2: RSA SHA256:lOVrTMusnG1HPVwly0GqcQwnv433awSMb53jBwYWH18
Sep 5 06:07:59.305576 sshd-session[5688]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 5 06:07:59.308931 systemd-logind[1625]: New session 18 of user core.
Sep 5 06:07:59.316273 systemd[1]: Started session-18.scope - Session 18 of User core.
Sep 5 06:07:59.845519 sshd[5691]: Connection closed by 139.178.89.65 port 43880
Sep 5 06:07:59.845720 sshd-session[5688]: pam_unix(sshd:session): session closed for user core
Sep 5 06:07:59.853483 systemd[1]: sshd@15-139.178.70.103:22-139.178.89.65:43880.service: Deactivated successfully.
Sep 5 06:07:59.855485 systemd[1]: session-18.scope: Deactivated successfully.
Sep 5 06:07:59.857606 systemd-logind[1625]: Session 18 logged out. Waiting for processes to exit.
Sep 5 06:07:59.862327 systemd[1]: Started sshd@16-139.178.70.103:22-139.178.89.65:43888.service - OpenSSH per-connection server daemon (139.178.89.65:43888).
Sep 5 06:07:59.863223 systemd-logind[1625]: Removed session 18.
Sep 5 06:07:59.922806 sshd[5703]: Accepted publickey for core from 139.178.89.65 port 43888 ssh2: RSA SHA256:lOVrTMusnG1HPVwly0GqcQwnv433awSMb53jBwYWH18
Sep 5 06:07:59.923752 sshd-session[5703]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 5 06:07:59.927900 systemd-logind[1625]: New session 19 of user core.
Sep 5 06:07:59.931276 systemd[1]: Started session-19.scope - Session 19 of User core.
Sep 5 06:08:00.750425 sshd[5706]: Connection closed by 139.178.89.65 port 43888
Sep 5 06:08:00.751622 sshd-session[5703]: pam_unix(sshd:session): session closed for user core
Sep 5 06:08:00.759781 systemd[1]: sshd@16-139.178.70.103:22-139.178.89.65:43888.service: Deactivated successfully.
Sep 5 06:08:00.762077 systemd[1]: session-19.scope: Deactivated successfully.
Sep 5 06:08:00.763021 systemd-logind[1625]: Session 19 logged out. Waiting for processes to exit.
Sep 5 06:08:00.765660 systemd[1]: Started sshd@17-139.178.70.103:22-139.178.89.65:39352.service - OpenSSH per-connection server daemon (139.178.89.65:39352).
Sep 5 06:08:00.768119 systemd-logind[1625]: Removed session 19.
Sep 5 06:08:00.884404 sshd[5738]: Accepted publickey for core from 139.178.89.65 port 39352 ssh2: RSA SHA256:lOVrTMusnG1HPVwly0GqcQwnv433awSMb53jBwYWH18
Sep 5 06:08:00.885827 sshd-session[5738]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 5 06:08:00.890305 systemd-logind[1625]: New session 20 of user core.
Sep 5 06:08:00.893337 systemd[1]: Started session-20.scope - Session 20 of User core.
Sep 5 06:08:00.971141 containerd[1653]: time="2025-09-05T06:08:00.966023503Z" level=info msg="TaskExit event in podsandbox handler container_id:\"26e018c6431f573c85bb2852b91eca1bdd5014e61dc61177d0b2d071895ad414\" id:\"2c32328bcf7b29c9417e9033fff13a2cd9f883035eaa58a4b3ac61498f08e709\" pid:5723 exited_at:{seconds:1757052480 nanos:913756418}"
Sep 5 06:08:01.742393 sshd[5741]: Connection closed by 139.178.89.65 port 39352
Sep 5 06:08:01.758781 sshd-session[5738]: pam_unix(sshd:session): session closed for user core
Sep 5 06:08:01.786621 systemd[1]: Started sshd@18-139.178.70.103:22-139.178.89.65:39354.service - OpenSSH per-connection server daemon (139.178.89.65:39354).
Sep 5 06:08:01.788863 systemd[1]: sshd@17-139.178.70.103:22-139.178.89.65:39352.service: Deactivated successfully.
Sep 5 06:08:01.794606 systemd[1]: session-20.scope: Deactivated successfully.
Sep 5 06:08:01.806198 systemd-logind[1625]: Session 20 logged out. Waiting for processes to exit.
Sep 5 06:08:01.815589 systemd-logind[1625]: Removed session 20.
Sep 5 06:08:01.911949 sshd[5755]: Accepted publickey for core from 139.178.89.65 port 39354 ssh2: RSA SHA256:lOVrTMusnG1HPVwly0GqcQwnv433awSMb53jBwYWH18
Sep 5 06:08:01.913737 sshd-session[5755]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 5 06:08:01.918292 systemd-logind[1625]: New session 21 of user core.
Sep 5 06:08:01.922391 systemd[1]: Started session-21.scope - Session 21 of User core.
Sep 5 06:08:02.778600 sshd[5765]: Connection closed by 139.178.89.65 port 39354
Sep 5 06:08:02.779087 sshd-session[5755]: pam_unix(sshd:session): session closed for user core
Sep 5 06:08:02.789609 systemd[1]: sshd@18-139.178.70.103:22-139.178.89.65:39354.service: Deactivated successfully.
Sep 5 06:08:02.792538 systemd[1]: session-21.scope: Deactivated successfully.
Sep 5 06:08:02.793723 systemd-logind[1625]: Session 21 logged out. Waiting for processes to exit.
Sep 5 06:08:02.796115 systemd-logind[1625]: Removed session 21.
Sep 5 06:08:02.798677 systemd[1]: Started sshd@19-139.178.70.103:22-139.178.89.65:39366.service - OpenSSH per-connection server daemon (139.178.89.65:39366).
Sep 5 06:08:02.930032 sshd[5775]: Accepted publickey for core from 139.178.89.65 port 39366 ssh2: RSA SHA256:lOVrTMusnG1HPVwly0GqcQwnv433awSMb53jBwYWH18
Sep 5 06:08:02.933367 sshd-session[5775]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 5 06:08:02.941786 systemd-logind[1625]: New session 22 of user core.
Sep 5 06:08:02.949341 systemd[1]: Started session-22.scope - Session 22 of User core.
Sep 5 06:08:04.203442 sshd[5784]: Connection closed by 139.178.89.65 port 39366
Sep 5 06:08:04.204734 sshd-session[5775]: pam_unix(sshd:session): session closed for user core
Sep 5 06:08:04.211279 systemd-logind[1625]: Session 22 logged out. Waiting for processes to exit.
Sep 5 06:08:04.212593 systemd[1]: sshd@19-139.178.70.103:22-139.178.89.65:39366.service: Deactivated successfully.
Sep 5 06:08:04.219396 systemd[1]: session-22.scope: Deactivated successfully.
Sep 5 06:08:04.228359 systemd-logind[1625]: Removed session 22.
Sep 5 06:08:04.578925 containerd[1653]: time="2025-09-05T06:08:04.578621097Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e0c9fe66cc2185b34ee04a2381b1ab17ba9c5f8a8a6b8002ec928431171b3458\" id:\"50ddabf4031de3393e565f49370a4ab4aacdd44b54b2f3eeff9695f2c90f8e36\" pid:5791 exited_at:{seconds:1757052484 nanos:558907580}"
Sep 5 06:08:09.212969 systemd[1]: Started sshd@20-139.178.70.103:22-139.178.89.65:39378.service - OpenSSH per-connection server daemon (139.178.89.65:39378).
Sep 5 06:08:09.362666 sshd[5840]: Accepted publickey for core from 139.178.89.65 port 39378 ssh2: RSA SHA256:lOVrTMusnG1HPVwly0GqcQwnv433awSMb53jBwYWH18
Sep 5 06:08:09.367729 sshd-session[5840]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 5 06:08:09.374297 systemd-logind[1625]: New session 23 of user core.
Sep 5 06:08:09.381789 systemd[1]: Started session-23.scope - Session 23 of User core.
Sep 5 06:08:09.924917 sshd[5843]: Connection closed by 139.178.89.65 port 39378
Sep 5 06:08:09.925357 sshd-session[5840]: pam_unix(sshd:session): session closed for user core
Sep 5 06:08:09.934530 systemd-logind[1625]: Session 23 logged out. Waiting for processes to exit.
Sep 5 06:08:09.934718 systemd[1]: sshd@20-139.178.70.103:22-139.178.89.65:39378.service: Deactivated successfully.
Sep 5 06:08:09.937335 systemd[1]: session-23.scope: Deactivated successfully.
Sep 5 06:08:09.940722 systemd-logind[1625]: Removed session 23.
Sep 5 06:08:14.958959 systemd[1]: Started sshd@21-139.178.70.103:22-139.178.89.65:44136.service - OpenSSH per-connection server daemon (139.178.89.65:44136).
Sep 5 06:08:15.136621 sshd[5865]: Accepted publickey for core from 139.178.89.65 port 44136 ssh2: RSA SHA256:lOVrTMusnG1HPVwly0GqcQwnv433awSMb53jBwYWH18
Sep 5 06:08:15.137817 sshd-session[5865]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 5 06:08:15.140457 systemd-logind[1625]: New session 24 of user core.
Sep 5 06:08:15.145240 systemd[1]: Started session-24.scope - Session 24 of User core.
Sep 5 06:08:15.302894 sshd[5868]: Connection closed by 139.178.89.65 port 44136
Sep 5 06:08:15.302795 sshd-session[5865]: pam_unix(sshd:session): session closed for user core
Sep 5 06:08:15.306199 systemd-logind[1625]: Session 24 logged out. Waiting for processes to exit.
Sep 5 06:08:15.306315 systemd[1]: sshd@21-139.178.70.103:22-139.178.89.65:44136.service: Deactivated successfully.
Sep 5 06:08:15.307601 systemd[1]: session-24.scope: Deactivated successfully.
Sep 5 06:08:15.308483 systemd-logind[1625]: Removed session 24.
Sep 5 06:08:16.715321 containerd[1653]: time="2025-09-05T06:08:16.715276192Z" level=info msg="TaskExit event in podsandbox handler container_id:\"090113e890c708814e3834d0d92ddf22eaeda110d8d2e21873e5041b50aa3a66\" id:\"6e26f412123d742e10409b932368693e90e984890cc53e7d57ba502d1a677c00\" pid:5891 exited_at:{seconds:1757052496 nanos:713638973}"
Sep 5 06:08:20.315377 systemd[1]: Started sshd@22-139.178.70.103:22-139.178.89.65:32940.service - OpenSSH per-connection server daemon (139.178.89.65:32940).
Sep 5 06:08:20.427022 containerd[1653]: time="2025-09-05T06:08:20.426998612Z" level=info msg="TaskExit event in podsandbox handler container_id:\"090113e890c708814e3834d0d92ddf22eaeda110d8d2e21873e5041b50aa3a66\" id:\"77423cdbcd02649bb07407ba79fbebb67b13176cc5dc67dc3a5366e4d700af2e\" pid:5914 exited_at:{seconds:1757052500 nanos:426547518}"
Sep 5 06:08:20.475381 sshd[5901]: Accepted publickey for core from 139.178.89.65 port 32940 ssh2: RSA SHA256:lOVrTMusnG1HPVwly0GqcQwnv433awSMb53jBwYWH18
Sep 5 06:08:20.478735 sshd-session[5901]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 5 06:08:20.483776 systemd-logind[1625]: New session 25 of user core.
Sep 5 06:08:20.487253 systemd[1]: Started session-25.scope - Session 25 of User core.
Sep 5 06:08:21.036393 sshd[5925]: Connection closed by 139.178.89.65 port 32940
Sep 5 06:08:21.036756 sshd-session[5901]: pam_unix(sshd:session): session closed for user core
Sep 5 06:08:21.039193 systemd-logind[1625]: Session 25 logged out. Waiting for processes to exit.
Sep 5 06:08:21.040300 systemd[1]: sshd@22-139.178.70.103:22-139.178.89.65:32940.service: Deactivated successfully.
Sep 5 06:08:21.041784 systemd[1]: session-25.scope: Deactivated successfully.
Sep 5 06:08:21.043433 systemd-logind[1625]: Removed session 25.