Sep 12 17:53:56.705590 kernel: Linux version 6.12.47-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Fri Sep 12 15:34:39 -00 2025 Sep 12 17:53:56.705606 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=271a44cc8ea1639cfb6fdf777202a5f025fda0b3ce9b293cc4e0e7047aecb858 Sep 12 17:53:56.705612 kernel: Disabled fast string operations Sep 12 17:53:56.705617 kernel: BIOS-provided physical RAM map: Sep 12 17:53:56.705621 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable Sep 12 17:53:56.705625 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved Sep 12 17:53:56.705632 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved Sep 12 17:53:56.705636 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable Sep 12 17:53:56.705641 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data Sep 12 17:53:56.705645 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS Sep 12 17:53:56.705650 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable Sep 12 17:53:56.705654 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved Sep 12 17:53:56.705658 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved Sep 12 17:53:56.705663 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved Sep 12 17:53:56.705669 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved Sep 12 17:53:56.705675 kernel: NX (Execute Disable) protection: active Sep 12 17:53:56.705679 kernel: APIC: Static calls initialized Sep 12 17:53:56.705684 kernel: SMBIOS 2.7 present. Sep 12 17:53:56.705690 kernel: DMI: VMware, Inc. 
VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020 Sep 12 17:53:56.705695 kernel: DMI: Memory slots populated: 1/128 Sep 12 17:53:56.705701 kernel: vmware: hypercall mode: 0x00 Sep 12 17:53:56.705706 kernel: Hypervisor detected: VMware Sep 12 17:53:56.705711 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz Sep 12 17:53:56.705716 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz Sep 12 17:53:56.705721 kernel: vmware: using clock offset of 3205635148 ns Sep 12 17:53:56.705726 kernel: tsc: Detected 3408.000 MHz processor Sep 12 17:53:56.705731 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Sep 12 17:53:56.705737 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Sep 12 17:53:56.705742 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000 Sep 12 17:53:56.705747 kernel: total RAM covered: 3072M Sep 12 17:53:56.705753 kernel: Found optimal setting for mtrr clean up Sep 12 17:53:56.705760 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G Sep 12 17:53:56.705765 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs Sep 12 17:53:56.705770 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Sep 12 17:53:56.705776 kernel: Using GB pages for direct mapping Sep 12 17:53:56.705781 kernel: ACPI: Early table checksum verification disabled Sep 12 17:53:56.705786 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD ) Sep 12 17:53:56.705791 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272) Sep 12 17:53:56.705796 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240) Sep 12 17:53:56.705803 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001) Sep 12 17:53:56.705810 kernel: ACPI: FACS 0x000000007FEFFFC0 000040 Sep 12 17:53:56.705815 kernel: ACPI: FACS 0x000000007FEFFFC0 000040 Sep 12 17:53:56.705820 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001) Sep 12 17:53:56.705826 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? 
APIC 06040000 LTP 00000000) Sep 12 17:53:56.705831 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001) Sep 12 17:53:56.705838 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001) Sep 12 17:53:56.705843 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001) Sep 12 17:53:56.705849 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001) Sep 12 17:53:56.705854 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66] Sep 12 17:53:56.705860 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72] Sep 12 17:53:56.705865 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff] Sep 12 17:53:56.705870 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff] Sep 12 17:53:56.705875 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54] Sep 12 17:53:56.705881 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c] Sep 12 17:53:56.705887 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea] Sep 12 17:53:56.705893 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe] Sep 12 17:53:56.705898 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756] Sep 12 17:53:56.705903 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e] Sep 12 17:53:56.705909 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff] Sep 12 17:53:56.705914 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff] Sep 12 17:53:56.705919 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug Sep 12 17:53:56.705924 kernel: NUMA: Node 0 [mem 0x00001000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00001000-0x7fffffff] Sep 12 17:53:56.705930 kernel: NODE_DATA(0) allocated [mem 0x7fff8dc0-0x7fffffff] Sep 12 17:53:56.705936 kernel: Zone ranges: Sep 12 17:53:56.705942 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Sep 12 17:53:56.705947 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff] Sep 12 17:53:56.705952 kernel: Normal empty Sep 12 17:53:56.705958 kernel: Device empty Sep 12 17:53:56.705963 kernel: Movable zone start for each node Sep 12 17:53:56.705968 kernel: Early memory node ranges Sep 12 17:53:56.705974 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff] Sep 12 17:53:56.705979 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff] Sep 12 17:53:56.705984 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff] Sep 12 17:53:56.705991 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff] Sep 12 17:53:56.705996 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Sep 12 17:53:56.706001 kernel: On node 0, zone DMA: 98 pages in unavailable ranges Sep 12 17:53:56.706007 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges Sep 12 17:53:56.706012 kernel: ACPI: PM-Timer IO Port: 0x1008 Sep 12 17:53:56.706017 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1]) Sep 12 17:53:56.706023 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1]) Sep 12 17:53:56.706028 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1]) Sep 12 17:53:56.706033 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1]) Sep 12 17:53:56.706052 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1]) Sep 12 17:53:56.706059 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1]) Sep 12 17:53:56.706065 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge 
lint[0x1]) Sep 12 17:53:56.706070 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1]) Sep 12 17:53:56.706075 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1]) Sep 12 17:53:56.706080 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1]) Sep 12 17:53:56.706085 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1]) Sep 12 17:53:56.706091 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1]) Sep 12 17:53:56.706096 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1]) Sep 12 17:53:56.706101 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1]) Sep 12 17:53:56.706108 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1]) Sep 12 17:53:56.706113 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1]) Sep 12 17:53:56.706118 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1]) Sep 12 17:53:56.706123 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1]) Sep 12 17:53:56.706129 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1]) Sep 12 17:53:56.706134 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1]) Sep 12 17:53:56.706139 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1]) Sep 12 17:53:56.706145 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1]) Sep 12 17:53:56.706150 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1]) Sep 12 17:53:56.706155 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1]) Sep 12 17:53:56.706162 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1]) Sep 12 17:53:56.706167 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1]) Sep 12 17:53:56.706173 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1]) Sep 12 17:53:56.706178 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1]) Sep 12 17:53:56.706183 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1]) Sep 12 17:53:56.706188 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1]) Sep 12 17:53:56.706194 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1]) Sep 12 17:53:56.706199 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1]) Sep 12 17:53:56.706204 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1]) Sep 12 17:53:56.706209 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1]) Sep 12 17:53:56.706216 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1]) Sep 12 17:53:56.706221 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1]) Sep 12 17:53:56.706226 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1]) Sep 12 17:53:56.706232 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1]) Sep 12 17:53:56.706237 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1]) Sep 12 17:53:56.706242 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1]) Sep 12 17:53:56.706251 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1]) Sep 12 17:53:56.706258 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1]) Sep 12 17:53:56.706263 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1]) Sep 12 17:53:56.706270 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1]) Sep 12 17:53:56.706275 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1]) Sep 12 17:53:56.706281 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1]) Sep 12 17:53:56.706286 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1]) Sep 12 17:53:56.706292 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1]) Sep 12 17:53:56.706298 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1]) Sep 12 17:53:56.706303 kernel: ACPI: LAPIC_NMI 
(acpi_id[0x31] high edge lint[0x1]) Sep 12 17:53:56.706309 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1]) Sep 12 17:53:56.706314 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1]) Sep 12 17:53:56.706321 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1]) Sep 12 17:53:56.706327 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1]) Sep 12 17:53:56.706332 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1]) Sep 12 17:53:56.706338 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1]) Sep 12 17:53:56.706344 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1]) Sep 12 17:53:56.706349 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1]) Sep 12 17:53:56.706355 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1]) Sep 12 17:53:56.706360 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1]) Sep 12 17:53:56.706366 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1]) Sep 12 17:53:56.706372 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1]) Sep 12 17:53:56.706378 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1]) Sep 12 17:53:56.706384 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1]) Sep 12 17:53:56.706390 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1]) Sep 12 17:53:56.706396 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1]) Sep 12 17:53:56.706401 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1]) Sep 12 17:53:56.706407 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1]) Sep 12 17:53:56.706413 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1]) Sep 12 17:53:56.706418 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1]) Sep 12 17:53:56.706424 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1]) Sep 12 17:53:56.706430 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1]) Sep 12 17:53:56.706436 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1]) Sep 12 17:53:56.706442 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1]) Sep 12 17:53:56.706447 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1]) Sep 12 17:53:56.706453 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1]) Sep 12 17:53:56.706459 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1]) Sep 12 17:53:56.706465 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1]) Sep 12 17:53:56.706470 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1]) Sep 12 17:53:56.706476 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1]) Sep 12 17:53:56.706481 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1]) Sep 12 17:53:56.706488 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1]) Sep 12 17:53:56.706494 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1]) Sep 12 17:53:56.706499 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1]) Sep 12 17:53:56.706505 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1]) Sep 12 17:53:56.706510 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1]) Sep 12 17:53:56.706516 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1]) Sep 12 17:53:56.706522 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1]) Sep 12 17:53:56.706527 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1]) Sep 12 17:53:56.706533 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1]) Sep 12 17:53:56.706539 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1]) Sep 12 17:53:56.706545 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1]) Sep 12 17:53:56.706550 kernel: 
ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1]) Sep 12 17:53:56.706556 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1]) Sep 12 17:53:56.706562 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1]) Sep 12 17:53:56.706567 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1]) Sep 12 17:53:56.706573 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1]) Sep 12 17:53:56.706578 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1]) Sep 12 17:53:56.706584 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1]) Sep 12 17:53:56.706590 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1]) Sep 12 17:53:56.706595 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1]) Sep 12 17:53:56.706602 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1]) Sep 12 17:53:56.706608 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1]) Sep 12 17:53:56.706613 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1]) Sep 12 17:53:56.706619 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1]) Sep 12 17:53:56.706625 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1]) Sep 12 17:53:56.706630 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1]) Sep 12 17:53:56.706636 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1]) Sep 12 17:53:56.706641 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1]) Sep 12 17:53:56.706647 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1]) Sep 12 17:53:56.706653 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1]) Sep 12 17:53:56.706659 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1]) Sep 12 17:53:56.706665 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1]) Sep 12 17:53:56.706670 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1]) Sep 12 17:53:56.706676 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1]) Sep 12 17:53:56.706682 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1]) Sep 12 17:53:56.706687 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1]) Sep 12 17:53:56.706693 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1]) Sep 12 17:53:56.706698 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1]) Sep 12 17:53:56.706704 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1]) Sep 12 17:53:56.706709 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1]) Sep 12 17:53:56.706716 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1]) Sep 12 17:53:56.706721 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1]) Sep 12 17:53:56.706727 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1]) Sep 12 17:53:56.706733 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1]) Sep 12 17:53:56.706738 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1]) Sep 12 17:53:56.706744 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1]) Sep 12 17:53:56.706749 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1]) Sep 12 17:53:56.706755 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23 Sep 12 17:53:56.706760 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge) Sep 12 17:53:56.706766 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Sep 12 17:53:56.706773 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000 Sep 12 17:53:56.706779 kernel: TSC deadline timer available Sep 12 17:53:56.706784 kernel: CPU topo: Max. logical packages: 128 Sep 12 17:53:56.706790 kernel: CPU topo: Max. logical dies: 128 Sep 12 17:53:56.706796 kernel: CPU topo: Max. 
dies per package: 1 Sep 12 17:53:56.706801 kernel: CPU topo: Max. threads per core: 1 Sep 12 17:53:56.706807 kernel: CPU topo: Num. cores per package: 1 Sep 12 17:53:56.706813 kernel: CPU topo: Num. threads per package: 1 Sep 12 17:53:56.706818 kernel: CPU topo: Allowing 2 present CPUs plus 126 hotplug CPUs Sep 12 17:53:56.706824 kernel: [mem 0x80000000-0xefffffff] available for PCI devices Sep 12 17:53:56.706830 kernel: Booting paravirtualized kernel on VMware hypervisor Sep 12 17:53:56.706836 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Sep 12 17:53:56.706842 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1 Sep 12 17:53:56.706848 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144 Sep 12 17:53:56.706854 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152 Sep 12 17:53:56.706860 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007 Sep 12 17:53:56.706865 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015 Sep 12 17:53:56.706871 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023 Sep 12 17:53:56.706878 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031 Sep 12 17:53:56.706884 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039 Sep 12 17:53:56.706890 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047 Sep 12 17:53:56.706895 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055 Sep 12 17:53:56.706901 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063 Sep 12 17:53:56.706907 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071 Sep 12 17:53:56.706912 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079 Sep 12 17:53:56.706917 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087 Sep 12 17:53:56.706923 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095 Sep 12 17:53:56.706930 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103 Sep 12 17:53:56.706935 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111 Sep 12 17:53:56.706941 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119 Sep 12 17:53:56.706946 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127 Sep 12 17:53:56.706953 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=271a44cc8ea1639cfb6fdf777202a5f025fda0b3ce9b293cc4e0e7047aecb858 Sep 12 17:53:56.706959 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Sep 12 17:53:56.706965 kernel: random: crng init done Sep 12 17:53:56.706970 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes Sep 12 17:53:56.706977 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes Sep 12 17:53:56.706983 kernel: printk: log_buf_len min size: 262144 bytes Sep 12 17:53:56.706989 kernel: printk: log_buf_len: 1048576 bytes Sep 12 17:53:56.706994 kernel: printk: early log buf free: 245592(93%) Sep 12 17:53:56.707000 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Sep 12 17:53:56.707006 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear) Sep 12 17:53:56.707012 kernel: Fallback order for Node 0: 0 Sep 12 17:53:56.707018 kernel: Built 1 zonelists, mobility grouping on. 
Total pages: 524157 Sep 12 17:53:56.707024 kernel: Policy zone: DMA32 Sep 12 17:53:56.707030 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Sep 12 17:53:56.707042 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1 Sep 12 17:53:56.707058 kernel: ftrace: allocating 40125 entries in 157 pages Sep 12 17:53:56.707064 kernel: ftrace: allocated 157 pages with 5 groups Sep 12 17:53:56.707070 kernel: Dynamic Preempt: voluntary Sep 12 17:53:56.707076 kernel: rcu: Preemptible hierarchical RCU implementation. Sep 12 17:53:56.707082 kernel: rcu: RCU event tracing is enabled. Sep 12 17:53:56.707088 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128. Sep 12 17:53:56.707094 kernel: Trampoline variant of Tasks RCU enabled. Sep 12 17:53:56.707099 kernel: Rude variant of Tasks RCU enabled. Sep 12 17:53:56.707107 kernel: Tracing variant of Tasks RCU enabled. Sep 12 17:53:56.707113 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Sep 12 17:53:56.707119 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128 Sep 12 17:53:56.707125 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Sep 12 17:53:56.707138 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Sep 12 17:53:56.707146 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Sep 12 17:53:56.707152 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16 Sep 12 17:53:56.707158 kernel: rcu: srcu_init: Setting srcu_struct sizes to big. Sep 12 17:53:56.707164 kernel: Console: colour VGA+ 80x25 Sep 12 17:53:56.707171 kernel: printk: legacy console [tty0] enabled Sep 12 17:53:56.707177 kernel: printk: legacy console [ttyS0] enabled Sep 12 17:53:56.707183 kernel: ACPI: Core revision 20240827 Sep 12 17:53:56.707188 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns Sep 12 17:53:56.707194 kernel: APIC: Switch to symmetric I/O mode setup Sep 12 17:53:56.707200 kernel: x2apic enabled Sep 12 17:53:56.707206 kernel: APIC: Switched APIC routing to: physical x2apic Sep 12 17:53:56.707212 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Sep 12 17:53:56.707218 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Sep 12 17:53:56.707225 kernel: Calibrating delay loop (skipped) preset value.. 
6816.00 BogoMIPS (lpj=3408000) Sep 12 17:53:56.707230 kernel: Disabled fast string operations Sep 12 17:53:56.707236 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8 Sep 12 17:53:56.707242 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4 Sep 12 17:53:56.707248 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Sep 12 17:53:56.707253 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit Sep 12 17:53:56.707259 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Sep 12 17:53:56.707265 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT Sep 12 17:53:56.707271 kernel: RETBleed: Mitigation: Enhanced IBRS Sep 12 17:53:56.707278 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Sep 12 17:53:56.707284 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Sep 12 17:53:56.707290 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode Sep 12 17:53:56.707296 kernel: SRBDS: Unknown: Dependent on hypervisor status Sep 12 17:53:56.707301 kernel: GDS: Unknown: Dependent on hypervisor status Sep 12 17:53:56.707307 kernel: active return thunk: its_return_thunk Sep 12 17:53:56.707313 kernel: ITS: Mitigation: Aligned branch/return thunks Sep 12 17:53:56.707319 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Sep 12 17:53:56.707324 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Sep 12 17:53:56.707331 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Sep 12 17:53:56.707337 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Sep 12 17:53:56.707343 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format. Sep 12 17:53:56.707349 kernel: Freeing SMP alternatives memory: 32K Sep 12 17:53:56.707354 kernel: pid_max: default: 131072 minimum: 1024 Sep 12 17:53:56.707360 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Sep 12 17:53:56.707366 kernel: landlock: Up and running. Sep 12 17:53:56.707371 kernel: SELinux: Initializing. Sep 12 17:53:56.707377 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Sep 12 17:53:56.707384 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Sep 12 17:53:56.707390 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd) Sep 12 17:53:56.707396 kernel: Performance Events: Skylake events, core PMU driver. Sep 12 17:53:56.707401 kernel: core: CPUID marked event: 'cpu cycles' unavailable Sep 12 17:53:56.707407 kernel: core: CPUID marked event: 'instructions' unavailable Sep 12 17:53:56.707413 kernel: core: CPUID marked event: 'bus cycles' unavailable Sep 12 17:53:56.707418 kernel: core: CPUID marked event: 'cache references' unavailable Sep 12 17:53:56.707424 kernel: core: CPUID marked event: 'cache misses' unavailable Sep 12 17:53:56.707431 kernel: core: CPUID marked event: 'branch instructions' unavailable Sep 12 17:53:56.707436 kernel: core: CPUID marked event: 'branch misses' unavailable Sep 12 17:53:56.707442 kernel: ... version: 1 Sep 12 17:53:56.707448 kernel: ... bit width: 48 Sep 12 17:53:56.707454 kernel: ... generic registers: 4 Sep 12 17:53:56.707459 kernel: ... value mask: 0000ffffffffffff Sep 12 17:53:56.707465 kernel: ... max period: 000000007fffffff Sep 12 17:53:56.707471 kernel: ... 
fixed-purpose events: 0 Sep 12 17:53:56.707477 kernel: ... event mask: 000000000000000f Sep 12 17:53:56.707483 kernel: signal: max sigframe size: 1776 Sep 12 17:53:56.707489 kernel: rcu: Hierarchical SRCU implementation. Sep 12 17:53:56.707495 kernel: rcu: Max phase no-delay instances is 400. Sep 12 17:53:56.707501 kernel: Timer migration: 3 hierarchy levels; 8 children per group; 3 crossnode level Sep 12 17:53:56.707507 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Sep 12 17:53:56.707512 kernel: smp: Bringing up secondary CPUs ... Sep 12 17:53:56.707518 kernel: smpboot: x86: Booting SMP configuration: Sep 12 17:53:56.707524 kernel: .... node #0, CPUs: #1 Sep 12 17:53:56.707530 kernel: Disabled fast string operations Sep 12 17:53:56.707535 kernel: smp: Brought up 1 node, 2 CPUs Sep 12 17:53:56.707542 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS) Sep 12 17:53:56.707548 kernel: Memory: 1924288K/2096628K available (14336K kernel code, 2432K rwdata, 9960K rodata, 54040K init, 2924K bss, 160968K reserved, 0K cma-reserved) Sep 12 17:53:56.707554 kernel: devtmpfs: initialized Sep 12 17:53:56.707560 kernel: x86/mm: Memory block size: 128MB Sep 12 17:53:56.707566 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes) Sep 12 17:53:56.707572 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Sep 12 17:53:56.707577 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear) Sep 12 17:53:56.707583 kernel: pinctrl core: initialized pinctrl subsystem Sep 12 17:53:56.707589 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Sep 12 17:53:56.707596 kernel: audit: initializing netlink subsys (disabled) Sep 12 17:53:56.707601 kernel: audit: type=2000 audit(1757699634.265:1): state=initialized audit_enabled=0 res=1 Sep 12 17:53:56.707607 kernel: thermal_sys: Registered thermal governor 'step_wise' Sep 12 17:53:56.707613 kernel: thermal_sys: Registered thermal governor 'user_space' Sep 12 17:53:56.707619 kernel: cpuidle: using governor menu Sep 12 17:53:56.707624 kernel: Simple Boot Flag at 0x36 set to 0x80 Sep 12 17:53:56.707630 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Sep 12 17:53:56.707636 kernel: dca service started, version 1.12.1 Sep 12 17:53:56.707649 kernel: PCI: ECAM [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) for domain 0000 [bus 00-7f] Sep 12 17:53:56.707656 kernel: PCI: Using configuration type 1 for base access Sep 12 17:53:56.707662 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
Sep 12 17:53:56.707668 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Sep 12 17:53:56.707675 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Sep 12 17:53:56.707681 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Sep 12 17:53:56.707687 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Sep 12 17:53:56.707693 kernel: ACPI: Added _OSI(Module Device) Sep 12 17:53:56.707699 kernel: ACPI: Added _OSI(Processor Device) Sep 12 17:53:56.707705 kernel: ACPI: Added _OSI(Processor Aggregator Device) Sep 12 17:53:56.707712 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Sep 12 17:53:56.707718 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored Sep 12 17:53:56.707724 kernel: ACPI: Interpreter enabled Sep 12 17:53:56.707730 kernel: ACPI: PM: (supports S0 S1 S5) Sep 12 17:53:56.707736 kernel: ACPI: Using IOAPIC for interrupt routing Sep 12 17:53:56.707742 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Sep 12 17:53:56.707748 kernel: PCI: Using E820 reservations for host bridge windows Sep 12 17:53:56.707754 kernel: ACPI: Enabled 4 GPEs in block 00 to 0F Sep 12 17:53:56.707761 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f]) Sep 12 17:53:56.707847 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Sep 12 17:53:56.707904 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR] Sep 12 17:53:56.707956 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] Sep 12 17:53:56.707965 kernel: PCI host bridge to bus 0000:00 Sep 12 17:53:56.708020 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Sep 12 17:53:56.708080 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window] Sep 12 17:53:56.708135 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Sep 12 17:53:56.708185 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Sep 12 17:53:56.708231 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window] Sep 12 17:53:56.708277 kernel: pci_bus 0000:00: root bus resource [bus 00-7f] Sep 12 17:53:56.708338 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 conventional PCI endpoint Sep 12 17:53:56.708399 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 conventional PCI bridge Sep 12 17:53:56.708456 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Sep 12 17:53:56.708518 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 conventional PCI endpoint Sep 12 17:53:56.708582 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a conventional PCI endpoint Sep 12 17:53:56.708635 kernel: pci 0000:00:07.1: BAR 4 [io 0x1060-0x106f] Sep 12 17:53:56.708701 kernel: pci 0000:00:07.1: BAR 0 [io 0x01f0-0x01f7]: legacy IDE quirk Sep 12 17:53:56.708753 kernel: pci 0000:00:07.1: BAR 1 [io 0x03f6]: legacy IDE quirk Sep 12 17:53:56.708806 kernel: pci 0000:00:07.1: BAR 2 [io 0x0170-0x0177]: legacy IDE quirk Sep 12 17:53:56.708858 kernel: pci 0000:00:07.1: BAR 3 [io 0x0376]: legacy IDE quirk Sep 12 17:53:56.708915 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint Sep 12 17:53:56.708968 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed by PIIX4 ACPI Sep 12 17:53:56.709021 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB Sep 12 17:53:56.709421 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 
0x088000 conventional PCI endpoint Sep 12 17:53:56.709480 kernel: pci 0000:00:07.7: BAR 0 [io 0x1080-0x10bf] Sep 12 17:53:56.709535 kernel: pci 0000:00:07.7: BAR 1 [mem 0xfebfe000-0xfebfffff 64bit] Sep 12 17:53:56.709594 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 conventional PCI endpoint Sep 12 17:53:56.709647 kernel: pci 0000:00:0f.0: BAR 0 [io 0x1070-0x107f] Sep 12 17:53:56.709700 kernel: pci 0000:00:0f.0: BAR 1 [mem 0xe8000000-0xefffffff pref] Sep 12 17:53:56.709756 kernel: pci 0000:00:0f.0: BAR 2 [mem 0xfe000000-0xfe7fffff] Sep 12 17:53:56.711075 kernel: pci 0000:00:0f.0: ROM [mem 0x00000000-0x00007fff pref] Sep 12 17:53:56.711148 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Sep 12 17:53:56.711212 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 conventional PCI bridge Sep 12 17:53:56.711269 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Sep 12 17:53:56.711323 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Sep 12 17:53:56.711376 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Sep 12 17:53:56.711431 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Sep 12 17:53:56.711489 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 17:53:56.711553 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Sep 12 17:53:56.711606 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Sep 12 17:53:56.711673 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Sep 12 17:53:56.711735 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold Sep 12 17:53:56.711792 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 17:53:56.711850 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Sep 12 17:53:56.711903 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Sep 12 17:53:56.711956 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Sep 12 17:53:56.712008 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Sep 12 17:53:56.714075 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold Sep 12 17:53:56.714162 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 17:53:56.714223 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Sep 12 17:53:56.714284 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Sep 12 17:53:56.714338 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Sep 12 17:53:56.714392 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Sep 12 17:53:56.714446 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold Sep 12 17:53:56.714504 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 17:53:56.714558 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Sep 12 17:53:56.714613 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Sep 12 17:53:56.714666 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Sep 12 17:53:56.714719 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold Sep 12 17:53:56.714776 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 17:53:56.714830 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Sep 12 17:53:56.714883 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Sep 12 17:53:56.714935 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Sep 12 17:53:56.714991 
kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold Sep 12 17:53:56.715056 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 17:53:56.715110 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Sep 12 17:53:56.715163 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Sep 12 17:53:56.715216 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Sep 12 17:53:56.715269 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot D3cold Sep 12 17:53:56.715327 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 17:53:56.715383 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Sep 12 17:53:56.715435 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Sep 12 17:53:56.715489 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Sep 12 17:53:56.715541 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold Sep 12 17:53:56.715599 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 17:53:56.715653 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Sep 12 17:53:56.715706 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Sep 12 17:53:56.715759 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Sep 12 17:53:56.715815 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold Sep 12 17:53:56.715878 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 17:53:56.715931 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Sep 12 17:53:56.718134 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Sep 12 17:53:56.718201 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Sep 12 17:53:56.718260 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold Sep 12 17:53:56.718320 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 17:53:56.718379 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Sep 12 17:53:56.718433 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Sep 12 17:53:56.718486 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Sep 12 17:53:56.718543 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Sep 12 17:53:56.718598 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold Sep 12 17:53:56.718657 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 17:53:56.718711 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Sep 12 17:53:56.718766 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Sep 12 17:53:56.718819 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Sep 12 17:53:56.718871 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Sep 12 17:53:56.718924 kernel: pci 0000:00:16.2: PME# supported from D0 D3hot D3cold Sep 12 17:53:56.718984 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 17:53:56.719520 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Sep 12 17:53:56.721057 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Sep 12 17:53:56.721132 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Sep 12 17:53:56.721191 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold Sep 12 17:53:56.721250 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 17:53:56.721306 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Sep 12 17:53:56.721359 kernel: pci 
0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Sep 12 17:53:56.721412 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Sep 12 17:53:56.721465 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold Sep 12 17:53:56.721525 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 17:53:56.721579 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Sep 12 17:53:56.721633 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Sep 12 17:53:56.721686 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Sep 12 17:53:56.721741 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold Sep 12 17:53:56.721798 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 17:53:56.721852 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Sep 12 17:53:56.721907 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Sep 12 17:53:56.721959 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Sep 12 17:53:56.722012 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold Sep 12 17:53:56.722110 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 17:53:56.722172 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Sep 12 17:53:56.722227 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Sep 12 17:53:56.722280 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Sep 12 17:53:56.722339 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold Sep 12 17:53:56.722398 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 17:53:56.722451 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Sep 12 17:53:56.722504 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Sep 12 17:53:56.722557 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Sep 12 17:53:56.722609 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Sep 12 17:53:56.722661 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold Sep 12 17:53:56.722720 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 17:53:56.722776 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Sep 12 17:53:56.722828 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Sep 12 17:53:56.722881 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Sep 12 17:53:56.722936 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Sep 12 17:53:56.722988 kernel: pci 0000:00:17.1: PME# supported from D0 D3hot D3cold Sep 12 17:53:56.723289 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 17:53:56.723351 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Sep 12 17:53:56.723407 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Sep 12 17:53:56.723460 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Sep 12 17:53:56.723513 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Sep 12 17:53:56.723570 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold Sep 12 17:53:56.723630 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 17:53:56.723683 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Sep 12 17:53:56.723735 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Sep 12 17:53:56.723788 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Sep 12 
17:53:56.723839 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold Sep 12 17:53:56.723896 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 17:53:56.723953 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Sep 12 17:53:56.724005 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Sep 12 17:53:56.724968 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Sep 12 17:53:56.725033 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold Sep 12 17:53:56.725106 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 17:53:56.725167 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Sep 12 17:53:56.725221 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Sep 12 17:53:56.725275 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Sep 12 17:53:56.725336 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold Sep 12 17:53:56.725401 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 17:53:56.725466 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Sep 12 17:53:56.725523 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Sep 12 17:53:56.725576 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Sep 12 17:53:56.725628 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold Sep 12 17:53:56.725686 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 17:53:56.725743 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Sep 12 17:53:56.725794 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Sep 12 17:53:56.725846 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Sep 12 17:53:56.725897 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold Sep 12 17:53:56.725953 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 17:53:56.726006 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Sep 12 17:53:56.726745 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Sep 12 17:53:56.726814 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Sep 12 17:53:56.726869 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Sep 12 17:53:56.726925 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold Sep 12 17:53:56.726983 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 17:53:56.727035 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Sep 12 17:53:56.728122 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Sep 12 17:53:56.728187 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Sep 12 17:53:56.728243 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Sep 12 17:53:56.728294 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold Sep 12 17:53:56.728352 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 17:53:56.728404 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Sep 12 17:53:56.728456 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Sep 12 17:53:56.728523 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Sep 12 17:53:56.728576 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold Sep 12 17:53:56.728635 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 17:53:56.728689 kernel: pci 0000:00:18.3: PCI bridge to [bus 
1e] Sep 12 17:53:56.728742 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Sep 12 17:53:56.728794 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Sep 12 17:53:56.728846 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold Sep 12 17:53:56.728905 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 17:53:56.728958 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Sep 12 17:53:56.729014 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Sep 12 17:53:56.729081 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Sep 12 17:53:56.729136 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold Sep 12 17:53:56.729193 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 17:53:56.729247 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Sep 12 17:53:56.729300 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Sep 12 17:53:56.729353 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Sep 12 17:53:56.729409 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold Sep 12 17:53:56.729481 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 17:53:56.729533 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Sep 12 17:53:56.729585 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Sep 12 17:53:56.729635 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Sep 12 17:53:56.729687 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold Sep 12 17:53:56.729744 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 17:53:56.729798 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Sep 12 17:53:56.729849 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Sep 12 17:53:56.729900 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Sep 12 17:53:56.729951 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold Sep 12 17:53:56.730009 kernel: pci_bus 0000:01: extended config space not accessible Sep 12 17:53:56.730816 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Sep 12 17:53:56.730877 kernel: pci_bus 0000:02: extended config space not accessible Sep 12 17:53:56.730887 kernel: acpiphp: Slot [32] registered Sep 12 17:53:56.730896 kernel: acpiphp: Slot [33] registered Sep 12 17:53:56.730903 kernel: acpiphp: Slot [34] registered Sep 12 17:53:56.730909 kernel: acpiphp: Slot [35] registered Sep 12 17:53:56.730915 kernel: acpiphp: Slot [36] registered Sep 12 17:53:56.730921 kernel: acpiphp: Slot [37] registered Sep 12 17:53:56.730927 kernel: acpiphp: Slot [38] registered Sep 12 17:53:56.730933 kernel: acpiphp: Slot [39] registered Sep 12 17:53:56.730939 kernel: acpiphp: Slot [40] registered Sep 12 17:53:56.730945 kernel: acpiphp: Slot [41] registered Sep 12 17:53:56.730952 kernel: acpiphp: Slot [42] registered Sep 12 17:53:56.733588 kernel: acpiphp: Slot [43] registered Sep 12 17:53:56.733595 kernel: acpiphp: Slot [44] registered Sep 12 17:53:56.733601 kernel: acpiphp: Slot [45] registered Sep 12 17:53:56.733608 kernel: acpiphp: Slot [46] registered Sep 12 17:53:56.733614 kernel: acpiphp: Slot [47] registered Sep 12 17:53:56.733620 kernel: acpiphp: Slot [48] registered Sep 12 17:53:56.733626 kernel: acpiphp: Slot [49] registered Sep 12 17:53:56.733633 kernel: acpiphp: Slot [50] registered Sep 12 17:53:56.733639 kernel: acpiphp: Slot [51] registered Sep 12 
17:53:56.733647 kernel: acpiphp: Slot [52] registered Sep 12 17:53:56.733654 kernel: acpiphp: Slot [53] registered Sep 12 17:53:56.733660 kernel: acpiphp: Slot [54] registered Sep 12 17:53:56.733666 kernel: acpiphp: Slot [55] registered Sep 12 17:53:56.733672 kernel: acpiphp: Slot [56] registered Sep 12 17:53:56.733678 kernel: acpiphp: Slot [57] registered Sep 12 17:53:56.733684 kernel: acpiphp: Slot [58] registered Sep 12 17:53:56.733690 kernel: acpiphp: Slot [59] registered Sep 12 17:53:56.733696 kernel: acpiphp: Slot [60] registered Sep 12 17:53:56.733704 kernel: acpiphp: Slot [61] registered Sep 12 17:53:56.733710 kernel: acpiphp: Slot [62] registered Sep 12 17:53:56.733717 kernel: acpiphp: Slot [63] registered Sep 12 17:53:56.733784 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Sep 12 17:53:56.733856 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode) Sep 12 17:53:56.733913 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode) Sep 12 17:53:56.733968 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode) Sep 12 17:53:56.734031 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode) Sep 12 17:53:56.734103 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode) Sep 12 17:53:56.734182 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 PCIe Endpoint Sep 12 17:53:56.734246 kernel: pci 0000:03:00.0: BAR 0 [io 0x4000-0x4007] Sep 12 17:53:56.734302 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfd5f8000-0xfd5fffff 64bit] Sep 12 17:53:56.734356 kernel: pci 0000:03:00.0: ROM [mem 0x00000000-0x0000ffff pref] Sep 12 17:53:56.734410 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Sep 12 17:53:56.734463 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force' Sep 12 17:53:56.734521 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Sep 12 17:53:56.734577 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Sep 12 17:53:56.734632 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Sep 12 17:53:56.734688 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Sep 12 17:53:56.734744 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Sep 12 17:53:56.734798 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Sep 12 17:53:56.734853 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Sep 12 17:53:56.734907 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Sep 12 17:53:56.734969 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 PCIe Endpoint Sep 12 17:53:56.735023 kernel: pci 0000:0b:00.0: BAR 0 [mem 0xfd4fc000-0xfd4fcfff] Sep 12 17:53:56.735095 kernel: pci 0000:0b:00.0: BAR 1 [mem 0xfd4fd000-0xfd4fdfff] Sep 12 17:53:56.735150 kernel: pci 0000:0b:00.0: BAR 2 [mem 0xfd4fe000-0xfd4fffff] Sep 12 17:53:56.735205 kernel: pci 0000:0b:00.0: BAR 3 [io 0x5000-0x500f] Sep 12 17:53:56.735258 kernel: pci 0000:0b:00.0: ROM [mem 0x00000000-0x0000ffff pref] Sep 12 17:53:56.735312 kernel: pci 0000:0b:00.0: supports D1 D2 Sep 12 17:53:56.735368 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold Sep 12 17:53:56.735421 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Sep 12 17:53:56.735476 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Sep 12 17:53:56.735546 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Sep 12 17:53:56.735601 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Sep 12 17:53:56.735654 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Sep 12 17:53:56.735707 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Sep 12 17:53:56.735763 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Sep 12 17:53:56.735817 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Sep 12 17:53:56.735870 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Sep 12 17:53:56.735923 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Sep 12 17:53:56.735975 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Sep 12 17:53:56.736028 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Sep 12 17:53:56.736127 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Sep 12 17:53:56.736182 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Sep 12 17:53:56.736238 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Sep 12 17:53:56.736291 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Sep 12 17:53:56.736344 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Sep 12 17:53:56.736396 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Sep 12 17:53:56.736449 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Sep 12 17:53:56.736502 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Sep 12 17:53:56.736555 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Sep 12 17:53:56.736609 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Sep 12 17:53:56.736662 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Sep 12 17:53:56.736714 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Sep 12 17:53:56.736766 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Sep 12 17:53:56.736775 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9 Sep 12 17:53:56.736781 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0 Sep 12 17:53:56.736788 kernel: ACPI: PCI: Interrupt link LNKB disabled Sep 12 17:53:56.736794 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Sep 12 17:53:56.736802 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10 Sep 12 17:53:56.736808 kernel: iommu: Default domain type: Translated Sep 12 17:53:56.736814 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Sep 12 17:53:56.736820 kernel: PCI: Using ACPI for IRQ routing Sep 12 17:53:56.736826 kernel: PCI: pci_cache_line_size set to 64 bytes Sep 12 17:53:56.736832 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff] Sep 12 17:53:56.736840 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff] Sep 12 17:53:56.736897 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device Sep 12 17:53:56.736948 kernel: pci 0000:00:0f.0: vgaarb: bridge control possible Sep 12 17:53:56.737001 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Sep 12 17:53:56.737010 kernel: vgaarb: loaded Sep 12 17:53:56.737016 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 Sep 12 17:53:56.737022 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter Sep 12 17:53:56.737028 kernel: clocksource: Switched to clocksource tsc-early Sep 12 17:53:56.737034 kernel: VFS: Disk quotas dquot_6.6.0 Sep 12 17:53:56.737056 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Sep 12 17:53:56.737063 kernel: pnp: PnP ACPI init Sep 12 17:53:56.737138 kernel: system 00:00: [io 0x1000-0x103f] has been reserved Sep 12 17:53:56.737199 kernel: system 
00:00: [io 0x1040-0x104f] has been reserved Sep 12 17:53:56.737247 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved Sep 12 17:53:56.737299 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved Sep 12 17:53:56.737350 kernel: pnp 00:06: [dma 2] Sep 12 17:53:56.737402 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved Sep 12 17:53:56.737450 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved Sep 12 17:53:56.737499 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved Sep 12 17:53:56.737508 kernel: pnp: PnP ACPI: found 8 devices Sep 12 17:53:56.737514 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Sep 12 17:53:56.737520 kernel: NET: Registered PF_INET protocol family Sep 12 17:53:56.737526 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Sep 12 17:53:56.737533 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Sep 12 17:53:56.737539 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Sep 12 17:53:56.737545 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Sep 12 17:53:56.737553 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Sep 12 17:53:56.737559 kernel: TCP: Hash tables configured (established 16384 bind 16384) Sep 12 17:53:56.737565 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Sep 12 17:53:56.737571 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Sep 12 17:53:56.737577 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Sep 12 17:53:56.737583 kernel: NET: Registered PF_XDP protocol family Sep 12 17:53:56.737635 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 Sep 12 17:53:56.737688 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Sep 12 17:53:56.737740 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Sep 12 17:53:56.737794 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Sep 12 17:53:56.737847 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Sep 12 17:53:56.737899 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Sep 12 17:53:56.737952 kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Sep 12 17:53:56.738004 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Sep 12 17:53:56.738076 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Sep 12 17:53:56.738137 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Sep 12 17:53:56.738197 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Sep 12 17:53:56.738249 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Sep 12 17:53:56.738301 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Sep 12 17:53:56.738353 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Sep 12 17:53:56.738404 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Sep 12 17:53:56.738455 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Sep 12 
17:53:56.738524 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Sep 12 17:53:56.738577 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Sep 12 17:53:56.738631 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Sep 12 17:53:56.738684 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Sep 12 17:53:56.738735 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Sep 12 17:53:56.738787 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Sep 12 17:53:56.738840 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000 Sep 12 17:53:56.738893 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref]: assigned Sep 12 17:53:56.738945 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref]: assigned Sep 12 17:53:56.738998 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.739062 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.739116 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.739174 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.739227 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.739280 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.739331 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.739383 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.739436 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.739491 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.739543 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.739595 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.739647 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.739700 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.739752 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.739804 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.739859 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.739912 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.739964 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.740016 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.740082 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.740137 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.740189 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.740242 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.740297 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.740350 kernel: pci 0000:00:17.5: 
bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.740402 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.740471 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.740532 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.740584 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.740635 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.740686 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.740740 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.740791 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.740842 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.740893 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.740944 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.740995 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.741057 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.741111 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.741170 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.741221 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.741273 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.741324 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.741374 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.741425 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.741493 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.741543 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.741593 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.741646 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.741696 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.741746 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.741796 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.741845 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.741896 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.741946 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.741996 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.742057 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.742112 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.742168 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.742218 kernel: pci 0000:00:17.4: bridge window [io size 
0x1000]: can't assign; no space Sep 12 17:53:56.742268 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.742317 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.742367 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.742416 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.742466 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.742515 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.742565 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.742617 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.742667 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.742717 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.742766 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.742816 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.742865 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.742918 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.742968 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.743018 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.743074 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.743126 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.743177 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.743226 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.743285 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.743336 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space Sep 12 17:53:56.743389 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign Sep 12 17:53:56.743441 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Sep 12 17:53:56.743492 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] Sep 12 17:53:56.743542 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Sep 12 17:53:56.743591 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Sep 12 17:53:56.743641 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Sep 12 17:53:56.743693 kernel: pci 0000:03:00.0: ROM [mem 0xfd500000-0xfd50ffff pref]: assigned Sep 12 17:53:56.743744 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Sep 12 17:53:56.743796 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Sep 12 17:53:56.743846 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Sep 12 17:53:56.743896 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref] Sep 12 17:53:56.743948 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Sep 12 17:53:56.743997 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Sep 12 17:53:56.744059 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Sep 12 17:53:56.744112 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit 
pref] Sep 12 17:53:56.744167 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Sep 12 17:53:56.744217 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Sep 12 17:53:56.744267 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Sep 12 17:53:56.744319 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Sep 12 17:53:56.744370 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Sep 12 17:53:56.744420 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Sep 12 17:53:56.744480 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Sep 12 17:53:56.744538 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Sep 12 17:53:56.744590 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Sep 12 17:53:56.744639 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Sep 12 17:53:56.744690 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Sep 12 17:53:56.744743 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Sep 12 17:53:56.744792 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Sep 12 17:53:56.744842 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Sep 12 17:53:56.744892 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Sep 12 17:53:56.744942 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Sep 12 17:53:56.744992 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Sep 12 17:53:56.745056 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Sep 12 17:53:56.745115 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Sep 12 17:53:56.745172 kernel: pci 0000:0b:00.0: ROM [mem 0xfd400000-0xfd40ffff pref]: assigned Sep 12 17:53:56.745223 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Sep 12 17:53:56.745273 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Sep 12 17:53:56.745323 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Sep 12 17:53:56.745373 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref] Sep 12 17:53:56.745424 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Sep 12 17:53:56.745473 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Sep 12 17:53:56.745523 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Sep 12 17:53:56.745576 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Sep 12 17:53:56.745628 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Sep 12 17:53:56.745678 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Sep 12 17:53:56.745728 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Sep 12 17:53:56.745778 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Sep 12 17:53:56.745828 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Sep 12 17:53:56.745878 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Sep 12 17:53:56.745928 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Sep 12 17:53:56.745981 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Sep 12 17:53:56.746031 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Sep 12 17:53:56.746092 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Sep 12 17:53:56.746142 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Sep 12 17:53:56.746192 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Sep 12 17:53:56.746241 kernel: pci 
0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Sep 12 17:53:56.746299 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Sep 12 17:53:56.746350 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Sep 12 17:53:56.746403 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Sep 12 17:53:56.746454 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Sep 12 17:53:56.746503 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Sep 12 17:53:56.746553 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Sep 12 17:53:56.746604 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Sep 12 17:53:56.746654 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Sep 12 17:53:56.746704 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Sep 12 17:53:56.746755 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Sep 12 17:53:56.746808 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Sep 12 17:53:56.746857 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Sep 12 17:53:56.746907 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Sep 12 17:53:56.746957 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Sep 12 17:53:56.747007 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Sep 12 17:53:56.747073 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Sep 12 17:53:56.747124 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Sep 12 17:53:56.747178 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Sep 12 17:53:56.747228 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Sep 12 17:53:56.747281 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Sep 12 17:53:56.747330 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Sep 12 17:53:56.747381 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Sep 12 17:53:56.747430 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Sep 12 17:53:56.747480 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Sep 12 17:53:56.747530 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Sep 12 17:53:56.747580 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Sep 12 17:53:56.747629 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Sep 12 17:53:56.747682 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Sep 12 17:53:56.747732 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Sep 12 17:53:56.747782 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Sep 12 17:53:56.747832 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Sep 12 17:53:56.747881 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Sep 12 17:53:56.747930 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Sep 12 17:53:56.747982 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Sep 12 17:53:56.748034 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Sep 12 17:53:56.748093 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Sep 12 17:53:56.748154 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Sep 12 17:53:56.748208 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Sep 12 17:53:56.748258 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Sep 12 17:53:56.748308 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] 
Sep 12 17:53:56.748358 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Sep 12 17:53:56.748408 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Sep 12 17:53:56.748459 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Sep 12 17:53:56.748509 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Sep 12 17:53:56.748563 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Sep 12 17:53:56.748613 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Sep 12 17:53:56.748662 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Sep 12 17:53:56.748712 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Sep 12 17:53:56.748761 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Sep 12 17:53:56.748811 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Sep 12 17:53:56.748864 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Sep 12 17:53:56.748914 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Sep 12 17:53:56.748964 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Sep 12 17:53:56.749014 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Sep 12 17:53:56.749121 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Sep 12 17:53:56.749173 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Sep 12 17:53:56.749225 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Sep 12 17:53:56.749275 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Sep 12 17:53:56.749329 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Sep 12 17:53:56.749378 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] Sep 12 17:53:56.749424 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] Sep 12 17:53:56.749468 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] Sep 12 17:53:56.749512 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] Sep 12 17:53:56.749556 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] Sep 12 17:53:56.749604 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] Sep 12 17:53:56.749653 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] Sep 12 17:53:56.749699 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] Sep 12 17:53:56.749744 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] Sep 12 17:53:56.749790 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] Sep 12 17:53:56.749835 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] Sep 12 17:53:56.749880 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] Sep 12 17:53:56.749937 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] Sep 12 17:53:56.749991 kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff] Sep 12 17:53:56.750049 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] Sep 12 17:53:56.750112 kernel: pci_bus 0000:03: resource 2 [mem 0xc0000000-0xc01fffff 64bit pref] Sep 12 17:53:56.750170 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] Sep 12 17:53:56.750242 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] Sep 12 17:53:56.750288 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] Sep 12 17:53:56.750337 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] Sep 12 17:53:56.750385 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] Sep 
12 17:53:56.750436 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] Sep 12 17:53:56.750486 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] Sep 12 17:53:56.750531 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] Sep 12 17:53:56.750580 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] Sep 12 17:53:56.750626 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] Sep 12 17:53:56.750711 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] Sep 12 17:53:56.750756 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] Sep 12 17:53:56.750804 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] Sep 12 17:53:56.750868 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] Sep 12 17:53:56.750918 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] Sep 12 17:53:56.750964 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] Sep 12 17:53:56.751018 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] Sep 12 17:53:56.751081 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] Sep 12 17:53:56.751130 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] Sep 12 17:53:56.751182 kernel: pci_bus 0000:0c: resource 0 [io 0x9000-0x9fff] Sep 12 17:53:56.751228 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] Sep 12 17:53:56.751273 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 64bit pref] Sep 12 17:53:56.751324 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] Sep 12 17:53:56.751371 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] Sep 12 17:53:56.751416 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] Sep 12 17:53:56.751465 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] Sep 12 17:53:56.751510 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] Sep 12 17:53:56.751561 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] Sep 12 17:53:56.751607 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] Sep 12 17:53:56.751660 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] Sep 12 17:53:56.751706 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] Sep 12 17:53:56.751758 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] Sep 12 17:53:56.751815 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] Sep 12 17:53:56.751866 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] Sep 12 17:53:56.751912 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] Sep 12 17:53:56.751966 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] Sep 12 17:53:56.752013 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] Sep 12 17:53:56.752079 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] Sep 12 17:53:56.752140 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] Sep 12 17:53:56.752187 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] Sep 12 17:53:56.752233 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] Sep 12 17:53:56.752282 kernel: pci_bus 0000:15: resource 0 [io 0xe000-0xefff] Sep 12 17:53:56.752332 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] Sep 12 17:53:56.752378 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] Sep 12 17:53:56.752426 kernel: 
pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] Sep 12 17:53:56.752473 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] Sep 12 17:53:56.752523 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] Sep 12 17:53:56.752569 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] Sep 12 17:53:56.752623 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] Sep 12 17:53:56.752671 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] Sep 12 17:53:56.753053 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] Sep 12 17:53:56.753113 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] Sep 12 17:53:56.753166 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] Sep 12 17:53:56.753213 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] Sep 12 17:53:56.753266 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Sep 12 17:53:56.753313 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] Sep 12 17:53:56.753358 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] Sep 12 17:53:56.753410 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] Sep 12 17:53:56.753457 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] Sep 12 17:53:56.753503 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] Sep 12 17:53:56.753552 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] Sep 12 17:53:56.753601 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] Sep 12 17:53:56.753651 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] Sep 12 17:53:56.753697 kernel: pci_bus 0000:1e: resource 2 [mem 0xe6d00000-0xe6dfffff 64bit pref] Sep 12 17:53:56.753746 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] Sep 12 17:53:56.753792 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] Sep 12 17:53:56.753844 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] Sep 12 17:53:56.753893 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] Sep 12 17:53:56.753943 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] Sep 12 17:53:56.753989 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] Sep 12 17:53:56.754605 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] Sep 12 17:53:56.754666 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] Sep 12 17:53:56.754730 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Sep 12 17:53:56.754741 kernel: PCI: CLS 32 bytes, default 64 Sep 12 17:53:56.754749 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Sep 12 17:53:56.754756 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Sep 12 17:53:56.754762 kernel: clocksource: Switched to clocksource tsc Sep 12 17:53:56.754768 kernel: Initialise system trusted keyrings Sep 12 17:53:56.754774 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Sep 12 17:53:56.754780 kernel: Key type asymmetric registered Sep 12 17:53:56.754786 kernel: Asymmetric key parser 'x509' registered Sep 12 17:53:56.754791 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Sep 12 17:53:56.754798 kernel: io scheduler mq-deadline registered Sep 12 17:53:56.754805 kernel: io scheduler kyber registered Sep 12 17:53:56.754811 kernel: io scheduler bfq 
registered Sep 12 17:53:56.754865 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 Sep 12 17:53:56.754918 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 17:53:56.754971 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 Sep 12 17:53:56.755022 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 17:53:56.756107 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 Sep 12 17:53:56.756170 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 17:53:56.756225 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 Sep 12 17:53:56.756279 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 17:53:56.756332 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 Sep 12 17:53:56.756383 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 17:53:56.756435 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 Sep 12 17:53:56.756486 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 17:53:56.756539 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 Sep 12 17:53:56.756591 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 17:53:56.756643 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 Sep 12 17:53:56.756693 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 17:53:56.756746 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 Sep 12 17:53:56.756796 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 17:53:56.756847 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 Sep 12 17:53:56.756899 kernel: pcieport 0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 17:53:56.756951 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 Sep 12 17:53:56.757001 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 17:53:56.758085 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 Sep 12 17:53:56.758164 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 17:53:56.758224 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 Sep 12 17:53:56.758278 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 17:53:56.758333 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 Sep 12 17:53:56.758387 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ 
Sep 12 17:53:56.758439 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 Sep 12 17:53:56.758490 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 17:53:56.758542 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 Sep 12 17:53:56.758593 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 17:53:56.758644 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 Sep 12 17:53:56.758696 kernel: pcieport 0000:00:17.0: pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 17:53:56.758750 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 Sep 12 17:53:56.758801 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 17:53:56.758855 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42 Sep 12 17:53:56.758905 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 17:53:56.758957 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43 Sep 12 17:53:56.759008 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 17:53:56.759071 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44 Sep 12 17:53:56.759122 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 17:53:56.759177 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45 Sep 12 17:53:56.759228 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 17:53:56.759280 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46 Sep 12 17:53:56.759332 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 17:53:56.759383 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47 Sep 12 17:53:56.759435 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 17:53:56.759488 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48 Sep 12 17:53:56.759541 kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 17:53:56.759593 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49 Sep 12 17:53:56.759644 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 17:53:56.759696 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50 Sep 12 17:53:56.759944 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 17:53:56.760004 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51 Sep 12 17:53:56.760399 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 
17:53:56.760460 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52 Sep 12 17:53:56.760518 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 17:53:56.760573 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53 Sep 12 17:53:56.760625 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 17:53:56.760678 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54 Sep 12 17:53:56.760729 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 17:53:56.760782 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55 Sep 12 17:53:56.760833 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 17:53:56.760844 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Sep 12 17:53:56.760851 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 12 17:53:56.760858 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Sep 12 17:53:56.760864 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 Sep 12 17:53:56.760870 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Sep 12 17:53:56.760877 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Sep 12 17:53:56.760931 kernel: rtc_cmos 00:01: registered as rtc0 Sep 12 17:53:56.760943 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Sep 12 17:53:56.760990 kernel: rtc_cmos 00:01: setting system clock to 2025-09-12T17:53:56 UTC (1757699636) Sep 12 17:53:56.761383 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram Sep 12 17:53:56.761398 kernel: intel_pstate: CPU model not supported Sep 12 17:53:56.761405 kernel: NET: Registered PF_INET6 protocol family Sep 12 17:53:56.761412 kernel: Segment Routing with IPv6 Sep 12 17:53:56.761418 kernel: In-situ OAM (IOAM) with IPv6 Sep 12 17:53:56.761424 kernel: NET: Registered PF_PACKET protocol family Sep 12 17:53:56.761431 kernel: Key type dns_resolver registered Sep 12 17:53:56.761439 kernel: IPI shorthand broadcast: enabled Sep 12 17:53:56.761446 kernel: sched_clock: Marking stable (2587444931, 162521932)->(2760842072, -10875209) Sep 12 17:53:56.761452 kernel: registered taskstats version 1 Sep 12 17:53:56.761458 kernel: Loading compiled-in X.509 certificates Sep 12 17:53:56.761465 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.47-flatcar: f1ae8d6e9bfae84d90f4136cf098b0465b2a5bd7' Sep 12 17:53:56.761471 kernel: Demotion targets for Node 0: null Sep 12 17:53:56.761477 kernel: Key type .fscrypt registered Sep 12 17:53:56.761484 kernel: Key type fscrypt-provisioning registered Sep 12 17:53:56.761490 kernel: ima: No TPM chip found, activating TPM-bypass! Sep 12 17:53:56.761497 kernel: ima: Allocated hash algorithm: sha1 Sep 12 17:53:56.761504 kernel: ima: No architecture policies found Sep 12 17:53:56.761510 kernel: clk: Disabling unused clocks Sep 12 17:53:56.761516 kernel: Warning: unable to open an initial console. 
Sep 12 17:53:56.761523 kernel: Freeing unused kernel image (initmem) memory: 54040K Sep 12 17:53:56.761529 kernel: Write protecting the kernel read-only data: 24576k Sep 12 17:53:56.761536 kernel: Freeing unused kernel image (rodata/data gap) memory: 280K Sep 12 17:53:56.761542 kernel: Run /init as init process Sep 12 17:53:56.761549 kernel: with arguments: Sep 12 17:53:56.761556 kernel: /init Sep 12 17:53:56.761562 kernel: with environment: Sep 12 17:53:56.761568 kernel: HOME=/ Sep 12 17:53:56.761575 kernel: TERM=linux Sep 12 17:53:56.761581 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 12 17:53:56.761588 systemd[1]: Successfully made /usr/ read-only. Sep 12 17:53:56.761597 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 12 17:53:56.761605 systemd[1]: Detected virtualization vmware. Sep 12 17:53:56.761612 systemd[1]: Detected architecture x86-64. Sep 12 17:53:56.761618 systemd[1]: Running in initrd. Sep 12 17:53:56.761625 systemd[1]: No hostname configured, using default hostname. Sep 12 17:53:56.761631 systemd[1]: Hostname set to . Sep 12 17:53:56.761638 systemd[1]: Initializing machine ID from random generator. Sep 12 17:53:56.761644 systemd[1]: Queued start job for default target initrd.target. Sep 12 17:53:56.761651 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 17:53:56.761657 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 17:53:56.761666 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Sep 12 17:53:56.761674 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 12 17:53:56.761680 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 12 17:53:56.761688 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 12 17:53:56.761695 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 12 17:53:56.761701 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 12 17:53:56.761709 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 17:53:56.761716 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 12 17:53:56.761722 systemd[1]: Reached target paths.target - Path Units. Sep 12 17:53:56.761729 systemd[1]: Reached target slices.target - Slice Units. Sep 12 17:53:56.761735 systemd[1]: Reached target swap.target - Swaps. Sep 12 17:53:56.761742 systemd[1]: Reached target timers.target - Timer Units. Sep 12 17:53:56.761749 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 12 17:53:56.761755 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 12 17:53:56.761762 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 12 17:53:56.761769 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Sep 12 17:53:56.761776 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. 
Sep 12 17:53:56.761783 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 12 17:53:56.761789 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 17:53:56.761795 systemd[1]: Reached target sockets.target - Socket Units. Sep 12 17:53:56.761802 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 12 17:53:56.761808 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 12 17:53:56.761815 systemd[1]: Finished network-cleanup.service - Network Cleanup. Sep 12 17:53:56.761823 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 12 17:53:56.761831 systemd[1]: Starting systemd-fsck-usr.service... Sep 12 17:53:56.761837 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 12 17:53:56.761845 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 12 17:53:56.761851 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 17:53:56.761858 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 12 17:53:56.761882 systemd-journald[244]: Collecting audit messages is disabled. Sep 12 17:53:56.761900 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 17:53:56.761907 systemd[1]: Finished systemd-fsck-usr.service. Sep 12 17:53:56.761915 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 12 17:53:56.761922 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 12 17:53:56.761929 kernel: Bridge firewalling registered Sep 12 17:53:56.761935 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 12 17:53:56.761942 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 12 17:53:56.761949 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 17:53:56.761956 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 12 17:53:56.761963 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 12 17:53:56.761971 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 12 17:53:56.761978 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 12 17:53:56.761985 systemd-journald[244]: Journal started Sep 12 17:53:56.762001 systemd-journald[244]: Runtime Journal (/run/log/journal/c7b6d8e545084b1f95e13ca4cbba534d) is 4.8M, max 38.8M, 34M free. Sep 12 17:53:56.712491 systemd-modules-load[245]: Inserted module 'overlay' Sep 12 17:53:56.733180 systemd-modules-load[245]: Inserted module 'br_netfilter' Sep 12 17:53:56.763114 systemd[1]: Started systemd-journald.service - Journal Service. Sep 12 17:53:56.764110 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 17:53:56.766325 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 12 17:53:56.768788 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 12 17:53:56.770096 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... 
Sep 12 17:53:56.774838 systemd-tmpfiles[275]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 12 17:53:56.777133 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 17:53:56.779124 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 12 17:53:56.781889 dracut-cmdline[279]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=271a44cc8ea1639cfb6fdf777202a5f025fda0b3ce9b293cc4e0e7047aecb858 Sep 12 17:53:56.810513 systemd-resolved[289]: Positive Trust Anchors: Sep 12 17:53:56.810740 systemd-resolved[289]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 12 17:53:56.810765 systemd-resolved[289]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 12 17:53:56.813102 systemd-resolved[289]: Defaulting to hostname 'linux'. Sep 12 17:53:56.813758 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 12 17:53:56.813905 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 12 17:53:56.836052 kernel: SCSI subsystem initialized Sep 12 17:53:56.853052 kernel: Loading iSCSI transport class v2.0-870. Sep 12 17:53:56.861048 kernel: iscsi: registered transport (tcp) Sep 12 17:53:56.882258 kernel: iscsi: registered transport (qla4xxx) Sep 12 17:53:56.882280 kernel: QLogic iSCSI HBA Driver Sep 12 17:53:56.892662 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 12 17:53:56.904775 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 17:53:56.905572 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 12 17:53:56.927559 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 12 17:53:56.928374 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 12 17:53:56.965052 kernel: raid6: avx2x4 gen() 48745 MB/s Sep 12 17:53:56.982049 kernel: raid6: avx2x2 gen() 53395 MB/s Sep 12 17:53:56.999247 kernel: raid6: avx2x1 gen() 44403 MB/s Sep 12 17:53:56.999261 kernel: raid6: using algorithm avx2x2 gen() 53395 MB/s Sep 12 17:53:57.017197 kernel: raid6: .... xor() 32416 MB/s, rmw enabled Sep 12 17:53:57.017210 kernel: raid6: using avx2x2 recovery algorithm Sep 12 17:53:57.031049 kernel: xor: automatically using best checksumming function avx Sep 12 17:53:57.130055 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 12 17:53:57.133642 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 12 17:53:57.134652 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... 
Sep 12 17:53:57.154619 systemd-udevd[494]: Using default interface naming scheme 'v255'. Sep 12 17:53:57.157844 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 17:53:57.158804 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 12 17:53:57.173128 dracut-pre-trigger[500]: rd.md=0: removing MD RAID activation Sep 12 17:53:57.185337 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 12 17:53:57.186170 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 12 17:53:57.261888 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 17:53:57.263604 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 12 17:53:57.328060 kernel: VMware PVSCSI driver - version 1.0.7.0-k Sep 12 17:53:57.333016 kernel: vmw_pvscsi: using 64bit dma Sep 12 17:53:57.333054 kernel: vmw_pvscsi: max_id: 16 Sep 12 17:53:57.333066 kernel: vmw_pvscsi: setting ring_pages to 8 Sep 12 17:53:57.337143 kernel: vmw_pvscsi: enabling reqCallThreshold Sep 12 17:53:57.337160 kernel: vmw_pvscsi: driver-based request coalescing enabled Sep 12 17:53:57.337172 kernel: vmw_pvscsi: using MSI-X Sep 12 17:53:57.337180 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254 Sep 12 17:53:57.338047 kernel: VMware vmxnet3 virtual NIC driver - version 1.9.0.0-k-NAPI Sep 12 17:53:57.340046 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0 Sep 12 17:53:57.344074 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2 Sep 12 17:53:57.344177 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps Sep 12 17:53:57.344246 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6 Sep 12 17:53:57.367961 (udev-worker)[536]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte. Sep 12 17:53:57.368250 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0 Sep 12 17:53:57.374032 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 17:53:57.374115 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 17:53:57.374576 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 17:53:57.375222 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 17:53:57.384049 kernel: cryptd: max_cpu_qlen set to 1000 Sep 12 17:53:57.386057 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input2 Sep 12 17:53:57.390066 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB) Sep 12 17:53:57.391410 kernel: libata version 3.00 loaded. Sep 12 17:53:57.391426 kernel: sd 0:0:0:0: [sda] Write Protect is off Sep 12 17:53:57.391512 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00 Sep 12 17:53:57.392876 kernel: sd 0:0:0:0: [sda] Cache data unavailable Sep 12 17:53:57.392954 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through Sep 12 17:53:57.394492 kernel: AES CTR mode by8 optimization enabled Sep 12 17:53:57.404076 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 12 17:53:57.404100 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Sep 12 17:53:57.410052 kernel: ata_piix 0000:00:07.1: version 2.13 Sep 12 17:53:57.411403 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Sep 12 17:53:57.414120 kernel: scsi host1: ata_piix Sep 12 17:53:57.414200 kernel: scsi host2: ata_piix Sep 12 17:53:57.416183 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14 lpm-pol 0 Sep 12 17:53:57.416203 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15 lpm-pol 0 Sep 12 17:53:57.443099 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM. Sep 12 17:53:57.452135 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT. Sep 12 17:53:57.457420 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Sep 12 17:53:57.461737 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A. Sep 12 17:53:57.461973 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A. Sep 12 17:53:57.462629 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 12 17:53:57.502296 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 12 17:53:57.516056 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 12 17:53:57.584105 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33 Sep 12 17:53:57.591061 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5 Sep 12 17:53:57.632089 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray Sep 12 17:53:57.632218 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Sep 12 17:53:57.654078 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Sep 12 17:53:57.977416 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 12 17:53:57.977835 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 12 17:53:57.978004 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 17:53:57.978283 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 12 17:53:57.979113 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 12 17:53:57.992028 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 12 17:53:58.513072 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 12 17:53:58.513810 disk-uuid[642]: The operation has completed successfully. Sep 12 17:53:58.582569 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 12 17:53:58.582642 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 12 17:53:58.583437 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 12 17:53:58.592812 sh[673]: Success Sep 12 17:53:58.606969 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 12 17:53:58.606989 kernel: device-mapper: uevent: version 1.0.3 Sep 12 17:53:58.606997 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 12 17:53:58.614051 kernel: device-mapper: verity: sha256 using shash "sha256-avx2" Sep 12 17:53:58.641146 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 12 17:53:58.643076 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 12 17:53:58.652178 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
Sep 12 17:53:58.662053 kernel: BTRFS: device fsid 74707491-1b86-4926-8bdb-c533ce2a0c32 devid 1 transid 38 /dev/mapper/usr (254:0) scanned by mount (685) Sep 12 17:53:58.662072 kernel: BTRFS info (device dm-0): first mount of filesystem 74707491-1b86-4926-8bdb-c533ce2a0c32 Sep 12 17:53:58.664055 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 12 17:53:58.671091 kernel: BTRFS info (device dm-0): enabling ssd optimizations Sep 12 17:53:58.671108 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 12 17:53:58.671115 kernel: BTRFS info (device dm-0): enabling free space tree Sep 12 17:53:58.673973 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 12 17:53:58.674299 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 12 17:53:58.674871 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments... Sep 12 17:53:58.676099 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 12 17:53:58.707053 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (708) Sep 12 17:53:58.710297 kernel: BTRFS info (device sda6): first mount of filesystem 5410dae6-8d31-4ea4-a4b4-868064445761 Sep 12 17:53:58.710317 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 12 17:53:58.717387 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 12 17:53:58.717405 kernel: BTRFS info (device sda6): enabling free space tree Sep 12 17:53:58.722059 kernel: BTRFS info (device sda6): last unmount of filesystem 5410dae6-8d31-4ea4-a4b4-868064445761 Sep 12 17:53:58.723056 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 12 17:53:58.724153 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 12 17:53:58.741826 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Sep 12 17:53:58.742635 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 12 17:53:58.840377 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 12 17:53:58.841606 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 12 17:53:58.841801 ignition[729]: Ignition 2.21.0 Sep 12 17:53:58.841805 ignition[729]: Stage: fetch-offline Sep 12 17:53:58.841925 ignition[729]: no configs at "/usr/lib/ignition/base.d" Sep 12 17:53:58.841932 ignition[729]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Sep 12 17:53:58.842004 ignition[729]: parsed url from cmdline: "" Sep 12 17:53:58.842007 ignition[729]: no config URL provided Sep 12 17:53:58.842013 ignition[729]: reading system config file "/usr/lib/ignition/user.ign" Sep 12 17:53:58.842017 ignition[729]: no config at "/usr/lib/ignition/user.ign" Sep 12 17:53:58.842380 ignition[729]: config successfully fetched Sep 12 17:53:58.842398 ignition[729]: parsing config with SHA512: 329faa81f7686cbc5529dc16a6d69cf374179dfe5c29dccdf87d2d6eff9d2b55a8511d4166a632053e7bc06efc328f5ce16c924a1933e3b504e064b8737f1117 Sep 12 17:53:58.846597 unknown[729]: fetched base config from "system" Sep 12 17:53:58.846602 unknown[729]: fetched user config from "vmware" Sep 12 17:53:58.846820 ignition[729]: fetch-offline: fetch-offline passed Sep 12 17:53:58.846857 ignition[729]: Ignition finished successfully Sep 12 17:53:58.849266 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). 
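The fetch-offline stage above finds no config URL on the kernel command line, fetches the base config from "system" and the user config from the "vmware" guestinfo provider, and logs the SHA512 digest of the config it parses. A minimal sketch, assuming Python is available, of computing the same kind of digest for a local config blob; the default path below is just the user.ign location named in the log and is otherwise arbitrary:

import hashlib
import sys

def config_sha512(path: str) -> str:
    # Read the raw config bytes and return the hex SHA512 digest,
    # comparable in form to the digest Ignition logs while parsing.
    with open(path, "rb") as f:
        return hashlib.sha512(f.read()).hexdigest()

if __name__ == "__main__":
    path = sys.argv[1] if len(sys.argv) > 1 else "/usr/lib/ignition/user.ign"
    print(config_sha512(path))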
Sep 12 17:53:58.863181 systemd-networkd[867]: lo: Link UP Sep 12 17:53:58.863339 systemd-networkd[867]: lo: Gained carrier Sep 12 17:53:58.864030 systemd-networkd[867]: Enumeration completed Sep 12 17:53:58.864220 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 12 17:53:58.864467 systemd[1]: Reached target network.target - Network. Sep 12 17:53:58.864597 systemd-networkd[867]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network. Sep 12 17:53:58.866286 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Sep 12 17:53:58.867600 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Sep 12 17:53:58.867696 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Sep 12 17:53:58.867653 systemd-networkd[867]: ens192: Link UP Sep 12 17:53:58.867655 systemd-networkd[867]: ens192: Gained carrier Sep 12 17:53:58.868188 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 12 17:53:58.883857 ignition[871]: Ignition 2.21.0 Sep 12 17:53:58.883866 ignition[871]: Stage: kargs Sep 12 17:53:58.883945 ignition[871]: no configs at "/usr/lib/ignition/base.d" Sep 12 17:53:58.883951 ignition[871]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Sep 12 17:53:58.885434 ignition[871]: kargs: kargs passed Sep 12 17:53:58.885465 ignition[871]: Ignition finished successfully Sep 12 17:53:58.887024 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 12 17:53:58.887817 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Sep 12 17:53:58.900590 ignition[878]: Ignition 2.21.0 Sep 12 17:53:58.900602 ignition[878]: Stage: disks Sep 12 17:53:58.900678 ignition[878]: no configs at "/usr/lib/ignition/base.d" Sep 12 17:53:58.900683 ignition[878]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Sep 12 17:53:58.902796 ignition[878]: disks: disks passed Sep 12 17:53:58.902828 ignition[878]: Ignition finished successfully Sep 12 17:53:58.903607 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 12 17:53:58.903932 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 12 17:53:58.904360 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 12 17:53:58.904644 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 12 17:53:58.904863 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 17:53:58.905100 systemd[1]: Reached target basic.target - Basic System. Sep 12 17:53:58.905780 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 12 17:53:58.919417 systemd-fsck[887]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Sep 12 17:53:58.921279 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 12 17:53:58.922352 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 12 17:53:58.995055 kernel: EXT4-fs (sda9): mounted filesystem 26739aba-b0be-4ce3-bfbd-ca4dbcbe2426 r/w with ordered data mode. Quota mode: none. Sep 12 17:53:58.994970 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 12 17:53:58.995288 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 12 17:53:58.996344 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 12 17:53:58.998069 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... 
Sep 12 17:53:58.998476 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Sep 12 17:53:58.998500 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 12 17:53:58.998513 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 12 17:53:59.004019 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 12 17:53:59.004749 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 12 17:53:59.010784 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (895) Sep 12 17:53:59.010806 kernel: BTRFS info (device sda6): first mount of filesystem 5410dae6-8d31-4ea4-a4b4-868064445761 Sep 12 17:53:59.011711 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 12 17:53:59.015052 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 12 17:53:59.015068 kernel: BTRFS info (device sda6): enabling free space tree Sep 12 17:53:59.016392 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 12 17:53:59.036455 initrd-setup-root[919]: cut: /sysroot/etc/passwd: No such file or directory Sep 12 17:53:59.038791 initrd-setup-root[926]: cut: /sysroot/etc/group: No such file or directory Sep 12 17:53:59.040984 initrd-setup-root[933]: cut: /sysroot/etc/shadow: No such file or directory Sep 12 17:53:59.043550 initrd-setup-root[940]: cut: /sysroot/etc/gshadow: No such file or directory Sep 12 17:53:59.097188 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 12 17:53:59.098099 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 12 17:53:59.099104 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 12 17:53:59.112050 kernel: BTRFS info (device sda6): last unmount of filesystem 5410dae6-8d31-4ea4-a4b4-868064445761 Sep 12 17:53:59.129318 ignition[1008]: INFO : Ignition 2.21.0 Sep 12 17:53:59.129318 ignition[1008]: INFO : Stage: mount Sep 12 17:53:59.129640 ignition[1008]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 17:53:59.129640 ignition[1008]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Sep 12 17:53:59.130532 ignition[1008]: INFO : mount: mount passed Sep 12 17:53:59.130532 ignition[1008]: INFO : Ignition finished successfully Sep 12 17:53:59.130021 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Sep 12 17:53:59.131438 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 12 17:53:59.132297 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 12 17:53:59.850983 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 12 17:53:59.852262 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 12 17:53:59.887055 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1020) Sep 12 17:53:59.887083 kernel: BTRFS info (device sda6): first mount of filesystem 5410dae6-8d31-4ea4-a4b4-868064445761 Sep 12 17:53:59.889741 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 12 17:53:59.894423 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 12 17:53:59.894445 kernel: BTRFS info (device sda6): enabling free space tree Sep 12 17:53:59.895792 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Sep 12 17:53:59.913805 ignition[1036]: INFO : Ignition 2.21.0 Sep 12 17:53:59.913805 ignition[1036]: INFO : Stage: files Sep 12 17:53:59.914223 ignition[1036]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 17:53:59.914223 ignition[1036]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Sep 12 17:53:59.914725 ignition[1036]: DEBUG : files: compiled without relabeling support, skipping Sep 12 17:53:59.915465 ignition[1036]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 12 17:53:59.915641 ignition[1036]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 12 17:53:59.917590 ignition[1036]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 12 17:53:59.917783 ignition[1036]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 12 17:53:59.918014 unknown[1036]: wrote ssh authorized keys file for user: core Sep 12 17:53:59.918341 ignition[1036]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 12 17:53:59.921007 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Sep 12 17:53:59.921007 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1 Sep 12 17:53:59.968533 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 12 17:54:00.227626 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Sep 12 17:54:00.227626 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 12 17:54:00.227626 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Sep 12 17:54:00.227626 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 12 17:54:00.227626 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 12 17:54:00.227626 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 12 17:54:00.227626 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 12 17:54:00.227626 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 12 17:54:00.227626 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 12 17:54:00.229955 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 12 17:54:00.229955 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 12 17:54:00.229955 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 12 17:54:00.231127 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(9): 
[finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 12 17:54:00.231488 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 12 17:54:00.231488 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1 Sep 12 17:54:00.572229 systemd-networkd[867]: ens192: Gained IPv6LL Sep 12 17:54:00.703863 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 12 17:54:01.080630 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 12 17:54:01.080630 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network" Sep 12 17:54:01.081835 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network" Sep 12 17:54:01.081835 ignition[1036]: INFO : files: op(c): [started] processing unit "prepare-helm.service" Sep 12 17:54:01.084332 ignition[1036]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 12 17:54:01.084835 ignition[1036]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 12 17:54:01.084835 ignition[1036]: INFO : files: op(c): [finished] processing unit "prepare-helm.service" Sep 12 17:54:01.084835 ignition[1036]: INFO : files: op(e): [started] processing unit "coreos-metadata.service" Sep 12 17:54:01.084835 ignition[1036]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Sep 12 17:54:01.084835 ignition[1036]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Sep 12 17:54:01.084835 ignition[1036]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service" Sep 12 17:54:01.084835 ignition[1036]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service" Sep 12 17:54:01.107526 ignition[1036]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service" Sep 12 17:54:01.109996 ignition[1036]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service" Sep 12 17:54:01.109996 ignition[1036]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service" Sep 12 17:54:01.109996 ignition[1036]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service" Sep 12 17:54:01.109996 ignition[1036]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service" Sep 12 17:54:01.111420 ignition[1036]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 12 17:54:01.111420 ignition[1036]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 12 17:54:01.111420 ignition[1036]: INFO : files: files passed Sep 12 17:54:01.111420 ignition[1036]: INFO : Ignition finished 
successfully Sep 12 17:54:01.112176 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 12 17:54:01.113187 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 12 17:54:01.114110 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Sep 12 17:54:01.119282 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 12 17:54:01.119354 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 12 17:54:01.122895 initrd-setup-root-after-ignition[1069]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 12 17:54:01.122895 initrd-setup-root-after-ignition[1069]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 12 17:54:01.123984 initrd-setup-root-after-ignition[1073]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 12 17:54:01.124584 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 12 17:54:01.124877 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 12 17:54:01.125474 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 12 17:54:01.148240 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 12 17:54:01.148315 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 12 17:54:01.148576 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 12 17:54:01.148827 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 12 17:54:01.149028 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 12 17:54:01.149512 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 12 17:54:01.168567 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 12 17:54:01.169294 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 12 17:54:01.183611 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 12 17:54:01.183784 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 17:54:01.184009 systemd[1]: Stopped target timers.target - Timer Units. Sep 12 17:54:01.184235 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 12 17:54:01.184301 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 12 17:54:01.184661 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 12 17:54:01.184821 systemd[1]: Stopped target basic.target - Basic System. Sep 12 17:54:01.185001 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 12 17:54:01.185194 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 12 17:54:01.185395 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 12 17:54:01.185606 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 12 17:54:01.185805 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 12 17:54:01.186004 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 12 17:54:01.186219 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 12 17:54:01.186427 systemd[1]: Stopped target local-fs.target - Local File Systems. 
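The files stage that finished above fetched the helm tarball and the kubernetes sysext image over HTTPS, logging each request as "GET …: attempt #1" followed by "GET result: OK". A rough sketch of that numbered-attempt retry pattern using only Python's standard library; the attempt count and delay are arbitrary illustration values, not anything taken from Ignition itself:

import time
import urllib.request

def fetch_with_retries(url: str, attempts: int = 3, delay: float = 2.0) -> bytes:
    # Try the download up to `attempts` times, printing log lines in the
    # same style as the files-stage entries above.
    last_error = None
    for attempt in range(1, attempts + 1):
        print(f"GET {url}: attempt #{attempt}")
        try:
            with urllib.request.urlopen(url, timeout=30) as resp:
                data = resp.read()
            print("GET result: OK")
            return data
        except OSError as err:  # URLError is a subclass of OSError
            last_error = err
            time.sleep(delay)
    raise RuntimeError(f"all {attempts} attempts failed") from last_error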
Sep 12 17:54:01.186609 systemd[1]: Stopped target swap.target - Swaps. Sep 12 17:54:01.186775 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 12 17:54:01.186835 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 12 17:54:01.187094 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 12 17:54:01.187325 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 17:54:01.187508 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 12 17:54:01.187549 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 17:54:01.187724 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 12 17:54:01.187781 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 12 17:54:01.188070 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 12 17:54:01.188138 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 12 17:54:01.188353 systemd[1]: Stopped target paths.target - Path Units. Sep 12 17:54:01.188485 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 12 17:54:01.192058 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 17:54:01.192232 systemd[1]: Stopped target slices.target - Slice Units. Sep 12 17:54:01.192444 systemd[1]: Stopped target sockets.target - Socket Units. Sep 12 17:54:01.192620 systemd[1]: iscsid.socket: Deactivated successfully. Sep 12 17:54:01.192668 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 12 17:54:01.192822 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 12 17:54:01.192867 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 12 17:54:01.193044 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 12 17:54:01.193112 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 12 17:54:01.193351 systemd[1]: ignition-files.service: Deactivated successfully. Sep 12 17:54:01.193410 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 12 17:54:01.194058 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 12 17:54:01.194168 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 12 17:54:01.194232 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 17:54:01.196112 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 12 17:54:01.196266 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 12 17:54:01.196334 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 17:54:01.196505 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 12 17:54:01.196566 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 12 17:54:01.198717 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 12 17:54:01.205081 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 12 17:54:01.212520 ignition[1093]: INFO : Ignition 2.21.0 Sep 12 17:54:01.212785 ignition[1093]: INFO : Stage: umount Sep 12 17:54:01.212989 ignition[1093]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 17:54:01.213123 ignition[1093]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Sep 12 17:54:01.213214 systemd[1]: sysroot-boot.mount: Deactivated successfully. 
Sep 12 17:54:01.214146 ignition[1093]: INFO : umount: umount passed Sep 12 17:54:01.214283 ignition[1093]: INFO : Ignition finished successfully Sep 12 17:54:01.214995 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 12 17:54:01.215179 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 12 17:54:01.215591 systemd[1]: Stopped target network.target - Network. Sep 12 17:54:01.215923 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 12 17:54:01.216096 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 12 17:54:01.216266 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 12 17:54:01.216288 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 12 17:54:01.216384 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 12 17:54:01.216405 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 12 17:54:01.216500 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 12 17:54:01.216521 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 12 17:54:01.216680 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 12 17:54:01.216804 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 12 17:54:01.218681 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 12 17:54:01.218869 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 12 17:54:01.220259 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 12 17:54:01.220528 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 12 17:54:01.220552 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 17:54:01.221441 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Sep 12 17:54:01.224940 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 12 17:54:01.225150 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 12 17:54:01.225896 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Sep 12 17:54:01.226004 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 12 17:54:01.226152 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 12 17:54:01.226174 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 12 17:54:01.227085 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 12 17:54:01.227185 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 12 17:54:01.227212 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 12 17:54:01.227845 systemd[1]: afterburn-network-kargs.service: Deactivated successfully. Sep 12 17:54:01.227868 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Sep 12 17:54:01.228027 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 12 17:54:01.228109 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 12 17:54:01.228359 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 12 17:54:01.228381 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 12 17:54:01.229117 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... 
Sep 12 17:54:01.229707 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Sep 12 17:54:01.237326 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 12 17:54:01.237580 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 17:54:01.238627 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 12 17:54:01.238880 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 12 17:54:01.239156 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 12 17:54:01.239175 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 17:54:01.239510 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 12 17:54:01.239533 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 12 17:54:01.239683 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 12 17:54:01.239706 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 12 17:54:01.239853 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 12 17:54:01.239876 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 12 17:54:01.241315 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 12 17:54:01.242142 systemd[1]: systemd-network-generator.service: Deactivated successfully. Sep 12 17:54:01.242171 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 17:54:01.242596 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 12 17:54:01.242621 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 17:54:01.243003 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 17:54:01.243166 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 17:54:01.243801 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 12 17:54:01.247201 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 12 17:54:01.250113 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 12 17:54:01.250164 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 12 17:54:01.275675 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 12 17:54:01.275741 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 12 17:54:01.276087 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 12 17:54:01.276200 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 12 17:54:01.276229 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 12 17:54:01.276767 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 12 17:54:01.284780 systemd[1]: Switching root. Sep 12 17:54:01.325954 systemd-journald[244]: Journal stopped Sep 12 17:54:02.410857 systemd-journald[244]: Received SIGTERM from PID 1 (systemd). 
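At this point the initrd journal receives SIGTERM and root is switched; entries like the ones above are normally still retrievable from the journal after boot. A small sketch, assuming Python and standard journalctl flags, of pulling just the Ignition units for the current boot; the unit names are the ones that appear in this log:

import subprocess

IGNITION_UNITS = [
    "ignition-setup.service",
    "ignition-fetch-offline.service",
    "ignition-kargs.service",
    "ignition-disks.service",
    "ignition-mount.service",
    "ignition-files.service",
]

def ignition_log() -> str:
    # journalctl accepts repeated -u flags; -b restricts output to the current boot.
    cmd = ["journalctl", "-b"]
    for unit in IGNITION_UNITS:
        cmd += ["-u", unit]
    return subprocess.run(cmd, capture_output=True, text=True, check=True).stdout

if __name__ == "__main__":
    print(ignition_log())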
Sep 12 17:54:02.410881 kernel: SELinux: policy capability network_peer_controls=1 Sep 12 17:54:02.410889 kernel: SELinux: policy capability open_perms=1 Sep 12 17:54:02.410895 kernel: SELinux: policy capability extended_socket_class=1 Sep 12 17:54:02.410900 kernel: SELinux: policy capability always_check_network=0 Sep 12 17:54:02.410907 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 12 17:54:02.410913 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 12 17:54:02.410918 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 12 17:54:02.410924 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 12 17:54:02.410929 kernel: SELinux: policy capability userspace_initial_context=0 Sep 12 17:54:02.410935 kernel: audit: type=1403 audit(1757699641.916:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 12 17:54:02.410941 systemd[1]: Successfully loaded SELinux policy in 46.450ms. Sep 12 17:54:02.410949 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 3.600ms. Sep 12 17:54:02.410956 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 12 17:54:02.410963 systemd[1]: Detected virtualization vmware. Sep 12 17:54:02.410969 systemd[1]: Detected architecture x86-64. Sep 12 17:54:02.410977 systemd[1]: Detected first boot. Sep 12 17:54:02.410984 systemd[1]: Initializing machine ID from random generator. Sep 12 17:54:02.410990 zram_generator::config[1139]: No configuration found. Sep 12 17:54:02.411086 kernel: vmw_vmci 0000:00:07.7: Using capabilities 0xc Sep 12 17:54:02.411097 kernel: Guest personality initialized and is active Sep 12 17:54:02.411103 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Sep 12 17:54:02.411108 kernel: Initialized host personality Sep 12 17:54:02.411116 kernel: NET: Registered PF_VSOCK protocol family Sep 12 17:54:02.411123 systemd[1]: Populated /etc with preset unit settings. Sep 12 17:54:02.411134 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Sep 12 17:54:02.411141 systemd[1]: COREOS_CUSTOM_PUBLIC_IPV4=$(ip addr show ens192 | grep -v "inet 10." | grep -Po "inet \K[\d.]+")" > ${OUTPUT}" Sep 12 17:54:02.411148 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Sep 12 17:54:02.411155 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 12 17:54:02.411161 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 12 17:54:02.411169 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 12 17:54:02.411176 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 12 17:54:02.411183 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 12 17:54:02.411189 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 12 17:54:02.411196 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 12 17:54:02.411202 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. 
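The escape-sequence warning above quotes the coreos-metadata.service drop-in, which extracts the machine's IPv4 addresses from `ip addr show ens192` and treats the 10.* address as private. A rough Python equivalent of that shell pipeline; the interface name, the 10.* heuristic, and the COREOS_CUSTOM_* variable names come from the quoted unit, while the exact `ip` flags and output handling here are assumptions for illustration:

import re
import subprocess

def ens192_addresses() -> dict:
    # `-o` gives one line per address; `-4` limits output to IPv4.
    out = subprocess.run(
        ["ip", "-o", "-4", "addr", "show", "ens192"],
        capture_output=True, text=True, check=True,
    ).stdout
    addrs = re.findall(r"inet (\d+\.\d+\.\d+\.\d+)", out)
    private = [a for a in addrs if a.startswith("10.")]
    public = [a for a in addrs if not a.startswith("10.")]
    return {
        "COREOS_CUSTOM_PRIVATE_IPV4": private[0] if private else "",
        "COREOS_CUSTOM_PUBLIC_IPV4": public[0] if public else "",
    }

if __name__ == "__main__":
    for key, value in ens192_addresses().items():
        print(f"{key}={value}")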
Sep 12 17:54:02.411209 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 12 17:54:02.411217 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 12 17:54:02.411223 systemd[1]: Created slice user.slice - User and Session Slice. Sep 12 17:54:02.411230 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 17:54:02.411238 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 17:54:02.411245 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Sep 12 17:54:02.411252 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 12 17:54:02.411258 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 12 17:54:02.411265 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 12 17:54:02.411273 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Sep 12 17:54:02.411280 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 17:54:02.411286 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 12 17:54:02.411293 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 12 17:54:02.411300 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 12 17:54:02.411306 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 12 17:54:02.411313 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 12 17:54:02.411320 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 17:54:02.411327 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 12 17:54:02.411334 systemd[1]: Reached target slices.target - Slice Units. Sep 12 17:54:02.411341 systemd[1]: Reached target swap.target - Swaps. Sep 12 17:54:02.411347 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 12 17:54:02.411354 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 12 17:54:02.411362 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Sep 12 17:54:02.411369 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 12 17:54:02.411376 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 12 17:54:02.411383 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 17:54:02.411390 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 12 17:54:02.411397 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 12 17:54:02.411404 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 12 17:54:02.411410 systemd[1]: Mounting media.mount - External Media Directory... Sep 12 17:54:02.411419 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 17:54:02.411426 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 12 17:54:02.411433 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 12 17:54:02.411439 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... 
Sep 12 17:54:02.411447 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 12 17:54:02.411453 systemd[1]: Reached target machines.target - Containers. Sep 12 17:54:02.411460 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Sep 12 17:54:02.411467 systemd[1]: Starting ignition-delete-config.service - Ignition (delete config)... Sep 12 17:54:02.411475 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 12 17:54:02.411482 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Sep 12 17:54:02.411489 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 17:54:02.411495 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 12 17:54:02.411502 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 17:54:02.411509 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 12 17:54:02.411516 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 17:54:02.411523 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 12 17:54:02.411531 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 12 17:54:02.411538 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 12 17:54:02.411545 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 12 17:54:02.411552 systemd[1]: Stopped systemd-fsck-usr.service. Sep 12 17:54:02.411559 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 17:54:02.411566 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 12 17:54:02.411573 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 12 17:54:02.411580 kernel: fuse: init (API version 7.41) Sep 12 17:54:02.411586 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 12 17:54:02.411595 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 12 17:54:02.411601 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Sep 12 17:54:02.411608 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 12 17:54:02.411615 systemd[1]: verity-setup.service: Deactivated successfully. Sep 12 17:54:02.411622 systemd[1]: Stopped verity-setup.service. Sep 12 17:54:02.411629 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 17:54:02.411635 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 12 17:54:02.411642 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 12 17:54:02.411650 systemd[1]: Mounted media.mount - External Media Directory. Sep 12 17:54:02.411657 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 12 17:54:02.411664 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. 
Sep 12 17:54:02.411671 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 12 17:54:02.411677 kernel: loop: module loaded Sep 12 17:54:02.411684 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 17:54:02.411690 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 12 17:54:02.411709 systemd-journald[1229]: Collecting audit messages is disabled. Sep 12 17:54:02.411727 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 12 17:54:02.411734 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 17:54:02.411741 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 17:54:02.411748 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 17:54:02.411755 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 17:54:02.411763 systemd[1]: modprobe@fuse.service: Deactivated successfully. Sep 12 17:54:02.411770 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 12 17:54:02.411777 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 17:54:02.411784 systemd-journald[1229]: Journal started Sep 12 17:54:02.411798 systemd-journald[1229]: Runtime Journal (/run/log/journal/601079776ab94c5e87cc41b0631d4018) is 4.8M, max 38.8M, 34M free. Sep 12 17:54:02.256809 systemd[1]: Queued start job for default target multi-user.target. Sep 12 17:54:02.272066 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Sep 12 17:54:02.413563 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 17:54:02.413584 systemd[1]: Started systemd-journald.service - Journal Service. Sep 12 17:54:02.272276 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 12 17:54:02.413846 jq[1209]: true Sep 12 17:54:02.415498 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 12 17:54:02.415822 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 12 17:54:02.419672 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Sep 12 17:54:02.425051 jq[1244]: true Sep 12 17:54:02.426471 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 17:54:02.431339 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 12 17:54:02.435774 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 12 17:54:02.440077 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Sep 12 17:54:02.440208 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 12 17:54:02.440226 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 12 17:54:02.440844 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Sep 12 17:54:02.444767 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 12 17:54:02.445672 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 17:54:02.451162 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 12 17:54:02.456534 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... 
Sep 12 17:54:02.458112 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 17:54:02.460733 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Sep 12 17:54:02.460857 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 17:54:02.464995 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 12 17:54:02.470352 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 12 17:54:02.483016 ignition[1261]: Ignition 2.21.0 Sep 12 17:54:02.483382 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 12 17:54:02.483619 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 12 17:54:02.483965 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Sep 12 17:54:02.489229 ignition[1261]: deleting config from guestinfo properties Sep 12 17:54:02.495183 systemd-journald[1229]: Time spent on flushing to /var/log/journal/601079776ab94c5e87cc41b0631d4018 is 45.003ms for 1758 entries. Sep 12 17:54:02.495183 systemd-journald[1229]: System Journal (/var/log/journal/601079776ab94c5e87cc41b0631d4018) is 8M, max 584.8M, 576.8M free. Sep 12 17:54:02.547162 systemd-journald[1229]: Received client request to flush runtime journal. Sep 12 17:54:02.547207 kernel: loop0: detected capacity change from 0 to 128016 Sep 12 17:54:02.547217 kernel: ACPI: bus type drm_connector registered Sep 12 17:54:02.498135 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 12 17:54:02.495455 ignition[1261]: Successfully deleted config Sep 12 17:54:02.500591 systemd[1]: Finished ignition-delete-config.service - Ignition (delete config). Sep 12 17:54:02.512699 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 12 17:54:02.512874 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 12 17:54:02.514149 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Sep 12 17:54:02.521423 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 12 17:54:02.521554 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 12 17:54:02.531205 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 12 17:54:02.549221 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 12 17:54:02.563870 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Sep 12 17:54:02.576703 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 12 17:54:02.586266 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 12 17:54:02.589171 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 12 17:54:02.593321 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 17:54:02.595049 kernel: loop1: detected capacity change from 0 to 111000 Sep 12 17:54:02.613901 systemd-tmpfiles[1307]: ACLs are not supported, ignoring. Sep 12 17:54:02.614139 systemd-tmpfiles[1307]: ACLs are not supported, ignoring. Sep 12 17:54:02.616936 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. 
Sep 12 17:54:02.639097 kernel: loop2: detected capacity change from 0 to 224512 Sep 12 17:54:02.671054 kernel: loop3: detected capacity change from 0 to 2960 Sep 12 17:54:02.698055 kernel: loop4: detected capacity change from 0 to 128016 Sep 12 17:54:02.716073 kernel: loop5: detected capacity change from 0 to 111000 Sep 12 17:54:02.741106 kernel: loop6: detected capacity change from 0 to 224512 Sep 12 17:54:02.764069 kernel: loop7: detected capacity change from 0 to 2960 Sep 12 17:54:02.776497 (sd-merge)[1314]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-vmware'. Sep 12 17:54:02.777671 (sd-merge)[1314]: Merged extensions into '/usr'. Sep 12 17:54:02.786866 systemd[1]: Reload requested from client PID 1286 ('systemd-sysext') (unit systemd-sysext.service)... Sep 12 17:54:02.786877 systemd[1]: Reloading... Sep 12 17:54:02.842061 zram_generator::config[1342]: No configuration found. Sep 12 17:54:02.953206 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Sep 12 17:54:03.001339 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 12 17:54:03.001690 systemd[1]: Reloading finished in 214 ms. Sep 12 17:54:03.019783 ldconfig[1277]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 12 17:54:03.040167 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 12 17:54:03.040504 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 12 17:54:03.049114 systemd[1]: Starting ensure-sysext.service... Sep 12 17:54:03.051106 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 12 17:54:03.055498 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 12 17:54:03.059021 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 17:54:03.061357 systemd[1]: Reload requested from client PID 1396 ('systemctl') (unit ensure-sysext.service)... Sep 12 17:54:03.061366 systemd[1]: Reloading... Sep 12 17:54:03.064534 systemd-tmpfiles[1397]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Sep 12 17:54:03.064703 systemd-tmpfiles[1397]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Sep 12 17:54:03.064940 systemd-tmpfiles[1397]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 12 17:54:03.065196 systemd-tmpfiles[1397]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 12 17:54:03.065861 systemd-tmpfiles[1397]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 12 17:54:03.066111 systemd-tmpfiles[1397]: ACLs are not supported, ignoring. Sep 12 17:54:03.066227 systemd-tmpfiles[1397]: ACLs are not supported, ignoring. Sep 12 17:54:03.068651 systemd-tmpfiles[1397]: Detected autofs mount point /boot during canonicalization of boot. Sep 12 17:54:03.068711 systemd-tmpfiles[1397]: Skipping /boot Sep 12 17:54:03.075487 systemd-tmpfiles[1397]: Detected autofs mount point /boot during canonicalization of boot. Sep 12 17:54:03.075494 systemd-tmpfiles[1397]: Skipping /boot Sep 12 17:54:03.092825 systemd-udevd[1400]: Using default interface naming scheme 'v255'. 
Sep 12 17:54:03.101093 zram_generator::config[1425]: No configuration found. Sep 12 17:54:03.234969 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Sep 12 17:54:03.264334 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Sep 12 17:54:03.264382 kernel: mousedev: PS/2 mouse device common for all mice Sep 12 17:54:03.287079 kernel: ACPI: button: Power Button [PWRF] Sep 12 17:54:03.293419 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Sep 12 17:54:03.293651 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Sep 12 17:54:03.294209 systemd[1]: Reloading finished in 232 ms. Sep 12 17:54:03.301002 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 17:54:03.304571 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 17:54:03.324152 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 12 17:54:03.325828 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 12 17:54:03.328247 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 12 17:54:03.331680 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 12 17:54:03.335368 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 12 17:54:03.337443 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 12 17:54:03.341221 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 12 17:54:03.343617 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 17:54:03.344443 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 17:54:03.345613 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 17:54:03.354381 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 17:54:03.354543 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 17:54:03.354607 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 17:54:03.354664 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 17:54:03.356811 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 17:54:03.356896 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 17:54:03.356946 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 17:54:03.359940 systemd[1]: Starting systemd-userdbd.service - User Database Manager... 
Sep 12 17:54:03.360176 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 17:54:03.361064 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 12 17:54:03.364313 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 17:54:03.365090 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 17:54:03.365697 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 17:54:03.367329 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 12 17:54:03.367565 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 17:54:03.367627 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 17:54:03.367697 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 17:54:03.367766 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 17:54:03.373485 systemd[1]: Finished ensure-sysext.service. Sep 12 17:54:03.376549 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Sep 12 17:54:03.377161 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 12 17:54:03.380486 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 17:54:03.380622 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 17:54:03.380869 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 17:54:03.380971 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 17:54:03.381362 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 17:54:03.385475 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 12 17:54:03.385590 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 12 17:54:03.386210 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 12 17:54:03.389387 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 12 17:54:03.408722 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 12 17:54:03.413442 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 12 17:54:03.416175 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 12 17:54:03.416287 augenrules[1564]: No rules Sep 12 17:54:03.416336 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 12 17:54:03.417113 systemd[1]: audit-rules.service: Deactivated successfully. Sep 12 17:54:03.417404 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 12 17:54:03.420045 kernel: piix4_smbus 0000:00:07.3: SMBus Host Controller not enabled! 
Sep 12 17:54:03.489157 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Sep 12 17:54:03.489318 systemd[1]: Reached target time-set.target - System Time Set. Sep 12 17:54:03.498834 systemd-resolved[1523]: Positive Trust Anchors: Sep 12 17:54:03.498842 systemd-resolved[1523]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 12 17:54:03.498865 systemd-resolved[1523]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 12 17:54:03.501935 systemd-networkd[1522]: lo: Link UP Sep 12 17:54:03.501937 systemd-networkd[1522]: lo: Gained carrier Sep 12 17:54:03.503986 systemd-resolved[1523]: Defaulting to hostname 'linux'. Sep 12 17:54:03.506784 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 12 17:54:03.506934 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 12 17:54:03.508079 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 17:54:03.508232 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 12 17:54:03.508348 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 12 17:54:03.508449 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Sep 12 17:54:03.508615 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 12 17:54:03.508742 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 12 17:54:03.508846 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 12 17:54:03.508943 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 12 17:54:03.508958 systemd[1]: Reached target paths.target - Path Units. Sep 12 17:54:03.509042 systemd[1]: Reached target timers.target - Timer Units. Sep 12 17:54:03.509285 systemd-networkd[1522]: Enumeration completed Sep 12 17:54:03.509582 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 12 17:54:03.509883 systemd-networkd[1522]: ens192: Configuring with /etc/systemd/network/00-vmware.network. Sep 12 17:54:03.510936 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 12 17:54:03.513048 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Sep 12 17:54:03.513168 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Sep 12 17:54:03.515079 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Sep 12 17:54:03.515267 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Sep 12 17:54:03.515375 systemd[1]: Reached target ssh-access.target - SSH Access Available. 
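systemd-networkd reports above that ens192 is configured from /etc/systemd/network/00-vmware.network, but the file's contents are not captured in this log. Purely as an illustration of what such a unit looks like (the Match key and the DHCP setting below are assumptions, not taken from the log; Flatcar's shipped file may differ), a minimal equivalent would be:

    # Hypothetical sketch of a 00-vmware.network-style unit bringing ens192 up via DHCP.
    cat <<'EOF' >/etc/systemd/network/00-vmware.network
    [Match]
    Name=ens192

    [Network]
    DHCP=yes
    EOF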
Sep 12 17:54:03.516052 systemd-networkd[1522]: ens192: Link UP Sep 12 17:54:03.516196 systemd-networkd[1522]: ens192: Gained carrier Sep 12 17:54:03.518953 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 12 17:54:03.519219 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 12 17:54:03.519658 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 12 17:54:03.519829 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 12 17:54:03.524440 systemd-timesyncd[1541]: Network configuration changed, trying to establish connection. Sep 12 17:54:03.524677 systemd[1]: Reached target network.target - Network. Sep 12 17:54:03.525091 systemd[1]: Reached target sockets.target - Socket Units. Sep 12 17:54:03.525187 systemd[1]: Reached target basic.target - Basic System. Sep 12 17:54:03.525306 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 12 17:54:03.525324 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 12 17:54:03.527090 systemd[1]: Starting containerd.service - containerd container runtime... Sep 12 17:54:03.528140 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 12 17:54:03.530141 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 12 17:54:03.535123 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 12 17:54:03.538013 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 12 17:54:03.538146 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 12 17:54:03.540654 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Sep 12 17:54:03.545996 (udev-worker)[1431]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte. Sep 12 17:54:03.548639 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 12 17:54:03.550170 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 12 17:54:03.554411 jq[1596]: false Sep 12 17:54:03.554616 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 12 17:54:03.557242 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 12 17:54:03.562114 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 12 17:54:03.563865 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Sep 12 17:54:03.567214 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 12 17:54:03.568538 google_oslogin_nss_cache[1598]: oslogin_cache_refresh[1598]: Refreshing passwd entry cache Sep 12 17:54:03.568539 oslogin_cache_refresh[1598]: Refreshing passwd entry cache Sep 12 17:54:03.570817 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 17:54:03.571390 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Sep 12 17:54:03.572696 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 12 17:54:03.575344 systemd[1]: Starting update-engine.service - Update Engine... 
Sep 12 17:54:03.578436 google_oslogin_nss_cache[1598]: oslogin_cache_refresh[1598]: Failure getting users, quitting Sep 12 17:54:03.578436 google_oslogin_nss_cache[1598]: oslogin_cache_refresh[1598]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 12 17:54:03.578426 oslogin_cache_refresh[1598]: Failure getting users, quitting Sep 12 17:54:03.578509 google_oslogin_nss_cache[1598]: oslogin_cache_refresh[1598]: Refreshing group entry cache Sep 12 17:54:03.578437 oslogin_cache_refresh[1598]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 12 17:54:03.578461 oslogin_cache_refresh[1598]: Refreshing group entry cache Sep 12 17:54:03.578750 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 12 17:54:03.582122 systemd[1]: Starting vgauthd.service - VGAuth Service for open-vm-tools... Sep 12 17:54:03.587089 google_oslogin_nss_cache[1598]: oslogin_cache_refresh[1598]: Failure getting groups, quitting Sep 12 17:54:03.587089 google_oslogin_nss_cache[1598]: oslogin_cache_refresh[1598]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 12 17:54:03.586133 oslogin_cache_refresh[1598]: Failure getting groups, quitting Sep 12 17:54:03.586139 oslogin_cache_refresh[1598]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 12 17:54:03.588422 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 12 17:54:03.588686 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 12 17:54:03.589082 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 12 17:54:03.589232 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Sep 12 17:54:03.589362 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Sep 12 17:54:03.591970 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 12 17:54:03.592124 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 12 17:54:03.597777 update_engine[1609]: I20250912 17:54:03.597738 1609 main.cc:92] Flatcar Update Engine starting Sep 12 17:54:03.600855 systemd[1]: motdgen.service: Deactivated successfully. Sep 12 17:54:03.601983 extend-filesystems[1597]: Found /dev/sda6 Sep 12 17:54:03.603905 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 12 17:54:03.607220 extend-filesystems[1597]: Found /dev/sda9 Sep 12 17:54:03.613995 extend-filesystems[1597]: Checking size of /dev/sda9 Sep 12 17:54:03.614573 jq[1613]: true Sep 12 17:54:03.627302 (ntainerd)[1635]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 12 17:54:03.626477 dbus-daemon[1594]: [system] SELinux support is enabled Sep 12 17:54:03.627605 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 12 17:54:03.628935 update_engine[1609]: I20250912 17:54:03.628803 1609 update_check_scheduler.cc:74] Next update check in 2m52s Sep 12 17:54:03.631119 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Sep 12 17:54:03.631620 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). 
Sep 12 17:54:03.631637 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 12 17:54:03.632477 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 12 17:54:03.632487 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 12 17:54:03.636403 systemd[1]: Started update-engine.service - Update Engine. Sep 12 17:54:03.639252 extend-filesystems[1597]: Old size kept for /dev/sda9 Sep 12 17:54:03.647530 jq[1638]: true Sep 12 17:54:03.648026 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 12 17:54:03.648775 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 12 17:54:03.649163 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 12 17:54:03.749905 systemd[1]: Started vgauthd.service - VGAuth Service for open-vm-tools. Sep 12 17:54:03.757049 tar[1620]: linux-amd64/LICENSE Sep 12 17:54:03.758903 tar[1620]: linux-amd64/helm Sep 12 17:54:03.761535 systemd[1]: Starting vmtoolsd.service - Service for virtual machines hosted on VMware... Sep 12 17:54:03.762250 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 17:54:03.804844 systemd-logind[1603]: Watching system buttons on /dev/input/event2 (Power Button) Sep 12 17:54:03.805875 systemd-logind[1603]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Sep 12 17:54:03.809101 systemd-logind[1603]: New seat seat0. Sep 12 17:54:03.811756 systemd[1]: Started systemd-logind.service - User Login Management. Sep 12 17:54:03.830692 containerd[1635]: time="2025-09-12T17:54:03Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 12 17:54:03.830825 containerd[1635]: time="2025-09-12T17:54:03.830779849Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 12 17:54:03.834975 systemd[1]: Started vmtoolsd.service - Service for virtual machines hosted on VMware. 
Sep 12 17:54:03.840019 containerd[1635]: time="2025-09-12T17:54:03.839998839Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="5.342µs" Sep 12 17:54:03.840019 containerd[1635]: time="2025-09-12T17:54:03.840016692Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 12 17:54:03.840158 containerd[1635]: time="2025-09-12T17:54:03.840027925Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 12 17:54:03.840158 containerd[1635]: time="2025-09-12T17:54:03.840119708Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 12 17:54:03.840158 containerd[1635]: time="2025-09-12T17:54:03.840129042Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 12 17:54:03.840216 containerd[1635]: time="2025-09-12T17:54:03.840165572Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 12 17:54:03.840231 containerd[1635]: time="2025-09-12T17:54:03.840214400Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 12 17:54:03.840231 containerd[1635]: time="2025-09-12T17:54:03.840221465Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 12 17:54:03.841934 containerd[1635]: time="2025-09-12T17:54:03.840349882Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 12 17:54:03.841934 containerd[1635]: time="2025-09-12T17:54:03.840360473Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 12 17:54:03.841934 containerd[1635]: time="2025-09-12T17:54:03.840367261Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 12 17:54:03.841934 containerd[1635]: time="2025-09-12T17:54:03.840371616Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 12 17:54:03.841934 containerd[1635]: time="2025-09-12T17:54:03.840413398Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 12 17:54:03.841934 containerd[1635]: time="2025-09-12T17:54:03.840520199Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 12 17:54:03.841934 containerd[1635]: time="2025-09-12T17:54:03.840535778Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 12 17:54:03.841934 containerd[1635]: time="2025-09-12T17:54:03.840541290Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 12 17:54:03.841934 containerd[1635]: time="2025-09-12T17:54:03.840557447Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 12 17:54:03.841934 containerd[1635]: 
time="2025-09-12T17:54:03.840667419Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 12 17:54:03.841934 containerd[1635]: time="2025-09-12T17:54:03.840695997Z" level=info msg="metadata content store policy set" policy=shared Sep 12 17:54:03.846434 unknown[1654]: Pref_Init: Using '/etc/vmware-tools/vgauth.conf' as preferences filepath Sep 12 17:54:03.849212 locksmithd[1644]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 12 17:54:03.856288 unknown[1654]: Core dump limit set to -1 Sep 12 17:54:03.866790 containerd[1635]: time="2025-09-12T17:54:03.866767270Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 12 17:54:03.867383 containerd[1635]: time="2025-09-12T17:54:03.867295379Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 12 17:54:03.867383 containerd[1635]: time="2025-09-12T17:54:03.867311883Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 12 17:54:03.867383 containerd[1635]: time="2025-09-12T17:54:03.867320966Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 12 17:54:03.867383 containerd[1635]: time="2025-09-12T17:54:03.867328544Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 12 17:54:03.867383 containerd[1635]: time="2025-09-12T17:54:03.867334744Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 12 17:54:03.867608 containerd[1635]: time="2025-09-12T17:54:03.867529752Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 12 17:54:03.867914 containerd[1635]: time="2025-09-12T17:54:03.867850970Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 12 17:54:03.867914 containerd[1635]: time="2025-09-12T17:54:03.867864201Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 12 17:54:03.867914 containerd[1635]: time="2025-09-12T17:54:03.867870813Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 12 17:54:03.867914 containerd[1635]: time="2025-09-12T17:54:03.867876454Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 12 17:54:03.867914 containerd[1635]: time="2025-09-12T17:54:03.867884618Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 12 17:54:03.868339 containerd[1635]: time="2025-09-12T17:54:03.868221130Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 12 17:54:03.868339 containerd[1635]: time="2025-09-12T17:54:03.868237414Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 12 17:54:03.868339 containerd[1635]: time="2025-09-12T17:54:03.868257377Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 12 17:54:03.868339 containerd[1635]: time="2025-09-12T17:54:03.868264941Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 12 17:54:03.868339 containerd[1635]: time="2025-09-12T17:54:03.868270900Z" level=info msg="loading plugin" 
id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 12 17:54:03.868339 containerd[1635]: time="2025-09-12T17:54:03.868277592Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 12 17:54:03.868339 containerd[1635]: time="2025-09-12T17:54:03.868285367Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 12 17:54:03.868339 containerd[1635]: time="2025-09-12T17:54:03.868291622Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 12 17:54:03.868339 containerd[1635]: time="2025-09-12T17:54:03.868297542Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 12 17:54:03.868339 containerd[1635]: time="2025-09-12T17:54:03.868303009Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 12 17:54:03.868339 containerd[1635]: time="2025-09-12T17:54:03.868308568Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 12 17:54:03.868673 containerd[1635]: time="2025-09-12T17:54:03.868663837Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 12 17:54:03.869312 containerd[1635]: time="2025-09-12T17:54:03.868730623Z" level=info msg="Start snapshots syncer" Sep 12 17:54:03.869312 containerd[1635]: time="2025-09-12T17:54:03.868750737Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 12 17:54:03.869312 containerd[1635]: time="2025-09-12T17:54:03.868904467Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 12 17:54:03.869412 containerd[1635]: time="2025-09-12T17:54:03.868933752Z" level=info msg="loading plugin" 
id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 12 17:54:03.871024 containerd[1635]: time="2025-09-12T17:54:03.871011283Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 12 17:54:03.871147 containerd[1635]: time="2025-09-12T17:54:03.871136873Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 12 17:54:03.871348 containerd[1635]: time="2025-09-12T17:54:03.871337272Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 12 17:54:03.871400 containerd[1635]: time="2025-09-12T17:54:03.871386906Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 12 17:54:03.871439 containerd[1635]: time="2025-09-12T17:54:03.871431803Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 12 17:54:03.871594 containerd[1635]: time="2025-09-12T17:54:03.871584384Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 12 17:54:03.871635 containerd[1635]: time="2025-09-12T17:54:03.871627502Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 12 17:54:03.871754 containerd[1635]: time="2025-09-12T17:54:03.871669312Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 12 17:54:03.871803 containerd[1635]: time="2025-09-12T17:54:03.871795122Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 12 17:54:03.871834 containerd[1635]: time="2025-09-12T17:54:03.871828382Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 12 17:54:03.872010 containerd[1635]: time="2025-09-12T17:54:03.871893226Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 12 17:54:03.872111 containerd[1635]: time="2025-09-12T17:54:03.872068844Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 12 17:54:03.872111 containerd[1635]: time="2025-09-12T17:54:03.872082810Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 12 17:54:03.872111 containerd[1635]: time="2025-09-12T17:54:03.872088368Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 12 17:54:03.872111 containerd[1635]: time="2025-09-12T17:54:03.872094173Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 12 17:54:03.872111 containerd[1635]: time="2025-09-12T17:54:03.872098643Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 12 17:54:03.872464 containerd[1635]: time="2025-09-12T17:54:03.872103594Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 12 17:54:03.872464 containerd[1635]: time="2025-09-12T17:54:03.872403875Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 12 17:54:03.872464 containerd[1635]: time="2025-09-12T17:54:03.872423118Z" level=info msg="runtime interface 
created" Sep 12 17:54:03.872464 containerd[1635]: time="2025-09-12T17:54:03.872432109Z" level=info msg="created NRI interface" Sep 12 17:54:03.872464 containerd[1635]: time="2025-09-12T17:54:03.872437516Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 12 17:54:03.872464 containerd[1635]: time="2025-09-12T17:54:03.872445108Z" level=info msg="Connect containerd service" Sep 12 17:54:03.872740 containerd[1635]: time="2025-09-12T17:54:03.872667261Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 12 17:54:03.875425 containerd[1635]: time="2025-09-12T17:54:03.875266370Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 12 17:54:03.880219 bash[1683]: Updated "/home/core/.ssh/authorized_keys" Sep 12 17:54:03.883561 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 12 17:54:03.884219 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Sep 12 17:54:03.918640 sshd_keygen[1631]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 12 17:54:03.955940 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 12 17:54:03.958159 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 12 17:54:03.973856 systemd[1]: issuegen.service: Deactivated successfully. Sep 12 17:54:03.973996 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 12 17:54:03.976157 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 12 17:54:04.004977 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 12 17:54:04.009254 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 12 17:54:04.010417 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Sep 12 17:54:04.010597 systemd[1]: Reached target getty.target - Login Prompts. Sep 12 17:54:04.056068 containerd[1635]: time="2025-09-12T17:54:04.055917031Z" level=info msg="Start subscribing containerd event" Sep 12 17:54:04.056068 containerd[1635]: time="2025-09-12T17:54:04.055954869Z" level=info msg="Start recovering state" Sep 12 17:54:04.056068 containerd[1635]: time="2025-09-12T17:54:04.056033575Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 12 17:54:04.056710 containerd[1635]: time="2025-09-12T17:54:04.056181566Z" level=info msg="Start event monitor" Sep 12 17:54:04.056710 containerd[1635]: time="2025-09-12T17:54:04.056192980Z" level=info msg="Start cni network conf syncer for default" Sep 12 17:54:04.056710 containerd[1635]: time="2025-09-12T17:54:04.056197489Z" level=info msg="Start streaming server" Sep 12 17:54:04.056710 containerd[1635]: time="2025-09-12T17:54:04.056203562Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 12 17:54:04.056710 containerd[1635]: time="2025-09-12T17:54:04.056198576Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 12 17:54:04.056710 containerd[1635]: time="2025-09-12T17:54:04.056210341Z" level=info msg="runtime interface starting up..." Sep 12 17:54:04.056710 containerd[1635]: time="2025-09-12T17:54:04.056225514Z" level=info msg="starting plugins..." 
Sep 12 17:54:04.056710 containerd[1635]: time="2025-09-12T17:54:04.056235778Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 12 17:54:04.056710 containerd[1635]: time="2025-09-12T17:54:04.056507773Z" level=info msg="containerd successfully booted in 0.226196s" Sep 12 17:54:04.056374 systemd[1]: Started containerd.service - containerd container runtime. Sep 12 17:54:04.094949 tar[1620]: linux-amd64/README.md Sep 12 17:54:04.106201 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 12 17:54:04.796321 systemd-networkd[1522]: ens192: Gained IPv6LL Sep 12 17:54:04.796698 systemd-timesyncd[1541]: Network configuration changed, trying to establish connection. Sep 12 17:54:04.797682 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 12 17:54:04.798506 systemd[1]: Reached target network-online.target - Network is Online. Sep 12 17:54:04.799795 systemd[1]: Starting coreos-metadata.service - VMware metadata agent... Sep 12 17:54:04.801338 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:54:04.803235 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 12 17:54:04.829345 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 12 17:54:04.837892 systemd[1]: coreos-metadata.service: Deactivated successfully. Sep 12 17:54:04.838034 systemd[1]: Finished coreos-metadata.service - VMware metadata agent. Sep 12 17:54:04.838695 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 12 17:54:06.069948 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:54:06.070307 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 12 17:54:06.070860 systemd[1]: Startup finished in 2.637s (kernel) + 5.313s (initrd) + 4.200s (userspace) = 12.150s. Sep 12 17:54:06.074636 (kubelet)[1800]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 17:54:06.100464 login[1711]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 12 17:54:06.102323 login[1712]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 12 17:54:06.110208 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 12 17:54:06.111151 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 12 17:54:06.114837 systemd-logind[1603]: New session 1 of user core. Sep 12 17:54:06.117713 systemd-logind[1603]: New session 2 of user core. Sep 12 17:54:06.125921 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 12 17:54:06.127532 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 12 17:54:06.139616 (systemd)[1807]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 12 17:54:06.141273 systemd-logind[1603]: New session c1 of user core. Sep 12 17:54:06.292403 systemd[1807]: Queued start job for default target default.target. Sep 12 17:54:06.298818 systemd[1807]: Created slice app.slice - User Application Slice. Sep 12 17:54:06.298837 systemd[1807]: Reached target paths.target - Paths. Sep 12 17:54:06.298910 systemd[1807]: Reached target timers.target - Timers. Sep 12 17:54:06.299593 systemd[1807]: Starting dbus.socket - D-Bus User Message Bus Socket... 
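Earlier, the containerd CRI plugin reported "no network config found in /etc/cni/net.d"; that is expected on a node that has not yet had a CNI add-on installed, and pod networking stays unavailable until a config appears there. As an illustration only (the file name, bridge name, and subnet below are placeholders, not taken from this log; a real cluster's CNI add-on installs its own file), a minimal bridge config that would satisfy the check is:

    # Illustration: minimal CNI conflist for /etc/cni/net.d with placeholder values.
    mkdir -p /etc/cni/net.d
    cat <<'EOF' >/etc/cni/net.d/10-bridge.conflist
    {
      "cniVersion": "1.0.0",
      "name": "bridge-net",
      "plugins": [
        {
          "type": "bridge",
          "bridge": "cni0",
          "isGateway": true,
          "ipMasq": true,
          "ipam": { "type": "host-local", "subnet": "10.88.0.0/16" }
        }
      ]
    }
    EOF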
Sep 12 17:54:06.310357 systemd[1807]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 12 17:54:06.310417 systemd[1807]: Reached target sockets.target - Sockets. Sep 12 17:54:06.310438 systemd[1807]: Reached target basic.target - Basic System. Sep 12 17:54:06.310461 systemd[1807]: Reached target default.target - Main User Target. Sep 12 17:54:06.310475 systemd[1807]: Startup finished in 165ms. Sep 12 17:54:06.310534 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 12 17:54:06.311678 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 12 17:54:06.312915 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 12 17:54:06.451455 systemd-timesyncd[1541]: Network configuration changed, trying to establish connection. Sep 12 17:54:06.582587 kubelet[1800]: E0912 17:54:06.582558 1800 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 17:54:06.583975 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 17:54:06.584078 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 17:54:06.584415 systemd[1]: kubelet.service: Consumed 634ms CPU time, 261.9M memory peak. Sep 12 17:54:16.834491 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 12 17:54:16.835964 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:54:17.186326 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:54:17.199335 (kubelet)[1852]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 17:54:17.292595 kubelet[1852]: E0912 17:54:17.292557 1852 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 17:54:17.295673 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 17:54:17.295858 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 17:54:17.296344 systemd[1]: kubelet.service: Consumed 110ms CPU time, 110.7M memory peak. Sep 12 17:54:27.546213 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 12 17:54:27.547648 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:54:27.861969 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 12 17:54:27.864370 (kubelet)[1866]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 17:54:27.906494 kubelet[1866]: E0912 17:54:27.906457 1866 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 17:54:27.907984 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 17:54:27.908159 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 17:54:27.908557 systemd[1]: kubelet.service: Consumed 87ms CPU time, 108.3M memory peak. Sep 12 17:54:33.980244 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 12 17:54:33.981446 systemd[1]: Started sshd@0-139.178.70.102:22-139.178.89.65:51202.service - OpenSSH per-connection server daemon (139.178.89.65:51202). Sep 12 17:54:34.077786 sshd[1874]: Accepted publickey for core from 139.178.89.65 port 51202 ssh2: RSA SHA256:wDZNpWVsZ98foqhScMScgrvmBt+VkbaXTrAF+eax8o0 Sep 12 17:54:34.078135 sshd-session[1874]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:54:34.081801 systemd-logind[1603]: New session 3 of user core. Sep 12 17:54:34.091234 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 12 17:54:34.144076 systemd[1]: Started sshd@1-139.178.70.102:22-139.178.89.65:51208.service - OpenSSH per-connection server daemon (139.178.89.65:51208). Sep 12 17:54:34.189413 sshd[1880]: Accepted publickey for core from 139.178.89.65 port 51208 ssh2: RSA SHA256:wDZNpWVsZ98foqhScMScgrvmBt+VkbaXTrAF+eax8o0 Sep 12 17:54:34.190833 sshd-session[1880]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:54:34.195093 systemd-logind[1603]: New session 4 of user core. Sep 12 17:54:34.198198 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 12 17:54:34.246663 sshd[1883]: Connection closed by 139.178.89.65 port 51208 Sep 12 17:54:34.248337 sshd-session[1880]: pam_unix(sshd:session): session closed for user core Sep 12 17:54:34.252335 systemd[1]: sshd@1-139.178.70.102:22-139.178.89.65:51208.service: Deactivated successfully. Sep 12 17:54:34.253317 systemd[1]: session-4.scope: Deactivated successfully. Sep 12 17:54:34.253839 systemd-logind[1603]: Session 4 logged out. Waiting for processes to exit. Sep 12 17:54:34.257243 systemd[1]: Started sshd@2-139.178.70.102:22-139.178.89.65:51218.service - OpenSSH per-connection server daemon (139.178.89.65:51218). Sep 12 17:54:34.258010 systemd-logind[1603]: Removed session 4. Sep 12 17:54:34.293252 sshd[1889]: Accepted publickey for core from 139.178.89.65 port 51218 ssh2: RSA SHA256:wDZNpWVsZ98foqhScMScgrvmBt+VkbaXTrAF+eax8o0 Sep 12 17:54:34.294139 sshd-session[1889]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:54:34.297580 systemd-logind[1603]: New session 5 of user core. Sep 12 17:54:34.308241 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 12 17:54:34.355710 sshd[1892]: Connection closed by 139.178.89.65 port 51218 Sep 12 17:54:34.356644 sshd-session[1889]: pam_unix(sshd:session): session closed for user core Sep 12 17:54:34.365698 systemd[1]: sshd@2-139.178.70.102:22-139.178.89.65:51218.service: Deactivated successfully. 
Sep 12 17:54:34.366945 systemd[1]: session-5.scope: Deactivated successfully. Sep 12 17:54:34.367540 systemd-logind[1603]: Session 5 logged out. Waiting for processes to exit. Sep 12 17:54:34.371283 systemd[1]: Started sshd@3-139.178.70.102:22-139.178.89.65:51232.service - OpenSSH per-connection server daemon (139.178.89.65:51232). Sep 12 17:54:34.372028 systemd-logind[1603]: Removed session 5. Sep 12 17:54:34.410852 sshd[1898]: Accepted publickey for core from 139.178.89.65 port 51232 ssh2: RSA SHA256:wDZNpWVsZ98foqhScMScgrvmBt+VkbaXTrAF+eax8o0 Sep 12 17:54:34.411668 sshd-session[1898]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:54:34.414576 systemd-logind[1603]: New session 6 of user core. Sep 12 17:54:34.426203 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 12 17:54:34.474287 sshd[1901]: Connection closed by 139.178.89.65 port 51232 Sep 12 17:54:34.474657 sshd-session[1898]: pam_unix(sshd:session): session closed for user core Sep 12 17:54:34.490417 systemd[1]: sshd@3-139.178.70.102:22-139.178.89.65:51232.service: Deactivated successfully. Sep 12 17:54:34.491669 systemd[1]: session-6.scope: Deactivated successfully. Sep 12 17:54:34.492313 systemd-logind[1603]: Session 6 logged out. Waiting for processes to exit. Sep 12 17:54:34.493714 systemd[1]: Started sshd@4-139.178.70.102:22-139.178.89.65:51242.service - OpenSSH per-connection server daemon (139.178.89.65:51242). Sep 12 17:54:34.495464 systemd-logind[1603]: Removed session 6. Sep 12 17:54:34.531389 sshd[1907]: Accepted publickey for core from 139.178.89.65 port 51242 ssh2: RSA SHA256:wDZNpWVsZ98foqhScMScgrvmBt+VkbaXTrAF+eax8o0 Sep 12 17:54:34.532180 sshd-session[1907]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:54:34.534647 systemd-logind[1603]: New session 7 of user core. Sep 12 17:54:34.549346 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 12 17:54:34.650339 sudo[1911]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 12 17:54:34.650526 sudo[1911]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 17:54:34.665422 sudo[1911]: pam_unix(sudo:session): session closed for user root Sep 12 17:54:34.666351 sshd[1910]: Connection closed by 139.178.89.65 port 51242 Sep 12 17:54:34.666794 sshd-session[1907]: pam_unix(sshd:session): session closed for user core Sep 12 17:54:34.675326 systemd[1]: sshd@4-139.178.70.102:22-139.178.89.65:51242.service: Deactivated successfully. Sep 12 17:54:34.676300 systemd[1]: session-7.scope: Deactivated successfully. Sep 12 17:54:34.677183 systemd-logind[1603]: Session 7 logged out. Waiting for processes to exit. Sep 12 17:54:34.678678 systemd[1]: Started sshd@5-139.178.70.102:22-139.178.89.65:51246.service - OpenSSH per-connection server daemon (139.178.89.65:51246). Sep 12 17:54:34.680328 systemd-logind[1603]: Removed session 7. Sep 12 17:54:34.722495 sshd[1917]: Accepted publickey for core from 139.178.89.65 port 51246 ssh2: RSA SHA256:wDZNpWVsZ98foqhScMScgrvmBt+VkbaXTrAF+eax8o0 Sep 12 17:54:34.723436 sshd-session[1917]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:54:34.726568 systemd-logind[1603]: New session 8 of user core. Sep 12 17:54:34.735144 systemd[1]: Started session-8.scope - Session 8 of User core. 
Sep 12 17:54:34.784717 sudo[1922]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 12 17:54:34.784920 sudo[1922]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 17:54:34.792118 sudo[1922]: pam_unix(sudo:session): session closed for user root Sep 12 17:54:34.795454 sudo[1921]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 12 17:54:34.795783 sudo[1921]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 17:54:34.802344 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 12 17:54:34.825773 augenrules[1944]: No rules Sep 12 17:54:34.826433 systemd[1]: audit-rules.service: Deactivated successfully. Sep 12 17:54:34.826699 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 12 17:54:34.828141 sudo[1921]: pam_unix(sudo:session): session closed for user root Sep 12 17:54:34.828907 sshd[1920]: Connection closed by 139.178.89.65 port 51246 Sep 12 17:54:34.829188 sshd-session[1917]: pam_unix(sshd:session): session closed for user core Sep 12 17:54:34.843467 systemd[1]: sshd@5-139.178.70.102:22-139.178.89.65:51246.service: Deactivated successfully. Sep 12 17:54:34.844946 systemd[1]: session-8.scope: Deactivated successfully. Sep 12 17:54:34.845641 systemd-logind[1603]: Session 8 logged out. Waiting for processes to exit. Sep 12 17:54:34.846860 systemd-logind[1603]: Removed session 8. Sep 12 17:54:34.847950 systemd[1]: Started sshd@6-139.178.70.102:22-139.178.89.65:51262.service - OpenSSH per-connection server daemon (139.178.89.65:51262). Sep 12 17:54:34.888771 sshd[1953]: Accepted publickey for core from 139.178.89.65 port 51262 ssh2: RSA SHA256:wDZNpWVsZ98foqhScMScgrvmBt+VkbaXTrAF+eax8o0 Sep 12 17:54:34.889653 sshd-session[1953]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:54:34.893719 systemd-logind[1603]: New session 9 of user core. Sep 12 17:54:34.900206 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 12 17:54:34.949636 sudo[1957]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 12 17:54:34.949805 sudo[1957]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 17:54:35.432090 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 12 17:54:35.444460 (dockerd)[1974]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 12 17:54:35.848099 dockerd[1974]: time="2025-09-12T17:54:35.848012699Z" level=info msg="Starting up" Sep 12 17:54:35.849751 dockerd[1974]: time="2025-09-12T17:54:35.849741011Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 12 17:54:35.855648 dockerd[1974]: time="2025-09-12T17:54:35.855623840Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Sep 12 17:54:35.954863 dockerd[1974]: time="2025-09-12T17:54:35.954829313Z" level=info msg="Loading containers: start." Sep 12 17:54:35.966062 kernel: Initializing XFRM netlink socket Sep 12 17:54:36.211835 systemd-timesyncd[1541]: Network configuration changed, trying to establish connection. 
Sep 12 17:54:36.244975 systemd-networkd[1522]: docker0: Link UP Sep 12 17:54:36.280491 dockerd[1974]: time="2025-09-12T17:54:36.280462545Z" level=info msg="Loading containers: done." Sep 12 17:54:36.289820 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2810704757-merged.mount: Deactivated successfully. Sep 12 17:54:36.315697 dockerd[1974]: time="2025-09-12T17:54:36.315657875Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 12 17:54:36.315786 dockerd[1974]: time="2025-09-12T17:54:36.315730176Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Sep 12 17:54:36.315805 dockerd[1974]: time="2025-09-12T17:54:36.315791750Z" level=info msg="Initializing buildkit" Sep 12 17:54:36.351863 dockerd[1974]: time="2025-09-12T17:54:36.351830507Z" level=info msg="Completed buildkit initialization" Sep 12 17:54:36.357504 dockerd[1974]: time="2025-09-12T17:54:36.357464967Z" level=info msg="Daemon has completed initialization" Sep 12 17:54:36.358243 dockerd[1974]: time="2025-09-12T17:54:36.357547274Z" level=info msg="API listen on /run/docker.sock" Sep 12 17:54:36.358357 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 12 17:56:09.854666 systemd-resolved[1523]: Clock change detected. Flushing caches. Sep 12 17:56:09.855798 systemd-timesyncd[1541]: Contacted time server 207.244.103.95:123 (2.flatcar.pool.ntp.org). Sep 12 17:56:09.855952 systemd-timesyncd[1541]: Initial clock synchronization to Fri 2025-09-12 17:56:09.854513 UTC. Sep 12 17:56:11.517457 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Sep 12 17:56:11.518596 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:56:11.934492 containerd[1635]: time="2025-09-12T17:56:11.934388696Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.9\"" Sep 12 17:56:12.155911 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:56:12.160421 (kubelet)[2194]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 17:56:12.220668 kubelet[2194]: E0912 17:56:12.220600 2194 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 17:56:12.222324 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 17:56:12.222422 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 17:56:12.222643 systemd[1]: kubelet.service: Consumed 112ms CPU time, 111.1M memory peak. Sep 12 17:56:13.182229 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2348409255.mount: Deactivated successfully. 
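The jump from 17:54:36 to 17:56:09 above is not a gap in logging: systemd-timesyncd reached 2.flatcar.pool.ntp.org (207.244.103.95) and stepped the system clock forward by roughly 93 seconds, and systemd-resolved flushed its caches in response, so all later entries carry the corrected time. As a hedged aside (exact output fields vary by systemd version), the sync state could be inspected with the standard systemd tooling:

    # Inspect NTP synchronization after the step; both verbs belong to timedatectl.
    timedatectl status            # reports "System clock synchronized: yes" once the step has happened
    timedatectl timesync-status   # shows the contacted server and the measured offset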
Sep 12 17:56:14.665115 containerd[1635]: time="2025-09-12T17:56:14.665079297Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:56:14.669342 containerd[1635]: time="2025-09-12T17:56:14.669317585Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.9: active requests=0, bytes read=28837916" Sep 12 17:56:14.673993 containerd[1635]: time="2025-09-12T17:56:14.673967667Z" level=info msg="ImageCreate event name:\"sha256:abd2b525baf428ffb8b8b7d1e09761dc5cdb7ed0c7896a9427e29e84f8eafc59\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:56:14.678556 containerd[1635]: time="2025-09-12T17:56:14.678530793Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:6df11cc2ad9679b1117be34d3a0230add88bc0a08fd7a3ebc26b680575e8de97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:56:14.679418 containerd[1635]: time="2025-09-12T17:56:14.679269915Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.9\" with image id \"sha256:abd2b525baf428ffb8b8b7d1e09761dc5cdb7ed0c7896a9427e29e84f8eafc59\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.9\", repo digest \"registry.k8s.io/kube-apiserver@sha256:6df11cc2ad9679b1117be34d3a0230add88bc0a08fd7a3ebc26b680575e8de97\", size \"28834515\" in 2.744849845s" Sep 12 17:56:14.679418 containerd[1635]: time="2025-09-12T17:56:14.679310833Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.9\" returns image reference \"sha256:abd2b525baf428ffb8b8b7d1e09761dc5cdb7ed0c7896a9427e29e84f8eafc59\"" Sep 12 17:56:14.679927 containerd[1635]: time="2025-09-12T17:56:14.679808140Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.9\"" Sep 12 17:56:16.967043 containerd[1635]: time="2025-09-12T17:56:16.966584590Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:56:16.971326 containerd[1635]: time="2025-09-12T17:56:16.971292245Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.9: active requests=0, bytes read=24787027" Sep 12 17:56:16.986742 containerd[1635]: time="2025-09-12T17:56:16.986679320Z" level=info msg="ImageCreate event name:\"sha256:0debe32fbb7223500fcf8c312f2a568a5abd3ed9274d8ec6780cfb30b8861e91\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:56:16.995340 containerd[1635]: time="2025-09-12T17:56:16.995291547Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:243c4b8e3bce271fcb1b78008ab996ab6976b1a20096deac08338fcd17979922\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:56:16.995908 containerd[1635]: time="2025-09-12T17:56:16.995839228Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.9\" with image id \"sha256:0debe32fbb7223500fcf8c312f2a568a5abd3ed9274d8ec6780cfb30b8861e91\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.9\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:243c4b8e3bce271fcb1b78008ab996ab6976b1a20096deac08338fcd17979922\", size \"26421706\" in 2.316009892s" Sep 12 17:56:16.995908 containerd[1635]: time="2025-09-12T17:56:16.995857609Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.9\" returns image reference \"sha256:0debe32fbb7223500fcf8c312f2a568a5abd3ed9274d8ec6780cfb30b8861e91\"" Sep 12 17:56:16.996239 
containerd[1635]: time="2025-09-12T17:56:16.996221632Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.9\"" Sep 12 17:56:19.006850 containerd[1635]: time="2025-09-12T17:56:19.006198565Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:56:19.015992 containerd[1635]: time="2025-09-12T17:56:19.015965487Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.9: active requests=0, bytes read=19176289" Sep 12 17:56:19.024969 containerd[1635]: time="2025-09-12T17:56:19.024940839Z" level=info msg="ImageCreate event name:\"sha256:6934c23b154fcb9bf54ed5913782de746735a49f4daa4732285915050cd44ad5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:56:19.035936 containerd[1635]: time="2025-09-12T17:56:19.035907622Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:50c49520dbd0e8b4076b6a5c77d8014df09ea3d59a73e8bafd2678d51ebb92d5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:56:19.036354 containerd[1635]: time="2025-09-12T17:56:19.036340762Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.9\" with image id \"sha256:6934c23b154fcb9bf54ed5913782de746735a49f4daa4732285915050cd44ad5\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.9\", repo digest \"registry.k8s.io/kube-scheduler@sha256:50c49520dbd0e8b4076b6a5c77d8014df09ea3d59a73e8bafd2678d51ebb92d5\", size \"20810986\" in 2.039876863s" Sep 12 17:56:19.036407 containerd[1635]: time="2025-09-12T17:56:19.036399669Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.9\" returns image reference \"sha256:6934c23b154fcb9bf54ed5913782de746735a49f4daa4732285915050cd44ad5\"" Sep 12 17:56:19.036901 containerd[1635]: time="2025-09-12T17:56:19.036890369Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.9\"" Sep 12 17:56:20.317964 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3239381415.mount: Deactivated successfully. 
Sep 12 17:56:20.990458 containerd[1635]: time="2025-09-12T17:56:20.990406503Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:56:21.003412 containerd[1635]: time="2025-09-12T17:56:21.003377505Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.9: active requests=0, bytes read=30924206" Sep 12 17:56:21.015694 containerd[1635]: time="2025-09-12T17:56:21.015666483Z" level=info msg="ImageCreate event name:\"sha256:fa3fdca615a501743d8deb39729a96e731312aac8d96accec061d5265360332f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:56:21.021961 containerd[1635]: time="2025-09-12T17:56:21.021935977Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:886af02535dc34886e4618b902f8c140d89af57233a245621d29642224516064\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:56:21.022670 containerd[1635]: time="2025-09-12T17:56:21.022646052Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.9\" with image id \"sha256:fa3fdca615a501743d8deb39729a96e731312aac8d96accec061d5265360332f\", repo tag \"registry.k8s.io/kube-proxy:v1.32.9\", repo digest \"registry.k8s.io/kube-proxy@sha256:886af02535dc34886e4618b902f8c140d89af57233a245621d29642224516064\", size \"30923225\" in 1.985692698s" Sep 12 17:56:21.022701 containerd[1635]: time="2025-09-12T17:56:21.022673376Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.9\" returns image reference \"sha256:fa3fdca615a501743d8deb39729a96e731312aac8d96accec061d5265360332f\"" Sep 12 17:56:21.023110 containerd[1635]: time="2025-09-12T17:56:21.022950809Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Sep 12 17:56:21.852646 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount552168931.mount: Deactivated successfully. Sep 12 17:56:22.291475 update_engine[1609]: I20250912 17:56:22.291039 1609 update_attempter.cc:509] Updating boot flags... Sep 12 17:56:22.312685 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Sep 12 17:56:22.316129 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:56:22.750720 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:56:22.757214 (kubelet)[2355]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 17:56:22.784306 kubelet[2355]: E0912 17:56:22.784273 2355 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 17:56:22.786185 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 17:56:22.786273 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 17:56:22.787418 systemd[1]: kubelet.service: Consumed 98ms CPU time, 110.2M memory peak. 
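Every kubelet start in this log fails the same way: /var/lib/kubelet/config.yaml does not exist yet, so the service exits and systemd schedules the next restart. That file is normally written by kubeadm during init/join, so the restart loop is expected until the node is bootstrapped. Purely as an illustration of the file the loader is looking for (the values below are assumptions, not taken from this log), a minimal KubeletConfiguration would be:

    # Hypothetical minimal config; on a kubeadm-managed node this file is generated by
    # 'kubeadm init' / 'kubeadm join' rather than written by hand.
    mkdir -p /var/lib/kubelet
    cat <<'EOF' >/var/lib/kubelet/config.yaml
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    cgroupDriver: systemd          # matches SystemdCgroup=true in the containerd CRI config above
    containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
    EOF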
Sep 12 17:56:22.941403 containerd[1635]: time="2025-09-12T17:56:22.940840658Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:56:22.948358 containerd[1635]: time="2025-09-12T17:56:22.948324296Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241" Sep 12 17:56:22.958509 containerd[1635]: time="2025-09-12T17:56:22.958473071Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:56:22.971275 containerd[1635]: time="2025-09-12T17:56:22.971235209Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:56:22.971867 containerd[1635]: time="2025-09-12T17:56:22.971845930Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.948876373s" Sep 12 17:56:22.971934 containerd[1635]: time="2025-09-12T17:56:22.971922145Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Sep 12 17:56:22.972394 containerd[1635]: time="2025-09-12T17:56:22.972377374Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 12 17:56:23.615104 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount734400601.mount: Deactivated successfully. 
Sep 12 17:56:23.647132 containerd[1635]: time="2025-09-12T17:56:23.647079657Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 17:56:23.651772 containerd[1635]: time="2025-09-12T17:56:23.651726264Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Sep 12 17:56:23.656317 containerd[1635]: time="2025-09-12T17:56:23.656288098Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 17:56:23.661116 containerd[1635]: time="2025-09-12T17:56:23.661069278Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 17:56:23.661532 containerd[1635]: time="2025-09-12T17:56:23.661441926Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 689.046268ms" Sep 12 17:56:23.661532 containerd[1635]: time="2025-09-12T17:56:23.661461187Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Sep 12 17:56:23.661816 containerd[1635]: time="2025-09-12T17:56:23.661806662Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Sep 12 17:56:24.322966 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3133479621.mount: Deactivated successfully. 
Sep 12 17:56:27.209301 containerd[1635]: time="2025-09-12T17:56:27.208878684Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:56:27.209751 containerd[1635]: time="2025-09-12T17:56:27.209578171Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57682056" Sep 12 17:56:27.210323 containerd[1635]: time="2025-09-12T17:56:27.210063657Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:56:27.212019 containerd[1635]: time="2025-09-12T17:56:27.211992354Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:56:27.212928 containerd[1635]: time="2025-09-12T17:56:27.212908971Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 3.551035333s" Sep 12 17:56:27.212974 containerd[1635]: time="2025-09-12T17:56:27.212931943Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" Sep 12 17:56:29.326221 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:56:29.326335 systemd[1]: kubelet.service: Consumed 98ms CPU time, 110.2M memory peak. Sep 12 17:56:29.328544 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:56:29.350763 systemd[1]: Reload requested from client PID 2448 ('systemctl') (unit session-9.scope)... Sep 12 17:56:29.350845 systemd[1]: Reloading... Sep 12 17:56:29.418068 zram_generator::config[2491]: No configuration found. Sep 12 17:56:29.514397 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Sep 12 17:56:29.582908 systemd[1]: Reloading finished in 231 ms. Sep 12 17:56:29.609949 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Sep 12 17:56:29.610080 systemd[1]: kubelet.service: Failed with result 'signal'. Sep 12 17:56:29.610328 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:56:29.611971 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:56:30.347153 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:56:30.354235 (kubelet)[2559]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 17:56:30.455034 kubelet[2559]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 17:56:30.455034 kubelet[2559]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. 
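Aside (not part of the journal): the containerd "Pulled image" entries above log both the image size in bytes and the wall-clock pull duration, so the effective pull throughput can be read straight off the log. A minimal Python sketch, using only the figures copied from the entries above:

# Illustrative only: estimate effective pull throughput from the
# size/duration fields containerd logged for each "Pulled image" entry above.
pulls = {
    "registry.k8s.io/kube-scheduler:v1.32.9": (20810986, 2.039876863),
    "registry.k8s.io/kube-proxy:v1.32.9": (30923225, 1.985692698),
    "registry.k8s.io/coredns/coredns:v1.11.3": (18562039, 1.948876373),
    "registry.k8s.io/pause:3.10": (320368, 0.689046268),
    "registry.k8s.io/etcd:3.5.16-0": (57680541, 3.551035333),
}

for image, (size_bytes, seconds) in pulls.items():
    mib_per_s = size_bytes / seconds / (1024 * 1024)
    print(f"{image}: {mib_per_s:.1f} MiB/s")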
Sep 12 17:56:30.455034 kubelet[2559]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 17:56:30.455310 kubelet[2559]: I0912 17:56:30.455108 2559 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 12 17:56:30.834445 kubelet[2559]: I0912 17:56:30.834416 2559 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Sep 12 17:56:30.834445 kubelet[2559]: I0912 17:56:30.834439 2559 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 12 17:56:30.834622 kubelet[2559]: I0912 17:56:30.834608 2559 server.go:954] "Client rotation is on, will bootstrap in background" Sep 12 17:56:31.230188 kubelet[2559]: E0912 17:56:31.229614 2559 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://139.178.70.102:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:56:31.240102 kubelet[2559]: I0912 17:56:31.240077 2559 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 17:56:31.248209 kubelet[2559]: I0912 17:56:31.248117 2559 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 12 17:56:31.253825 kubelet[2559]: I0912 17:56:31.253654 2559 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 12 17:56:31.255250 kubelet[2559]: I0912 17:56:31.255224 2559 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 12 17:56:31.255373 kubelet[2559]: I0912 17:56:31.255246 2559 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 12 17:56:31.256829 kubelet[2559]: I0912 17:56:31.256811 2559 topology_manager.go:138] "Creating topology manager with none policy" Sep 12 17:56:31.256829 kubelet[2559]: I0912 17:56:31.256825 2559 container_manager_linux.go:304] "Creating device plugin manager" Sep 12 17:56:31.257777 kubelet[2559]: I0912 17:56:31.257759 2559 state_mem.go:36] "Initialized new in-memory state store" Sep 12 17:56:31.262729 kubelet[2559]: I0912 17:56:31.262449 2559 kubelet.go:446] "Attempting to sync node with API server" Sep 12 17:56:31.262729 kubelet[2559]: I0912 17:56:31.262473 2559 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 12 17:56:31.264431 kubelet[2559]: I0912 17:56:31.264408 2559 kubelet.go:352] "Adding apiserver pod source" Sep 12 17:56:31.264431 kubelet[2559]: I0912 17:56:31.264430 2559 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 12 17:56:31.267343 kubelet[2559]: W0912 17:56:31.267155 2559 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.102:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.102:6443: connect: connection refused Sep 12 17:56:31.267343 kubelet[2559]: E0912 17:56:31.267194 2559 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.102:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:56:31.267596 kubelet[2559]: W0912 17:56:31.267574 2559 
reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.102:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.102:6443: connect: connection refused Sep 12 17:56:31.267663 kubelet[2559]: E0912 17:56:31.267653 2559 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.102:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:56:31.268851 kubelet[2559]: I0912 17:56:31.268737 2559 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 12 17:56:31.271407 kubelet[2559]: I0912 17:56:31.271296 2559 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 12 17:56:31.272727 kubelet[2559]: W0912 17:56:31.271857 2559 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Sep 12 17:56:31.272727 kubelet[2559]: I0912 17:56:31.272556 2559 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 12 17:56:31.272727 kubelet[2559]: I0912 17:56:31.272582 2559 server.go:1287] "Started kubelet" Sep 12 17:56:31.276483 kubelet[2559]: I0912 17:56:31.276460 2559 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Sep 12 17:56:31.286612 kubelet[2559]: I0912 17:56:31.286555 2559 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 12 17:56:31.286814 kubelet[2559]: I0912 17:56:31.286800 2559 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 12 17:56:31.289282 kubelet[2559]: I0912 17:56:31.288609 2559 server.go:479] "Adding debug handlers to kubelet server" Sep 12 17:56:31.298899 kubelet[2559]: E0912 17:56:31.289861 2559 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.102:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.102:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.18649aa5c0946834 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-09-12 17:56:31.272568884 +0000 UTC m=+0.915784506,LastTimestamp:2025-09-12 17:56:31.272568884 +0000 UTC m=+0.915784506,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Sep 12 17:56:31.303605 kubelet[2559]: I0912 17:56:31.303589 2559 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 12 17:56:31.308073 kubelet[2559]: I0912 17:56:31.306156 2559 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 12 17:56:31.308160 kubelet[2559]: E0912 17:56:31.308145 2559 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 17:56:31.308238 kubelet[2559]: I0912 17:56:31.308230 2559 volume_manager.go:297] "Starting Kubelet Volume 
Manager" Sep 12 17:56:31.308395 kubelet[2559]: I0912 17:56:31.308388 2559 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 12 17:56:31.308468 kubelet[2559]: I0912 17:56:31.308462 2559 reconciler.go:26] "Reconciler: start to sync state" Sep 12 17:56:31.308839 kubelet[2559]: W0912 17:56:31.308810 2559 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.102:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.102:6443: connect: connection refused Sep 12 17:56:31.308895 kubelet[2559]: E0912 17:56:31.308885 2559 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.102:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:56:31.309100 kubelet[2559]: E0912 17:56:31.309087 2559 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.102:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.102:6443: connect: connection refused" interval="200ms" Sep 12 17:56:31.330561 kubelet[2559]: I0912 17:56:31.330542 2559 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 12 17:56:31.336382 kubelet[2559]: I0912 17:56:31.336360 2559 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 12 17:56:31.337939 kubelet[2559]: I0912 17:56:31.337924 2559 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 12 17:56:31.337939 kubelet[2559]: I0912 17:56:31.337938 2559 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 12 17:56:31.337993 kubelet[2559]: I0912 17:56:31.337949 2559 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
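Aside (not part of the journal): the "Creating Container Manager object based on Node Config" entry above embeds the kubelet's effective hard-eviction thresholds as a JSON blob. A small Python sketch that turns a trimmed copy of that array into readable form (illustrative only, not kubelet code):

import json

# Trimmed copy of the HardEvictionThresholds array from the nodeConfig
# JSON logged by container_manager_linux.go above (GracePeriod/MinReclaim omitted).
thresholds = json.loads("""
[
 {"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0}},
 {"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1}},
 {"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05}},
 {"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15}},
 {"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05}}
]
""")

for t in thresholds:
    v = t["Value"]
    limit = v["Quantity"] if v["Quantity"] else f'{v["Percentage"]:.0%}'
    print(f'evict when {t["Signal"]} < {limit}')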
Sep 12 17:56:31.337993 kubelet[2559]: I0912 17:56:31.337953 2559 kubelet.go:2382] "Starting kubelet main sync loop" Sep 12 17:56:31.337993 kubelet[2559]: E0912 17:56:31.337976 2559 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 17:56:31.338732 kubelet[2559]: W0912 17:56:31.338272 2559 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.102:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.102:6443: connect: connection refused Sep 12 17:56:31.338732 kubelet[2559]: E0912 17:56:31.338299 2559 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.102:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:56:31.340177 kubelet[2559]: I0912 17:56:31.340139 2559 factory.go:221] Registration of the containerd container factory successfully Sep 12 17:56:31.340223 kubelet[2559]: I0912 17:56:31.340216 2559 factory.go:221] Registration of the systemd container factory successfully Sep 12 17:56:31.344363 kubelet[2559]: E0912 17:56:31.344342 2559 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 12 17:56:31.348387 kubelet[2559]: I0912 17:56:31.348329 2559 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 12 17:56:31.348387 kubelet[2559]: I0912 17:56:31.348338 2559 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 12 17:56:31.348387 kubelet[2559]: I0912 17:56:31.348346 2559 state_mem.go:36] "Initialized new in-memory state store" Sep 12 17:56:31.352176 kubelet[2559]: I0912 17:56:31.352164 2559 policy_none.go:49] "None policy: Start" Sep 12 17:56:31.352176 kubelet[2559]: I0912 17:56:31.352176 2559 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 12 17:56:31.352237 kubelet[2559]: I0912 17:56:31.352189 2559 state_mem.go:35] "Initializing new in-memory state store" Sep 12 17:56:31.359500 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 12 17:56:31.369526 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 12 17:56:31.372282 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 12 17:56:31.381760 kubelet[2559]: I0912 17:56:31.381744 2559 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 12 17:56:31.381957 kubelet[2559]: I0912 17:56:31.381949 2559 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 17:56:31.382039 kubelet[2559]: I0912 17:56:31.382019 2559 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 17:56:31.382223 kubelet[2559]: I0912 17:56:31.382213 2559 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 17:56:31.382889 kubelet[2559]: E0912 17:56:31.382873 2559 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Sep 12 17:56:31.382993 kubelet[2559]: E0912 17:56:31.382959 2559 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Sep 12 17:56:31.444181 systemd[1]: Created slice kubepods-burstable-podbf8bbbbf2b1970570cb10c6a565ab7d2.slice - libcontainer container kubepods-burstable-podbf8bbbbf2b1970570cb10c6a565ab7d2.slice. Sep 12 17:56:31.452483 kubelet[2559]: E0912 17:56:31.452460 2559 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 17:56:31.455194 systemd[1]: Created slice kubepods-burstable-pod1403266a9792debaa127cd8df7a81c3c.slice - libcontainer container kubepods-burstable-pod1403266a9792debaa127cd8df7a81c3c.slice. Sep 12 17:56:31.456405 kubelet[2559]: E0912 17:56:31.456394 2559 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 17:56:31.465434 systemd[1]: Created slice kubepods-burstable-pod72a30db4fc25e4da65a3b99eba43be94.slice - libcontainer container kubepods-burstable-pod72a30db4fc25e4da65a3b99eba43be94.slice. Sep 12 17:56:31.466945 kubelet[2559]: E0912 17:56:31.466930 2559 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 17:56:31.484419 kubelet[2559]: I0912 17:56:31.484349 2559 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 12 17:56:31.485969 kubelet[2559]: E0912 17:56:31.485949 2559 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.102:6443/api/v1/nodes\": dial tcp 139.178.70.102:6443: connect: connection refused" node="localhost" Sep 12 17:56:31.509642 kubelet[2559]: E0912 17:56:31.509487 2559 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.102:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.102:6443: connect: connection refused" interval="400ms" Sep 12 17:56:31.509642 kubelet[2559]: I0912 17:56:31.509549 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 17:56:31.509642 kubelet[2559]: I0912 17:56:31.509562 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 17:56:31.509642 kubelet[2559]: I0912 17:56:31.509571 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 17:56:31.509642 kubelet[2559]: I0912 17:56:31.509581 2559 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/bf8bbbbf2b1970570cb10c6a565ab7d2-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"bf8bbbbf2b1970570cb10c6a565ab7d2\") " pod="kube-system/kube-apiserver-localhost" Sep 12 17:56:31.509813 kubelet[2559]: I0912 17:56:31.509590 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/bf8bbbbf2b1970570cb10c6a565ab7d2-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"bf8bbbbf2b1970570cb10c6a565ab7d2\") " pod="kube-system/kube-apiserver-localhost" Sep 12 17:56:31.509813 kubelet[2559]: I0912 17:56:31.509598 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/bf8bbbbf2b1970570cb10c6a565ab7d2-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"bf8bbbbf2b1970570cb10c6a565ab7d2\") " pod="kube-system/kube-apiserver-localhost" Sep 12 17:56:31.509813 kubelet[2559]: I0912 17:56:31.509607 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 17:56:31.610686 kubelet[2559]: I0912 17:56:31.610184 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/72a30db4fc25e4da65a3b99eba43be94-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"72a30db4fc25e4da65a3b99eba43be94\") " pod="kube-system/kube-scheduler-localhost" Sep 12 17:56:31.610686 kubelet[2559]: I0912 17:56:31.610358 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 17:56:31.687134 kubelet[2559]: I0912 17:56:31.687115 2559 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 12 17:56:31.687530 kubelet[2559]: E0912 17:56:31.687512 2559 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.102:6443/api/v1/nodes\": dial tcp 139.178.70.102:6443: connect: connection refused" node="localhost" Sep 12 17:56:31.755411 containerd[1635]: time="2025-09-12T17:56:31.755333295Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:bf8bbbbf2b1970570cb10c6a565ab7d2,Namespace:kube-system,Attempt:0,}" Sep 12 17:56:31.757207 containerd[1635]: time="2025-09-12T17:56:31.757175341Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:1403266a9792debaa127cd8df7a81c3c,Namespace:kube-system,Attempt:0,}" Sep 12 17:56:31.767875 containerd[1635]: time="2025-09-12T17:56:31.767833553Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:72a30db4fc25e4da65a3b99eba43be94,Namespace:kube-system,Attempt:0,}" Sep 12 17:56:31.909956 kubelet[2559]: E0912 17:56:31.909920 2559 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://139.178.70.102:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.102:6443: connect: connection refused" interval="800ms" Sep 12 17:56:32.089864 kubelet[2559]: I0912 17:56:32.089593 2559 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 12 17:56:32.089864 kubelet[2559]: E0912 17:56:32.089827 2559 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.102:6443/api/v1/nodes\": dial tcp 139.178.70.102:6443: connect: connection refused" node="localhost" Sep 12 17:56:32.149774 containerd[1635]: time="2025-09-12T17:56:32.148856254Z" level=info msg="connecting to shim 7f5db924d8afa965b4bb399e39f60023416d13680ec343abfa0fb37b3032f413" address="unix:///run/containerd/s/634be40166970b11aa03b9dcdb6cbe7cd8cc6f0250d07b698ee1d835cd78703e" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:56:32.150289 containerd[1635]: time="2025-09-12T17:56:32.150268256Z" level=info msg="connecting to shim 97c35b0f55100642e3336814d796ecf99bc711a6709d66958d2c0ef1f08f0552" address="unix:///run/containerd/s/e7dace45da01be371d1c0371113384ffce0315efb3a25f2d289f39fcc512e063" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:56:32.160730 kubelet[2559]: W0912 17:56:32.160695 2559 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.102:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.102:6443: connect: connection refused Sep 12 17:56:32.160787 kubelet[2559]: E0912 17:56:32.160736 2559 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.102:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:56:32.186546 containerd[1635]: time="2025-09-12T17:56:32.186194505Z" level=info msg="connecting to shim 9d7c264e06a9ac4845603c0bae797f8cac458348e2c22c9f6396a33b67901c6a" address="unix:///run/containerd/s/68af8a963d33b6b95feefba8eb56b17fe2d7e364aaa766c24418ea7766376a94" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:56:32.270184 systemd[1]: Started cri-containerd-7f5db924d8afa965b4bb399e39f60023416d13680ec343abfa0fb37b3032f413.scope - libcontainer container 7f5db924d8afa965b4bb399e39f60023416d13680ec343abfa0fb37b3032f413. Sep 12 17:56:32.271874 systemd[1]: Started cri-containerd-97c35b0f55100642e3336814d796ecf99bc711a6709d66958d2c0ef1f08f0552.scope - libcontainer container 97c35b0f55100642e3336814d796ecf99bc711a6709d66958d2c0ef1f08f0552. Sep 12 17:56:32.274508 systemd[1]: Started cri-containerd-9d7c264e06a9ac4845603c0bae797f8cac458348e2c22c9f6396a33b67901c6a.scope - libcontainer container 9d7c264e06a9ac4845603c0bae797f8cac458348e2c22c9f6396a33b67901c6a. 
Sep 12 17:56:32.305168 kubelet[2559]: W0912 17:56:32.305134 2559 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.102:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.102:6443: connect: connection refused Sep 12 17:56:32.305289 kubelet[2559]: E0912 17:56:32.305279 2559 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.102:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:56:32.314183 kubelet[2559]: W0912 17:56:32.314112 2559 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.102:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.102:6443: connect: connection refused Sep 12 17:56:32.314183 kubelet[2559]: E0912 17:56:32.314167 2559 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.102:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:56:32.331250 containerd[1635]: time="2025-09-12T17:56:32.331215307Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:bf8bbbbf2b1970570cb10c6a565ab7d2,Namespace:kube-system,Attempt:0,} returns sandbox id \"7f5db924d8afa965b4bb399e39f60023416d13680ec343abfa0fb37b3032f413\"" Sep 12 17:56:32.335338 containerd[1635]: time="2025-09-12T17:56:32.335306735Z" level=info msg="CreateContainer within sandbox \"7f5db924d8afa965b4bb399e39f60023416d13680ec343abfa0fb37b3032f413\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 12 17:56:32.342053 containerd[1635]: time="2025-09-12T17:56:32.341963875Z" level=info msg="Container 8487103e15f6f4f74ebcd3e51c97bc2132fa5f3cac1ad014e9bbed4f3b22ee38: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:56:32.351792 containerd[1635]: time="2025-09-12T17:56:32.351764144Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:72a30db4fc25e4da65a3b99eba43be94,Namespace:kube-system,Attempt:0,} returns sandbox id \"97c35b0f55100642e3336814d796ecf99bc711a6709d66958d2c0ef1f08f0552\"" Sep 12 17:56:32.355504 containerd[1635]: time="2025-09-12T17:56:32.355478888Z" level=info msg="CreateContainer within sandbox \"97c35b0f55100642e3336814d796ecf99bc711a6709d66958d2c0ef1f08f0552\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 12 17:56:32.358712 containerd[1635]: time="2025-09-12T17:56:32.358680419Z" level=info msg="Container 88da5ffd80b1aa914850d86b926c171d05f12c9c374e935984f4d57beb2bf3b2: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:56:32.367955 containerd[1635]: time="2025-09-12T17:56:32.367796764Z" level=info msg="CreateContainer within sandbox \"7f5db924d8afa965b4bb399e39f60023416d13680ec343abfa0fb37b3032f413\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"8487103e15f6f4f74ebcd3e51c97bc2132fa5f3cac1ad014e9bbed4f3b22ee38\"" Sep 12 17:56:32.368595 containerd[1635]: time="2025-09-12T17:56:32.368565418Z" level=info msg="StartContainer for \"8487103e15f6f4f74ebcd3e51c97bc2132fa5f3cac1ad014e9bbed4f3b22ee38\"" 
Sep 12 17:56:32.369977 containerd[1635]: time="2025-09-12T17:56:32.369959509Z" level=info msg="connecting to shim 8487103e15f6f4f74ebcd3e51c97bc2132fa5f3cac1ad014e9bbed4f3b22ee38" address="unix:///run/containerd/s/634be40166970b11aa03b9dcdb6cbe7cd8cc6f0250d07b698ee1d835cd78703e" protocol=ttrpc version=3 Sep 12 17:56:32.370232 containerd[1635]: time="2025-09-12T17:56:32.370211771Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:1403266a9792debaa127cd8df7a81c3c,Namespace:kube-system,Attempt:0,} returns sandbox id \"9d7c264e06a9ac4845603c0bae797f8cac458348e2c22c9f6396a33b67901c6a\"" Sep 12 17:56:32.371517 containerd[1635]: time="2025-09-12T17:56:32.371455890Z" level=info msg="CreateContainer within sandbox \"97c35b0f55100642e3336814d796ecf99bc711a6709d66958d2c0ef1f08f0552\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"88da5ffd80b1aa914850d86b926c171d05f12c9c374e935984f4d57beb2bf3b2\"" Sep 12 17:56:32.371752 containerd[1635]: time="2025-09-12T17:56:32.371701598Z" level=info msg="StartContainer for \"88da5ffd80b1aa914850d86b926c171d05f12c9c374e935984f4d57beb2bf3b2\"" Sep 12 17:56:32.372848 containerd[1635]: time="2025-09-12T17:56:32.372720676Z" level=info msg="connecting to shim 88da5ffd80b1aa914850d86b926c171d05f12c9c374e935984f4d57beb2bf3b2" address="unix:///run/containerd/s/e7dace45da01be371d1c0371113384ffce0315efb3a25f2d289f39fcc512e063" protocol=ttrpc version=3 Sep 12 17:56:32.373038 containerd[1635]: time="2025-09-12T17:56:32.373025923Z" level=info msg="CreateContainer within sandbox \"9d7c264e06a9ac4845603c0bae797f8cac458348e2c22c9f6396a33b67901c6a\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 12 17:56:32.382726 containerd[1635]: time="2025-09-12T17:56:32.382601248Z" level=info msg="Container f05d2018e86c3563fb79061232c8d65be07f2f59cb2409f91b684099c5c908a8: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:56:32.389187 systemd[1]: Started cri-containerd-8487103e15f6f4f74ebcd3e51c97bc2132fa5f3cac1ad014e9bbed4f3b22ee38.scope - libcontainer container 8487103e15f6f4f74ebcd3e51c97bc2132fa5f3cac1ad014e9bbed4f3b22ee38. Sep 12 17:56:32.390082 containerd[1635]: time="2025-09-12T17:56:32.390057969Z" level=info msg="CreateContainer within sandbox \"9d7c264e06a9ac4845603c0bae797f8cac458348e2c22c9f6396a33b67901c6a\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"f05d2018e86c3563fb79061232c8d65be07f2f59cb2409f91b684099c5c908a8\"" Sep 12 17:56:32.394594 containerd[1635]: time="2025-09-12T17:56:32.394065415Z" level=info msg="StartContainer for \"f05d2018e86c3563fb79061232c8d65be07f2f59cb2409f91b684099c5c908a8\"" Sep 12 17:56:32.396731 containerd[1635]: time="2025-09-12T17:56:32.396700662Z" level=info msg="connecting to shim f05d2018e86c3563fb79061232c8d65be07f2f59cb2409f91b684099c5c908a8" address="unix:///run/containerd/s/68af8a963d33b6b95feefba8eb56b17fe2d7e364aaa766c24418ea7766376a94" protocol=ttrpc version=3 Sep 12 17:56:32.398165 systemd[1]: Started cri-containerd-88da5ffd80b1aa914850d86b926c171d05f12c9c374e935984f4d57beb2bf3b2.scope - libcontainer container 88da5ffd80b1aa914850d86b926c171d05f12c9c374e935984f4d57beb2bf3b2. Sep 12 17:56:32.416148 systemd[1]: Started cri-containerd-f05d2018e86c3563fb79061232c8d65be07f2f59cb2409f91b684099c5c908a8.scope - libcontainer container f05d2018e86c3563fb79061232c8d65be07f2f59cb2409f91b684099c5c908a8. 
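Aside (not part of the journal): the "connecting to shim" entries above show each control-plane container being started through the same shim socket that containerd created for its pod sandbox (for example, container 8487103e… reuses the 634be401… socket of sandbox 7f5db924…, and the scheduler and controller-manager containers follow the same pattern). A short illustrative Python sketch that groups those entries by socket, using two lines copied from the log:

import re

# Illustrative log-analysis sketch (not containerd code): group
# "connecting to shim" messages by shim socket address.
lines = [
    'connecting to shim 7f5db924d8afa965b4bb399e39f60023416d13680ec343abfa0fb37b3032f413 address="unix:///run/containerd/s/634be40166970b11aa03b9dcdb6cbe7cd8cc6f0250d07b698ee1d835cd78703e"',
    'connecting to shim 8487103e15f6f4f74ebcd3e51c97bc2132fa5f3cac1ad014e9bbed4f3b22ee38 address="unix:///run/containerd/s/634be40166970b11aa03b9dcdb6cbe7cd8cc6f0250d07b698ee1d835cd78703e"',
]

shims = {}
for line in lines:
    m = re.search(r'connecting to shim (\S+) address="([^"]+)"', line)
    if m:
        shims.setdefault(m.group(2), []).append(m.group(1))

for address, ids in shims.items():
    print(address, "->", ids)  # sandbox and container share one shim socket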
Sep 12 17:56:32.457730 containerd[1635]: time="2025-09-12T17:56:32.457693282Z" level=info msg="StartContainer for \"8487103e15f6f4f74ebcd3e51c97bc2132fa5f3cac1ad014e9bbed4f3b22ee38\" returns successfully" Sep 12 17:56:32.474045 containerd[1635]: time="2025-09-12T17:56:32.473985925Z" level=info msg="StartContainer for \"88da5ffd80b1aa914850d86b926c171d05f12c9c374e935984f4d57beb2bf3b2\" returns successfully" Sep 12 17:56:32.489363 containerd[1635]: time="2025-09-12T17:56:32.489326732Z" level=info msg="StartContainer for \"f05d2018e86c3563fb79061232c8d65be07f2f59cb2409f91b684099c5c908a8\" returns successfully" Sep 12 17:56:32.710832 kubelet[2559]: E0912 17:56:32.710749 2559 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.102:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.102:6443: connect: connection refused" interval="1.6s" Sep 12 17:56:32.761458 kubelet[2559]: W0912 17:56:32.761400 2559 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.102:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.102:6443: connect: connection refused Sep 12 17:56:32.761458 kubelet[2559]: E0912 17:56:32.761442 2559 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.102:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:56:32.892258 kubelet[2559]: I0912 17:56:32.892110 2559 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 12 17:56:32.892402 kubelet[2559]: E0912 17:56:32.892388 2559 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.102:6443/api/v1/nodes\": dial tcp 139.178.70.102:6443: connect: connection refused" node="localhost" Sep 12 17:56:33.363162 kubelet[2559]: E0912 17:56:33.363143 2559 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 17:56:33.363788 kubelet[2559]: E0912 17:56:33.363751 2559 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 17:56:33.364836 kubelet[2559]: E0912 17:56:33.364824 2559 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 17:56:34.342297 kubelet[2559]: E0912 17:56:34.342265 2559 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Sep 12 17:56:34.373120 kubelet[2559]: E0912 17:56:34.371928 2559 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 17:56:34.385059 kubelet[2559]: E0912 17:56:34.385033 2559 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 17:56:34.494613 kubelet[2559]: I0912 17:56:34.494361 2559 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 12 17:56:34.584086 kubelet[2559]: I0912 
17:56:34.584063 2559 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Sep 12 17:56:34.584344 kubelet[2559]: E0912 17:56:34.584231 2559 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found" Sep 12 17:56:34.596814 kubelet[2559]: E0912 17:56:34.596730 2559 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 17:56:34.697241 kubelet[2559]: E0912 17:56:34.697213 2559 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 17:56:34.797468 kubelet[2559]: E0912 17:56:34.797433 2559 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 17:56:34.898096 kubelet[2559]: E0912 17:56:34.897998 2559 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 17:56:34.998731 kubelet[2559]: E0912 17:56:34.998705 2559 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 17:56:35.099321 kubelet[2559]: E0912 17:56:35.099294 2559 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 17:56:35.200288 kubelet[2559]: E0912 17:56:35.200206 2559 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 17:56:35.301276 kubelet[2559]: E0912 17:56:35.301252 2559 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 17:56:35.372639 kubelet[2559]: I0912 17:56:35.372496 2559 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 12 17:56:35.409837 kubelet[2559]: I0912 17:56:35.409625 2559 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 12 17:56:35.444938 kubelet[2559]: I0912 17:56:35.444891 2559 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 12 17:56:35.466388 kubelet[2559]: I0912 17:56:35.466313 2559 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 12 17:56:35.466673 kubelet[2559]: E0912 17:56:35.466492 2559 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Sep 12 17:56:36.271890 kubelet[2559]: I0912 17:56:36.271857 2559 apiserver.go:52] "Watching apiserver" Sep 12 17:56:36.309472 kubelet[2559]: I0912 17:56:36.309428 2559 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 12 17:56:36.623270 systemd[1]: Reload requested from client PID 2826 ('systemctl') (unit session-9.scope)... Sep 12 17:56:36.623279 systemd[1]: Reloading... Sep 12 17:56:36.687027 zram_generator::config[2873]: No configuration found. Sep 12 17:56:36.776575 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Sep 12 17:56:36.863882 systemd[1]: Reloading finished in 240 ms. Sep 12 17:56:36.888907 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:56:36.905705 systemd[1]: kubelet.service: Deactivated successfully. 
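Aside (not part of the journal): the kubelet's "Failed to ensure lease exists, will retry" entries above back off by doubling the retry interval while the API server refuses connections: 200ms, 400ms, 800ms, then 1.6s in this excerpt. A minimal Python sketch of that doubling pattern, illustrative only and not the kubelet's actual implementation:

# Doubling retry intervals, matching the interval= values logged above.
def backoff_intervals(initial_s: float, factor: float, attempts: int):
    interval = initial_s
    for _ in range(attempts):
        yield interval
        interval *= factor

print([f"{i:g}s" for i in backoff_intervals(0.2, 2.0, 4)])
# ['0.2s', '0.4s', '0.8s', '1.6s']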
Sep 12 17:56:36.905908 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:56:36.905949 systemd[1]: kubelet.service: Consumed 671ms CPU time, 128.6M memory peak. Sep 12 17:56:36.907697 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:56:37.715943 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:56:37.724532 (kubelet)[2937]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 17:56:37.771656 kubelet[2937]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 17:56:37.771656 kubelet[2937]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 12 17:56:37.771656 kubelet[2937]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 17:56:37.771987 kubelet[2937]: I0912 17:56:37.771717 2937 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 12 17:56:37.784932 kubelet[2937]: I0912 17:56:37.784881 2937 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Sep 12 17:56:37.784932 kubelet[2937]: I0912 17:56:37.784914 2937 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 12 17:56:37.785257 kubelet[2937]: I0912 17:56:37.785242 2937 server.go:954] "Client rotation is on, will bootstrap in background" Sep 12 17:56:37.786331 kubelet[2937]: I0912 17:56:37.786309 2937 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 12 17:56:37.793069 kubelet[2937]: I0912 17:56:37.793037 2937 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 17:56:37.798260 kubelet[2937]: I0912 17:56:37.797611 2937 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 12 17:56:37.801188 kubelet[2937]: I0912 17:56:37.800595 2937 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 12 17:56:37.804112 kubelet[2937]: I0912 17:56:37.804054 2937 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 12 17:56:37.804296 kubelet[2937]: I0912 17:56:37.804108 2937 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 12 17:56:37.804371 kubelet[2937]: I0912 17:56:37.804303 2937 topology_manager.go:138] "Creating topology manager with none policy" Sep 12 17:56:37.804371 kubelet[2937]: I0912 17:56:37.804309 2937 container_manager_linux.go:304] "Creating device plugin manager" Sep 12 17:56:37.804371 kubelet[2937]: I0912 17:56:37.804353 2937 state_mem.go:36] "Initialized new in-memory state store" Sep 12 17:56:37.805064 kubelet[2937]: I0912 17:56:37.804517 2937 kubelet.go:446] "Attempting to sync node with API server" Sep 12 17:56:37.805064 kubelet[2937]: I0912 17:56:37.804537 2937 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 12 17:56:37.805064 kubelet[2937]: I0912 17:56:37.804560 2937 kubelet.go:352] "Adding apiserver pod source" Sep 12 17:56:37.805064 kubelet[2937]: I0912 17:56:37.804578 2937 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 12 17:56:37.834197 kubelet[2937]: I0912 17:56:37.834085 2937 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 12 17:56:37.841166 kubelet[2937]: I0912 17:56:37.840642 2937 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 12 17:56:37.841297 kubelet[2937]: I0912 17:56:37.841286 2937 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 12 17:56:37.841369 kubelet[2937]: I0912 17:56:37.841363 2937 server.go:1287] "Started kubelet" Sep 12 17:56:37.844230 kubelet[2937]: I0912 17:56:37.843634 2937 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Sep 12 17:56:37.845499 kubelet[2937]: I0912 17:56:37.844750 2937 server.go:479] "Adding debug 
handlers to kubelet server" Sep 12 17:56:37.845671 kubelet[2937]: I0912 17:56:37.845623 2937 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 12 17:56:37.846787 kubelet[2937]: I0912 17:56:37.846763 2937 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 12 17:56:37.850255 kubelet[2937]: I0912 17:56:37.850217 2937 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 12 17:56:37.851415 kubelet[2937]: I0912 17:56:37.850988 2937 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 12 17:56:37.857456 kubelet[2937]: I0912 17:56:37.856749 2937 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 12 17:56:37.857456 kubelet[2937]: I0912 17:56:37.856837 2937 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 12 17:56:37.857456 kubelet[2937]: I0912 17:56:37.856925 2937 reconciler.go:26] "Reconciler: start to sync state" Sep 12 17:56:37.862747 kubelet[2937]: E0912 17:56:37.862715 2937 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 12 17:56:37.864105 kubelet[2937]: I0912 17:56:37.863816 2937 factory.go:221] Registration of the containerd container factory successfully Sep 12 17:56:37.864105 kubelet[2937]: I0912 17:56:37.863829 2937 factory.go:221] Registration of the systemd container factory successfully Sep 12 17:56:37.864105 kubelet[2937]: I0912 17:56:37.863904 2937 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 12 17:56:37.868719 kubelet[2937]: I0912 17:56:37.868248 2937 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 12 17:56:37.870256 kubelet[2937]: I0912 17:56:37.869924 2937 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 12 17:56:37.870256 kubelet[2937]: I0912 17:56:37.869946 2937 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 12 17:56:37.870256 kubelet[2937]: I0912 17:56:37.869959 2937 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Sep 12 17:56:37.870256 kubelet[2937]: I0912 17:56:37.869963 2937 kubelet.go:2382] "Starting kubelet main sync loop" Sep 12 17:56:37.870256 kubelet[2937]: E0912 17:56:37.869989 2937 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 17:56:37.922638 kubelet[2937]: I0912 17:56:37.922618 2937 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 12 17:56:37.922638 kubelet[2937]: I0912 17:56:37.922631 2937 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 12 17:56:37.922638 kubelet[2937]: I0912 17:56:37.922644 2937 state_mem.go:36] "Initialized new in-memory state store" Sep 12 17:56:37.922761 kubelet[2937]: I0912 17:56:37.922752 2937 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 12 17:56:37.922779 kubelet[2937]: I0912 17:56:37.922758 2937 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 12 17:56:37.922779 kubelet[2937]: I0912 17:56:37.922770 2937 policy_none.go:49] "None policy: Start" Sep 12 17:56:37.922779 kubelet[2937]: I0912 17:56:37.922776 2937 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 12 17:56:37.922824 kubelet[2937]: I0912 17:56:37.922782 2937 state_mem.go:35] "Initializing new in-memory state store" Sep 12 17:56:37.922859 kubelet[2937]: I0912 17:56:37.922847 2937 state_mem.go:75] "Updated machine memory state" Sep 12 17:56:37.925543 kubelet[2937]: I0912 17:56:37.925327 2937 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 12 17:56:37.925543 kubelet[2937]: I0912 17:56:37.925422 2937 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 17:56:37.925543 kubelet[2937]: I0912 17:56:37.925428 2937 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 17:56:37.926073 kubelet[2937]: I0912 17:56:37.925980 2937 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 17:56:37.928850 kubelet[2937]: E0912 17:56:37.928834 2937 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Sep 12 17:56:37.971345 kubelet[2937]: I0912 17:56:37.970479 2937 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 12 17:56:37.975759 kubelet[2937]: I0912 17:56:37.975609 2937 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 12 17:56:37.975759 kubelet[2937]: I0912 17:56:37.975644 2937 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 12 17:56:37.978134 kubelet[2937]: E0912 17:56:37.978087 2937 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Sep 12 17:56:37.980147 kubelet[2937]: E0912 17:56:37.979516 2937 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Sep 12 17:56:37.981030 kubelet[2937]: E0912 17:56:37.980743 2937 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost" Sep 12 17:56:38.032790 kubelet[2937]: I0912 17:56:38.031020 2937 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 12 17:56:38.036527 kubelet[2937]: I0912 17:56:38.036480 2937 kubelet_node_status.go:124] "Node was previously registered" node="localhost" Sep 12 17:56:38.036626 kubelet[2937]: I0912 17:56:38.036538 2937 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Sep 12 17:56:38.158615 kubelet[2937]: I0912 17:56:38.158590 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 17:56:38.158615 kubelet[2937]: I0912 17:56:38.158611 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/bf8bbbbf2b1970570cb10c6a565ab7d2-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"bf8bbbbf2b1970570cb10c6a565ab7d2\") " pod="kube-system/kube-apiserver-localhost" Sep 12 17:56:38.158920 kubelet[2937]: I0912 17:56:38.158625 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/bf8bbbbf2b1970570cb10c6a565ab7d2-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"bf8bbbbf2b1970570cb10c6a565ab7d2\") " pod="kube-system/kube-apiserver-localhost" Sep 12 17:56:38.158920 kubelet[2937]: I0912 17:56:38.158635 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/bf8bbbbf2b1970570cb10c6a565ab7d2-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"bf8bbbbf2b1970570cb10c6a565ab7d2\") " pod="kube-system/kube-apiserver-localhost" Sep 12 17:56:38.158920 kubelet[2937]: I0912 17:56:38.158645 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " 
pod="kube-system/kube-controller-manager-localhost" Sep 12 17:56:38.158920 kubelet[2937]: I0912 17:56:38.158654 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 17:56:38.158920 kubelet[2937]: I0912 17:56:38.158662 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 17:56:38.159297 kubelet[2937]: I0912 17:56:38.158671 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 17:56:38.159297 kubelet[2937]: I0912 17:56:38.158680 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/72a30db4fc25e4da65a3b99eba43be94-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"72a30db4fc25e4da65a3b99eba43be94\") " pod="kube-system/kube-scheduler-localhost" Sep 12 17:56:38.833728 kubelet[2937]: I0912 17:56:38.833667 2937 apiserver.go:52] "Watching apiserver" Sep 12 17:56:38.857836 kubelet[2937]: I0912 17:56:38.857807 2937 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 12 17:56:38.906670 kubelet[2937]: I0912 17:56:38.906527 2937 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 12 17:56:38.906769 kubelet[2937]: I0912 17:56:38.906763 2937 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 12 17:56:38.925848 kubelet[2937]: E0912 17:56:38.925829 2937 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Sep 12 17:56:38.927381 kubelet[2937]: E0912 17:56:38.927290 2937 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Sep 12 17:56:38.950682 kubelet[2937]: I0912 17:56:38.950648 2937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=3.950538319 podStartE2EDuration="3.950538319s" podCreationTimestamp="2025-09-12 17:56:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:56:38.938131912 +0000 UTC m=+1.209266264" watchObservedRunningTime="2025-09-12 17:56:38.950538319 +0000 UTC m=+1.221672671" Sep 12 17:56:38.982501 kubelet[2937]: I0912 17:56:38.982358 2937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=3.9823465479999998 podStartE2EDuration="3.982346548s" podCreationTimestamp="2025-09-12 17:56:35 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:56:38.950975603 +0000 UTC m=+1.222109949" watchObservedRunningTime="2025-09-12 17:56:38.982346548 +0000 UTC m=+1.253480893" Sep 12 17:56:38.982501 kubelet[2937]: I0912 17:56:38.982440 2937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=3.9824358909999997 podStartE2EDuration="3.982435891s" podCreationTimestamp="2025-09-12 17:56:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:56:38.982113957 +0000 UTC m=+1.253248302" watchObservedRunningTime="2025-09-12 17:56:38.982435891 +0000 UTC m=+1.253570238" Sep 12 17:56:43.257966 kubelet[2937]: I0912 17:56:43.257942 2937 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 12 17:56:43.258593 containerd[1635]: time="2025-09-12T17:56:43.258569289Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 12 17:56:43.259017 kubelet[2937]: I0912 17:56:43.258848 2937 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 12 17:56:43.836585 systemd[1]: Created slice kubepods-besteffort-pod2ea4c2ba_7d3a_4748_9b85_262b3490533b.slice - libcontainer container kubepods-besteffort-pod2ea4c2ba_7d3a_4748_9b85_262b3490533b.slice. Sep 12 17:56:43.896173 kubelet[2937]: I0912 17:56:43.896135 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/2ea4c2ba-7d3a-4748-9b85-262b3490533b-xtables-lock\") pod \"kube-proxy-97c5m\" (UID: \"2ea4c2ba-7d3a-4748-9b85-262b3490533b\") " pod="kube-system/kube-proxy-97c5m" Sep 12 17:56:43.896173 kubelet[2937]: I0912 17:56:43.896167 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/2ea4c2ba-7d3a-4748-9b85-262b3490533b-kube-proxy\") pod \"kube-proxy-97c5m\" (UID: \"2ea4c2ba-7d3a-4748-9b85-262b3490533b\") " pod="kube-system/kube-proxy-97c5m" Sep 12 17:56:43.896173 kubelet[2937]: I0912 17:56:43.896180 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2ea4c2ba-7d3a-4748-9b85-262b3490533b-lib-modules\") pod \"kube-proxy-97c5m\" (UID: \"2ea4c2ba-7d3a-4748-9b85-262b3490533b\") " pod="kube-system/kube-proxy-97c5m" Sep 12 17:56:43.896376 kubelet[2937]: I0912 17:56:43.896195 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vwfp\" (UniqueName: \"kubernetes.io/projected/2ea4c2ba-7d3a-4748-9b85-262b3490533b-kube-api-access-7vwfp\") pod \"kube-proxy-97c5m\" (UID: \"2ea4c2ba-7d3a-4748-9b85-262b3490533b\") " pod="kube-system/kube-proxy-97c5m" Sep 12 17:56:44.004349 kubelet[2937]: E0912 17:56:44.003658 2937 projected.go:288] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Sep 12 17:56:44.004349 kubelet[2937]: E0912 17:56:44.003687 2937 projected.go:194] Error preparing data for projected volume kube-api-access-7vwfp for pod kube-system/kube-proxy-97c5m: configmap "kube-root-ca.crt" not found Sep 12 17:56:44.004349 kubelet[2937]: E0912 17:56:44.003750 2937 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2ea4c2ba-7d3a-4748-9b85-262b3490533b-kube-api-access-7vwfp podName:2ea4c2ba-7d3a-4748-9b85-262b3490533b nodeName:}" failed. No retries permitted until 2025-09-12 17:56:44.503725144 +0000 UTC m=+6.774859489 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-7vwfp" (UniqueName: "kubernetes.io/projected/2ea4c2ba-7d3a-4748-9b85-262b3490533b-kube-api-access-7vwfp") pod "kube-proxy-97c5m" (UID: "2ea4c2ba-7d3a-4748-9b85-262b3490533b") : configmap "kube-root-ca.crt" not found Sep 12 17:56:44.354355 systemd[1]: Created slice kubepods-besteffort-pod4d91a988_8fe2_4723_8bd1_718a9c59dc36.slice - libcontainer container kubepods-besteffort-pod4d91a988_8fe2_4723_8bd1_718a9c59dc36.slice. Sep 12 17:56:44.401411 kubelet[2937]: I0912 17:56:44.401384 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/4d91a988-8fe2-4723-8bd1-718a9c59dc36-var-lib-calico\") pod \"tigera-operator-755d956888-66gq8\" (UID: \"4d91a988-8fe2-4723-8bd1-718a9c59dc36\") " pod="tigera-operator/tigera-operator-755d956888-66gq8" Sep 12 17:56:44.401784 kubelet[2937]: I0912 17:56:44.401741 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw2l5\" (UniqueName: \"kubernetes.io/projected/4d91a988-8fe2-4723-8bd1-718a9c59dc36-kube-api-access-cw2l5\") pod \"tigera-operator-755d956888-66gq8\" (UID: \"4d91a988-8fe2-4723-8bd1-718a9c59dc36\") " pod="tigera-operator/tigera-operator-755d956888-66gq8" Sep 12 17:56:44.659591 containerd[1635]: time="2025-09-12T17:56:44.659339104Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-66gq8,Uid:4d91a988-8fe2-4723-8bd1-718a9c59dc36,Namespace:tigera-operator,Attempt:0,}" Sep 12 17:56:44.678324 containerd[1635]: time="2025-09-12T17:56:44.678211501Z" level=info msg="connecting to shim 8b78e7d8a4bea63cdf67dd803d11a1b958a2e69397c59401440e756fe9b97590" address="unix:///run/containerd/s/a8d6d5d7085a5ce916454007f2e175004574eb8a6c2b3b27dd27124b25a93123" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:56:44.706250 systemd[1]: Started cri-containerd-8b78e7d8a4bea63cdf67dd803d11a1b958a2e69397c59401440e756fe9b97590.scope - libcontainer container 8b78e7d8a4bea63cdf67dd803d11a1b958a2e69397c59401440e756fe9b97590. 
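[Editor's note] The mount failure above is the usual early-boot race: the "kube-api-access-*" projected volume needs the per-namespace "kube-root-ca.crt" ConfigMap, which kube-controller-manager's root-ca-cert-publisher only creates once it is up, so kubelet backs off and retries (500ms here). A minimal client-go sketch for checking whether that ConfigMap has been published yet; the kubeconfig path is an assumption for illustration, not taken from this log:

```go
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Hypothetical admin kubeconfig location (kubeadm convention); adjust as needed.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/admin.conf")
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// The projected service-account volume bundles the cluster root CA from this
	// per-namespace ConfigMap; until it exists, mounts fail as in the log above.
	cm, err := cs.CoreV1().ConfigMaps("kube-system").Get(context.TODO(), "kube-root-ca.crt", metav1.GetOptions{})
	if err != nil {
		fmt.Println("kube-root-ca.crt not published yet:", err)
		return
	}
	fmt.Printf("kube-root-ca.crt present with %d key(s)\n", len(cm.Data))
}
```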
Sep 12 17:56:44.749315 containerd[1635]: time="2025-09-12T17:56:44.749267009Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-97c5m,Uid:2ea4c2ba-7d3a-4748-9b85-262b3490533b,Namespace:kube-system,Attempt:0,}" Sep 12 17:56:44.754242 containerd[1635]: time="2025-09-12T17:56:44.754172819Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-66gq8,Uid:4d91a988-8fe2-4723-8bd1-718a9c59dc36,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"8b78e7d8a4bea63cdf67dd803d11a1b958a2e69397c59401440e756fe9b97590\"" Sep 12 17:56:44.756126 containerd[1635]: time="2025-09-12T17:56:44.756053434Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 12 17:56:44.871596 containerd[1635]: time="2025-09-12T17:56:44.871508705Z" level=info msg="connecting to shim 7c6a99f9f8443521c75123ae40c2c405ec8c5a21ef8b2d19a8d96b7820f802c0" address="unix:///run/containerd/s/4827c288c40f24de1d5c3b383dcc5acc51d965cc856941adb59cad8d6b339598" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:56:44.894237 systemd[1]: Started cri-containerd-7c6a99f9f8443521c75123ae40c2c405ec8c5a21ef8b2d19a8d96b7820f802c0.scope - libcontainer container 7c6a99f9f8443521c75123ae40c2c405ec8c5a21ef8b2d19a8d96b7820f802c0. Sep 12 17:56:44.916679 containerd[1635]: time="2025-09-12T17:56:44.916292495Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-97c5m,Uid:2ea4c2ba-7d3a-4748-9b85-262b3490533b,Namespace:kube-system,Attempt:0,} returns sandbox id \"7c6a99f9f8443521c75123ae40c2c405ec8c5a21ef8b2d19a8d96b7820f802c0\"" Sep 12 17:56:44.919875 containerd[1635]: time="2025-09-12T17:56:44.919407590Z" level=info msg="CreateContainer within sandbox \"7c6a99f9f8443521c75123ae40c2c405ec8c5a21ef8b2d19a8d96b7820f802c0\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 12 17:56:44.926568 containerd[1635]: time="2025-09-12T17:56:44.926520318Z" level=info msg="Container b72a9f11cb9c78343ec0ecb074f68fd84817e156bb7897dd9359ceb25c60073a: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:56:44.930773 containerd[1635]: time="2025-09-12T17:56:44.930687317Z" level=info msg="CreateContainer within sandbox \"7c6a99f9f8443521c75123ae40c2c405ec8c5a21ef8b2d19a8d96b7820f802c0\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"b72a9f11cb9c78343ec0ecb074f68fd84817e156bb7897dd9359ceb25c60073a\"" Sep 12 17:56:44.931189 containerd[1635]: time="2025-09-12T17:56:44.931176255Z" level=info msg="StartContainer for \"b72a9f11cb9c78343ec0ecb074f68fd84817e156bb7897dd9359ceb25c60073a\"" Sep 12 17:56:44.933323 containerd[1635]: time="2025-09-12T17:56:44.933280889Z" level=info msg="connecting to shim b72a9f11cb9c78343ec0ecb074f68fd84817e156bb7897dd9359ceb25c60073a" address="unix:///run/containerd/s/4827c288c40f24de1d5c3b383dcc5acc51d965cc856941adb59cad8d6b339598" protocol=ttrpc version=3 Sep 12 17:56:44.954226 systemd[1]: Started cri-containerd-b72a9f11cb9c78343ec0ecb074f68fd84817e156bb7897dd9359ceb25c60073a.scope - libcontainer container b72a9f11cb9c78343ec0ecb074f68fd84817e156bb7897dd9359ceb25c60073a. 
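[Editor's note] The "connecting to shim … protocol=ttrpc" and CreateContainer/StartContainer pairs above are containerd's normal CRI flow: each sandbox and container runs behind a shim reached over a per-instance ttrpc socket under /run/containerd/s/. A rough sketch of the equivalent sequence with the containerd Go client, under the assumption of the default socket and the k8s.io namespace seen in the log; the image reference and container ID are illustrative, and kubelet itself drives this through CRI rather than this client API:

```go
package main

import (
	"context"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/cio"
	"github.com/containerd/containerd/namespaces"
	"github.com/containerd/containerd/oci"
)

func main() {
	// Socket path and namespace match what the CRI plugin uses in this log.
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	// Pull an image (the pause image is only an illustrative choice).
	image, err := client.Pull(ctx, "registry.k8s.io/pause:3.9", containerd.WithPullUnpack)
	if err != nil {
		log.Fatal(err)
	}

	// Create the container, then a task (the process behind a shim),
	// mirroring the CreateContainer/StartContainer pair in the log.
	container, err := client.NewContainer(ctx, "demo",
		containerd.WithImage(image),
		containerd.WithNewSnapshot("demo-snapshot", image),
		containerd.WithNewSpec(oci.WithImageConfig(image)),
	)
	if err != nil {
		log.Fatal(err)
	}
	defer container.Delete(ctx, containerd.WithSnapshotCleanup)

	task, err := container.NewTask(ctx, cio.NewCreator(cio.WithStdio))
	if err != nil {
		log.Fatal(err)
	}
	defer task.Delete(ctx)

	if err := task.Start(ctx); err != nil {
		log.Fatal(err)
	}
	log.Println("task started with pid", task.Pid())
}
```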
Sep 12 17:56:44.999063 containerd[1635]: time="2025-09-12T17:56:44.999032873Z" level=info msg="StartContainer for \"b72a9f11cb9c78343ec0ecb074f68fd84817e156bb7897dd9359ceb25c60073a\" returns successfully" Sep 12 17:56:45.937379 kubelet[2937]: I0912 17:56:45.937226 2937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-97c5m" podStartSLOduration=2.937165506 podStartE2EDuration="2.937165506s" podCreationTimestamp="2025-09-12 17:56:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:56:45.936839841 +0000 UTC m=+8.207974196" watchObservedRunningTime="2025-09-12 17:56:45.937165506 +0000 UTC m=+8.208299861" Sep 12 17:56:46.202474 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1877905904.mount: Deactivated successfully. Sep 12 17:56:46.687837 containerd[1635]: time="2025-09-12T17:56:46.687451034Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:56:46.688155 containerd[1635]: time="2025-09-12T17:56:46.688143328Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609" Sep 12 17:56:46.688537 containerd[1635]: time="2025-09-12T17:56:46.688524712Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:56:46.689649 containerd[1635]: time="2025-09-12T17:56:46.689637033Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:56:46.690058 containerd[1635]: time="2025-09-12T17:56:46.690045931Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 1.933963682s" Sep 12 17:56:46.690112 containerd[1635]: time="2025-09-12T17:56:46.690104014Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\"" Sep 12 17:56:46.692767 containerd[1635]: time="2025-09-12T17:56:46.691940557Z" level=info msg="CreateContainer within sandbox \"8b78e7d8a4bea63cdf67dd803d11a1b958a2e69397c59401440e756fe9b97590\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 12 17:56:46.695564 containerd[1635]: time="2025-09-12T17:56:46.695456006Z" level=info msg="Container d4bf65d2591be2507a71a8037ceb8ba5658dad8b66503bce15d098a972553b5c: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:56:46.699963 containerd[1635]: time="2025-09-12T17:56:46.699947783Z" level=info msg="CreateContainer within sandbox \"8b78e7d8a4bea63cdf67dd803d11a1b958a2e69397c59401440e756fe9b97590\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"d4bf65d2591be2507a71a8037ceb8ba5658dad8b66503bce15d098a972553b5c\"" Sep 12 17:56:46.700649 containerd[1635]: time="2025-09-12T17:56:46.700516291Z" level=info msg="StartContainer for \"d4bf65d2591be2507a71a8037ceb8ba5658dad8b66503bce15d098a972553b5c\"" Sep 12 17:56:46.701819 containerd[1635]: 
time="2025-09-12T17:56:46.701728546Z" level=info msg="connecting to shim d4bf65d2591be2507a71a8037ceb8ba5658dad8b66503bce15d098a972553b5c" address="unix:///run/containerd/s/a8d6d5d7085a5ce916454007f2e175004574eb8a6c2b3b27dd27124b25a93123" protocol=ttrpc version=3 Sep 12 17:56:46.722095 systemd[1]: Started cri-containerd-d4bf65d2591be2507a71a8037ceb8ba5658dad8b66503bce15d098a972553b5c.scope - libcontainer container d4bf65d2591be2507a71a8037ceb8ba5658dad8b66503bce15d098a972553b5c. Sep 12 17:56:46.743444 containerd[1635]: time="2025-09-12T17:56:46.743407500Z" level=info msg="StartContainer for \"d4bf65d2591be2507a71a8037ceb8ba5658dad8b66503bce15d098a972553b5c\" returns successfully" Sep 12 17:56:48.824855 kubelet[2937]: I0912 17:56:48.824820 2937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-66gq8" podStartSLOduration=2.889806128 podStartE2EDuration="4.824807628s" podCreationTimestamp="2025-09-12 17:56:44 +0000 UTC" firstStartedPulling="2025-09-12 17:56:44.755655331 +0000 UTC m=+7.026789680" lastFinishedPulling="2025-09-12 17:56:46.69065684 +0000 UTC m=+8.961791180" observedRunningTime="2025-09-12 17:56:46.928809545 +0000 UTC m=+9.199943897" watchObservedRunningTime="2025-09-12 17:56:48.824807628 +0000 UTC m=+11.095941974" Sep 12 17:56:53.050091 sudo[1957]: pam_unix(sudo:session): session closed for user root Sep 12 17:56:53.052207 sshd[1956]: Connection closed by 139.178.89.65 port 51262 Sep 12 17:56:53.053980 sshd-session[1953]: pam_unix(sshd:session): session closed for user core Sep 12 17:56:53.058913 systemd[1]: sshd@6-139.178.70.102:22-139.178.89.65:51262.service: Deactivated successfully. Sep 12 17:56:53.062891 systemd[1]: session-9.scope: Deactivated successfully. Sep 12 17:56:53.064232 systemd[1]: session-9.scope: Consumed 3.033s CPU time, 158.5M memory peak. Sep 12 17:56:53.069354 systemd-logind[1603]: Session 9 logged out. Waiting for processes to exit. Sep 12 17:56:53.071688 systemd-logind[1603]: Removed session 9. Sep 12 17:56:54.575180 systemd[1]: Created slice kubepods-besteffort-pod96cdb326_ea42_4267_aa2f_d8ce80058b08.slice - libcontainer container kubepods-besteffort-pod96cdb326_ea42_4267_aa2f_d8ce80058b08.slice. 
Sep 12 17:56:54.669614 kubelet[2937]: I0912 17:56:54.669579 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/96cdb326-ea42-4267-aa2f-d8ce80058b08-typha-certs\") pod \"calico-typha-59f8cc8fcb-vkdbx\" (UID: \"96cdb326-ea42-4267-aa2f-d8ce80058b08\") " pod="calico-system/calico-typha-59f8cc8fcb-vkdbx" Sep 12 17:56:54.669614 kubelet[2937]: I0912 17:56:54.669612 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spm8n\" (UniqueName: \"kubernetes.io/projected/96cdb326-ea42-4267-aa2f-d8ce80058b08-kube-api-access-spm8n\") pod \"calico-typha-59f8cc8fcb-vkdbx\" (UID: \"96cdb326-ea42-4267-aa2f-d8ce80058b08\") " pod="calico-system/calico-typha-59f8cc8fcb-vkdbx" Sep 12 17:56:54.669957 kubelet[2937]: I0912 17:56:54.669633 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96cdb326-ea42-4267-aa2f-d8ce80058b08-tigera-ca-bundle\") pod \"calico-typha-59f8cc8fcb-vkdbx\" (UID: \"96cdb326-ea42-4267-aa2f-d8ce80058b08\") " pod="calico-system/calico-typha-59f8cc8fcb-vkdbx" Sep 12 17:56:54.701879 systemd[1]: Created slice kubepods-besteffort-podab200089_a823_42b6_b557_2236a09c4002.slice - libcontainer container kubepods-besteffort-podab200089_a823_42b6_b557_2236a09c4002.slice. Sep 12 17:56:54.771696 kubelet[2937]: I0912 17:56:54.771134 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/ab200089-a823-42b6-b557-2236a09c4002-cni-net-dir\") pod \"calico-node-m5kxk\" (UID: \"ab200089-a823-42b6-b557-2236a09c4002\") " pod="calico-system/calico-node-m5kxk" Sep 12 17:56:54.771696 kubelet[2937]: I0912 17:56:54.771177 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ab200089-a823-42b6-b557-2236a09c4002-lib-modules\") pod \"calico-node-m5kxk\" (UID: \"ab200089-a823-42b6-b557-2236a09c4002\") " pod="calico-system/calico-node-m5kxk" Sep 12 17:56:54.771696 kubelet[2937]: I0912 17:56:54.771202 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/ab200089-a823-42b6-b557-2236a09c4002-flexvol-driver-host\") pod \"calico-node-m5kxk\" (UID: \"ab200089-a823-42b6-b557-2236a09c4002\") " pod="calico-system/calico-node-m5kxk" Sep 12 17:56:54.771696 kubelet[2937]: I0912 17:56:54.771219 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ab200089-a823-42b6-b557-2236a09c4002-xtables-lock\") pod \"calico-node-m5kxk\" (UID: \"ab200089-a823-42b6-b557-2236a09c4002\") " pod="calico-system/calico-node-m5kxk" Sep 12 17:56:54.771696 kubelet[2937]: I0912 17:56:54.771241 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/ab200089-a823-42b6-b557-2236a09c4002-cni-log-dir\") pod \"calico-node-m5kxk\" (UID: \"ab200089-a823-42b6-b557-2236a09c4002\") " pod="calico-system/calico-node-m5kxk" Sep 12 17:56:54.771885 kubelet[2937]: I0912 17:56:54.771257 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" 
(UniqueName: \"kubernetes.io/secret/ab200089-a823-42b6-b557-2236a09c4002-node-certs\") pod \"calico-node-m5kxk\" (UID: \"ab200089-a823-42b6-b557-2236a09c4002\") " pod="calico-system/calico-node-m5kxk" Sep 12 17:56:54.771885 kubelet[2937]: I0912 17:56:54.771273 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/ab200089-a823-42b6-b557-2236a09c4002-var-lib-calico\") pod \"calico-node-m5kxk\" (UID: \"ab200089-a823-42b6-b557-2236a09c4002\") " pod="calico-system/calico-node-m5kxk" Sep 12 17:56:54.771885 kubelet[2937]: I0912 17:56:54.771311 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/ab200089-a823-42b6-b557-2236a09c4002-policysync\") pod \"calico-node-m5kxk\" (UID: \"ab200089-a823-42b6-b557-2236a09c4002\") " pod="calico-system/calico-node-m5kxk" Sep 12 17:56:54.771885 kubelet[2937]: I0912 17:56:54.771331 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/ab200089-a823-42b6-b557-2236a09c4002-cni-bin-dir\") pod \"calico-node-m5kxk\" (UID: \"ab200089-a823-42b6-b557-2236a09c4002\") " pod="calico-system/calico-node-m5kxk" Sep 12 17:56:54.771885 kubelet[2937]: I0912 17:56:54.771344 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkhlk\" (UniqueName: \"kubernetes.io/projected/ab200089-a823-42b6-b557-2236a09c4002-kube-api-access-wkhlk\") pod \"calico-node-m5kxk\" (UID: \"ab200089-a823-42b6-b557-2236a09c4002\") " pod="calico-system/calico-node-m5kxk" Sep 12 17:56:54.771990 kubelet[2937]: I0912 17:56:54.771355 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/ab200089-a823-42b6-b557-2236a09c4002-var-run-calico\") pod \"calico-node-m5kxk\" (UID: \"ab200089-a823-42b6-b557-2236a09c4002\") " pod="calico-system/calico-node-m5kxk" Sep 12 17:56:54.771990 kubelet[2937]: I0912 17:56:54.771371 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab200089-a823-42b6-b557-2236a09c4002-tigera-ca-bundle\") pod \"calico-node-m5kxk\" (UID: \"ab200089-a823-42b6-b557-2236a09c4002\") " pod="calico-system/calico-node-m5kxk" Sep 12 17:56:54.883827 containerd[1635]: time="2025-09-12T17:56:54.883759745Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-59f8cc8fcb-vkdbx,Uid:96cdb326-ea42-4267-aa2f-d8ce80058b08,Namespace:calico-system,Attempt:0,}" Sep 12 17:56:54.922045 kubelet[2937]: E0912 17:56:54.921990 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:54.922045 kubelet[2937]: W0912 17:56:54.922043 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:54.924751 kubelet[2937]: E0912 17:56:54.922091 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:56:55.008623 kubelet[2937]: E0912 17:56:55.007463 2937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-wgskv" podUID="990867cb-2d26-443c-9869-c2147978654b" Sep 12 17:56:55.008726 containerd[1635]: time="2025-09-12T17:56:55.008223568Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-m5kxk,Uid:ab200089-a823-42b6-b557-2236a09c4002,Namespace:calico-system,Attempt:0,}" Sep 12 17:56:55.056029 containerd[1635]: time="2025-09-12T17:56:55.055860506Z" level=info msg="connecting to shim 297cd010cd433e38775d65d68af7b63bf72f574faeb374490372c595c7c8b761" address="unix:///run/containerd/s/66878caf678110cadb57c9164ded91b7340938f272673755cf5220e728c646a1" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:56:55.058191 containerd[1635]: time="2025-09-12T17:56:55.058153161Z" level=info msg="connecting to shim 8820dae51513f73306e1ce146bd602db24631d39263e270d0375da015de608a5" address="unix:///run/containerd/s/c04841d09263d34082bec4a11233ef6ef28ab428b7c2a788987935338b0d5946" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:56:55.061215 kubelet[2937]: E0912 17:56:55.061112 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.061215 kubelet[2937]: W0912 17:56:55.061133 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.061215 kubelet[2937]: E0912 17:56:55.061150 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.061717 kubelet[2937]: E0912 17:56:55.061246 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.061717 kubelet[2937]: W0912 17:56:55.061253 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.061717 kubelet[2937]: E0912 17:56:55.061261 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.061717 kubelet[2937]: E0912 17:56:55.061358 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.061717 kubelet[2937]: W0912 17:56:55.061363 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.061717 kubelet[2937]: E0912 17:56:55.061368 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:56:55.061717 kubelet[2937]: E0912 17:56:55.061479 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.061717 kubelet[2937]: W0912 17:56:55.061485 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.061717 kubelet[2937]: E0912 17:56:55.061491 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.061717 kubelet[2937]: E0912 17:56:55.061665 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.062676 kubelet[2937]: W0912 17:56:55.061673 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.062676 kubelet[2937]: E0912 17:56:55.061681 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.062676 kubelet[2937]: E0912 17:56:55.061793 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.062676 kubelet[2937]: W0912 17:56:55.061800 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.062676 kubelet[2937]: E0912 17:56:55.061809 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.062676 kubelet[2937]: E0912 17:56:55.061904 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.062676 kubelet[2937]: W0912 17:56:55.061909 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.062676 kubelet[2937]: E0912 17:56:55.062065 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.062676 kubelet[2937]: E0912 17:56:55.062620 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.062676 kubelet[2937]: W0912 17:56:55.062628 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.063303 kubelet[2937]: E0912 17:56:55.062637 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:56:55.063303 kubelet[2937]: E0912 17:56:55.063041 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.063303 kubelet[2937]: W0912 17:56:55.063048 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.063303 kubelet[2937]: E0912 17:56:55.063057 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.063303 kubelet[2937]: E0912 17:56:55.063159 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.063303 kubelet[2937]: W0912 17:56:55.063165 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.063303 kubelet[2937]: E0912 17:56:55.063170 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.063303 kubelet[2937]: E0912 17:56:55.063255 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.063303 kubelet[2937]: W0912 17:56:55.063262 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.063303 kubelet[2937]: E0912 17:56:55.063271 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.063507 kubelet[2937]: E0912 17:56:55.063366 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.063507 kubelet[2937]: W0912 17:56:55.063373 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.063507 kubelet[2937]: E0912 17:56:55.063381 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.063507 kubelet[2937]: E0912 17:56:55.063484 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.063507 kubelet[2937]: W0912 17:56:55.063491 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.063507 kubelet[2937]: E0912 17:56:55.063500 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:56:55.063611 kubelet[2937]: E0912 17:56:55.063580 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.063611 kubelet[2937]: W0912 17:56:55.063587 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.063611 kubelet[2937]: E0912 17:56:55.063594 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.065533 kubelet[2937]: E0912 17:56:55.063689 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.065533 kubelet[2937]: W0912 17:56:55.063699 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.065533 kubelet[2937]: E0912 17:56:55.063709 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.065533 kubelet[2937]: E0912 17:56:55.063789 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.065533 kubelet[2937]: W0912 17:56:55.063794 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.065533 kubelet[2937]: E0912 17:56:55.063801 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.065533 kubelet[2937]: E0912 17:56:55.063879 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.065533 kubelet[2937]: W0912 17:56:55.063883 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.065533 kubelet[2937]: E0912 17:56:55.063887 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.065533 kubelet[2937]: E0912 17:56:55.063972 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.066731 kubelet[2937]: W0912 17:56:55.063977 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.066731 kubelet[2937]: E0912 17:56:55.063982 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:56:55.066731 kubelet[2937]: E0912 17:56:55.065402 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.066731 kubelet[2937]: W0912 17:56:55.065418 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.066731 kubelet[2937]: E0912 17:56:55.065434 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.066731 kubelet[2937]: E0912 17:56:55.065614 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.066731 kubelet[2937]: W0912 17:56:55.065619 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.066731 kubelet[2937]: E0912 17:56:55.065626 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.074517 kubelet[2937]: E0912 17:56:55.074209 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.074517 kubelet[2937]: W0912 17:56:55.074312 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.074517 kubelet[2937]: E0912 17:56:55.074327 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.074517 kubelet[2937]: I0912 17:56:55.074351 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/990867cb-2d26-443c-9869-c2147978654b-kubelet-dir\") pod \"csi-node-driver-wgskv\" (UID: \"990867cb-2d26-443c-9869-c2147978654b\") " pod="calico-system/csi-node-driver-wgskv" Sep 12 17:56:55.083824 kubelet[2937]: E0912 17:56:55.074916 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.083824 kubelet[2937]: W0912 17:56:55.074923 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.083824 kubelet[2937]: E0912 17:56:55.074937 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:56:55.083824 kubelet[2937]: I0912 17:56:55.074951 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/990867cb-2d26-443c-9869-c2147978654b-varrun\") pod \"csi-node-driver-wgskv\" (UID: \"990867cb-2d26-443c-9869-c2147978654b\") " pod="calico-system/csi-node-driver-wgskv" Sep 12 17:56:55.083824 kubelet[2937]: E0912 17:56:55.075179 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.083824 kubelet[2937]: W0912 17:56:55.075185 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.083824 kubelet[2937]: E0912 17:56:55.075199 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.083824 kubelet[2937]: I0912 17:56:55.075210 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/990867cb-2d26-443c-9869-c2147978654b-socket-dir\") pod \"csi-node-driver-wgskv\" (UID: \"990867cb-2d26-443c-9869-c2147978654b\") " pod="calico-system/csi-node-driver-wgskv" Sep 12 17:56:55.083824 kubelet[2937]: E0912 17:56:55.075322 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.084245 kubelet[2937]: W0912 17:56:55.075328 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.084245 kubelet[2937]: E0912 17:56:55.075338 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.084245 kubelet[2937]: E0912 17:56:55.075441 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.084245 kubelet[2937]: W0912 17:56:55.075447 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.084245 kubelet[2937]: E0912 17:56:55.075460 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.084245 kubelet[2937]: E0912 17:56:55.075574 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.084245 kubelet[2937]: W0912 17:56:55.075579 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.084245 kubelet[2937]: E0912 17:56:55.075590 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:56:55.084245 kubelet[2937]: E0912 17:56:55.075704 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.084245 kubelet[2937]: W0912 17:56:55.075709 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.084436 kubelet[2937]: E0912 17:56:55.075717 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.084436 kubelet[2937]: E0912 17:56:55.075897 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.084436 kubelet[2937]: W0912 17:56:55.075901 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.084436 kubelet[2937]: E0912 17:56:55.075923 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.084436 kubelet[2937]: I0912 17:56:55.075937 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/990867cb-2d26-443c-9869-c2147978654b-registration-dir\") pod \"csi-node-driver-wgskv\" (UID: \"990867cb-2d26-443c-9869-c2147978654b\") " pod="calico-system/csi-node-driver-wgskv" Sep 12 17:56:55.084436 kubelet[2937]: E0912 17:56:55.076199 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.084436 kubelet[2937]: W0912 17:56:55.076206 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.084436 kubelet[2937]: E0912 17:56:55.076216 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.084569 kubelet[2937]: I0912 17:56:55.076227 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k4vq\" (UniqueName: \"kubernetes.io/projected/990867cb-2d26-443c-9869-c2147978654b-kube-api-access-4k4vq\") pod \"csi-node-driver-wgskv\" (UID: \"990867cb-2d26-443c-9869-c2147978654b\") " pod="calico-system/csi-node-driver-wgskv" Sep 12 17:56:55.084569 kubelet[2937]: E0912 17:56:55.076459 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.084569 kubelet[2937]: W0912 17:56:55.076465 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.084569 kubelet[2937]: E0912 17:56:55.076471 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:56:55.084569 kubelet[2937]: E0912 17:56:55.076627 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.084569 kubelet[2937]: W0912 17:56:55.076632 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.084569 kubelet[2937]: E0912 17:56:55.076637 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.084569 kubelet[2937]: E0912 17:56:55.076826 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.084569 kubelet[2937]: W0912 17:56:55.076831 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.084715 kubelet[2937]: E0912 17:56:55.076837 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.084715 kubelet[2937]: E0912 17:56:55.076973 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.084715 kubelet[2937]: W0912 17:56:55.076979 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.084715 kubelet[2937]: E0912 17:56:55.076984 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.084715 kubelet[2937]: E0912 17:56:55.077193 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.084715 kubelet[2937]: W0912 17:56:55.077199 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.084715 kubelet[2937]: E0912 17:56:55.077205 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.084715 kubelet[2937]: E0912 17:56:55.077710 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.084715 kubelet[2937]: W0912 17:56:55.077717 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.084715 kubelet[2937]: E0912 17:56:55.077723 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:56:55.098177 systemd[1]: Started cri-containerd-8820dae51513f73306e1ce146bd602db24631d39263e270d0375da015de608a5.scope - libcontainer container 8820dae51513f73306e1ce146bd602db24631d39263e270d0375da015de608a5. Sep 12 17:56:55.102944 systemd[1]: Started cri-containerd-297cd010cd433e38775d65d68af7b63bf72f574faeb374490372c595c7c8b761.scope - libcontainer container 297cd010cd433e38775d65d68af7b63bf72f574faeb374490372c595c7c8b761. Sep 12 17:56:55.178179 kubelet[2937]: E0912 17:56:55.178096 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.178179 kubelet[2937]: W0912 17:56:55.178114 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.178179 kubelet[2937]: E0912 17:56:55.178134 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.178358 kubelet[2937]: E0912 17:56:55.178305 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.178358 kubelet[2937]: W0912 17:56:55.178313 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.178358 kubelet[2937]: E0912 17:56:55.178322 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.179208 kubelet[2937]: E0912 17:56:55.179170 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.179208 kubelet[2937]: W0912 17:56:55.179185 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.179208 kubelet[2937]: E0912 17:56:55.179200 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.179595 kubelet[2937]: E0912 17:56:55.179304 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.179595 kubelet[2937]: W0912 17:56:55.179309 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.179595 kubelet[2937]: E0912 17:56:55.179315 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:56:55.180289 kubelet[2937]: E0912 17:56:55.179667 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.180289 kubelet[2937]: W0912 17:56:55.179673 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.180289 kubelet[2937]: E0912 17:56:55.179683 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.180289 kubelet[2937]: E0912 17:56:55.179973 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.180289 kubelet[2937]: W0912 17:56:55.179986 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.180289 kubelet[2937]: E0912 17:56:55.180044 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.180860 kubelet[2937]: E0912 17:56:55.180236 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.180860 kubelet[2937]: W0912 17:56:55.180589 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.180860 kubelet[2937]: E0912 17:56:55.180604 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.181314 kubelet[2937]: E0912 17:56:55.181180 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.181314 kubelet[2937]: W0912 17:56:55.181194 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.181314 kubelet[2937]: E0912 17:56:55.181235 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.181792 kubelet[2937]: E0912 17:56:55.181555 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.181792 kubelet[2937]: W0912 17:56:55.181563 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.182243 kubelet[2937]: E0912 17:56:55.182038 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:56:55.182659 kubelet[2937]: E0912 17:56:55.182308 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.182659 kubelet[2937]: W0912 17:56:55.182334 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.182659 kubelet[2937]: E0912 17:56:55.182377 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.183081 kubelet[2937]: E0912 17:56:55.182991 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.183218 kubelet[2937]: W0912 17:56:55.183148 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.183591 kubelet[2937]: E0912 17:56:55.183289 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.187469 kubelet[2937]: E0912 17:56:55.187113 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.187469 kubelet[2937]: W0912 17:56:55.187169 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.187469 kubelet[2937]: E0912 17:56:55.187209 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.191434 kubelet[2937]: E0912 17:56:55.191411 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.191673 kubelet[2937]: W0912 17:56:55.191546 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.191673 kubelet[2937]: E0912 17:56:55.191645 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.191853 kubelet[2937]: E0912 17:56:55.191833 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.191853 kubelet[2937]: W0912 17:56:55.191843 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.192023 kubelet[2937]: E0912 17:56:55.191991 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:56:55.192119 kubelet[2937]: E0912 17:56:55.192101 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.192119 kubelet[2937]: W0912 17:56:55.192110 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.192249 kubelet[2937]: E0912 17:56:55.192237 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.192337 kubelet[2937]: E0912 17:56:55.192318 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.192404 kubelet[2937]: W0912 17:56:55.192369 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.192483 kubelet[2937]: E0912 17:56:55.192471 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.192568 kubelet[2937]: E0912 17:56:55.192556 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.192568 kubelet[2937]: W0912 17:56:55.192562 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.192706 kubelet[2937]: E0912 17:56:55.192692 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.192825 kubelet[2937]: E0912 17:56:55.192805 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.192825 kubelet[2937]: W0912 17:56:55.192813 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.192949 kubelet[2937]: E0912 17:56:55.192895 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.194053 kubelet[2937]: E0912 17:56:55.193863 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.194053 kubelet[2937]: W0912 17:56:55.193881 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.194053 kubelet[2937]: E0912 17:56:55.193900 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:56:55.195112 kubelet[2937]: E0912 17:56:55.194679 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.195112 kubelet[2937]: W0912 17:56:55.194693 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.195112 kubelet[2937]: E0912 17:56:55.194779 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.195112 kubelet[2937]: E0912 17:56:55.194894 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.195112 kubelet[2937]: W0912 17:56:55.194901 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.195112 kubelet[2937]: E0912 17:56:55.194978 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.195112 kubelet[2937]: E0912 17:56:55.195079 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.195112 kubelet[2937]: W0912 17:56:55.195084 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.195378 kubelet[2937]: E0912 17:56:55.195159 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.195378 kubelet[2937]: E0912 17:56:55.195253 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.195378 kubelet[2937]: W0912 17:56:55.195258 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.195378 kubelet[2937]: E0912 17:56:55.195330 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.195450 kubelet[2937]: E0912 17:56:55.195403 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.195450 kubelet[2937]: W0912 17:56:55.195409 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.195450 kubelet[2937]: E0912 17:56:55.195416 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:56:55.196460 kubelet[2937]: E0912 17:56:55.196296 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.196460 kubelet[2937]: W0912 17:56:55.196398 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.196460 kubelet[2937]: E0912 17:56:55.196414 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.218833 kubelet[2937]: E0912 17:56:55.218812 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:55.218833 kubelet[2937]: W0912 17:56:55.218827 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:55.218990 kubelet[2937]: E0912 17:56:55.218849 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:55.221523 containerd[1635]: time="2025-09-12T17:56:55.221493391Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-59f8cc8fcb-vkdbx,Uid:96cdb326-ea42-4267-aa2f-d8ce80058b08,Namespace:calico-system,Attempt:0,} returns sandbox id \"297cd010cd433e38775d65d68af7b63bf72f574faeb374490372c595c7c8b761\"" Sep 12 17:56:55.223230 containerd[1635]: time="2025-09-12T17:56:55.223205014Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 12 17:56:55.234433 containerd[1635]: time="2025-09-12T17:56:55.234388820Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-m5kxk,Uid:ab200089-a823-42b6-b557-2236a09c4002,Namespace:calico-system,Attempt:0,} returns sandbox id \"8820dae51513f73306e1ce146bd602db24631d39263e270d0375da015de608a5\"" Sep 12 17:56:56.876478 kubelet[2937]: E0912 17:56:56.876430 2937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-wgskv" podUID="990867cb-2d26-443c-9869-c2147978654b" Sep 12 17:56:57.319331 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3633145734.mount: Deactivated successfully. 
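(Annotation, not part of the journal.) The repeated kubelet messages above all report the same condition: the kubelet probes the FlexVolume plugin directory nodeagent~uds, but the expected executable /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds is not on $PATH yet, so the `init` call returns empty output and driver-call.go cannot unmarshal "" as JSON. The probe is retried and typically stops once Calico's flexvol-driver init container (the pod2daemon-flexvol image pulled further down) installs the binary. Purely for illustration, a hypothetical minimal FlexVolume driver would answer `init` with a small JSON status document like the sketch below; the real uds driver is a compiled binary shipped by Calico, not this script.

```python
#!/usr/bin/env python3
# Hypothetical minimal FlexVolume driver (illustration only).
# It prints the kind of JSON the kubelet's driver-call expects from "init",
# which is exactly what is missing while the uds executable is absent.
import json
import sys


def main() -> int:
    if len(sys.argv) > 1 and sys.argv[1] == "init":
        # Valid JSON with a status field is enough for the init probe to succeed.
        print(json.dumps({"status": "Success", "capabilities": {"attach": False}}))
        return 0
    # Any other call this toy driver does not implement.
    print(json.dumps({"status": "Not supported"}))
    return 1


if __name__ == "__main__":
    sys.exit(main())
```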
Sep 12 17:56:58.387039 containerd[1635]: time="2025-09-12T17:56:58.386922861Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:56:58.396794 containerd[1635]: time="2025-09-12T17:56:58.396755648Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389" Sep 12 17:56:58.431443 containerd[1635]: time="2025-09-12T17:56:58.431382471Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:56:58.448219 containerd[1635]: time="2025-09-12T17:56:58.448162190Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:56:58.448551 containerd[1635]: time="2025-09-12T17:56:58.448500726Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 3.225272541s" Sep 12 17:56:58.448551 containerd[1635]: time="2025-09-12T17:56:58.448518609Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\"" Sep 12 17:56:58.450239 containerd[1635]: time="2025-09-12T17:56:58.450104635Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 12 17:56:58.470622 containerd[1635]: time="2025-09-12T17:56:58.470509390Z" level=info msg="CreateContainer within sandbox \"297cd010cd433e38775d65d68af7b63bf72f574faeb374490372c595c7c8b761\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 12 17:56:58.579756 containerd[1635]: time="2025-09-12T17:56:58.579120558Z" level=info msg="Container c7beb488c21e9427b2cee1a8a7f7f0e8cba245904c47711d35485d3284b9e22c: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:56:58.653141 containerd[1635]: time="2025-09-12T17:56:58.652814432Z" level=info msg="CreateContainer within sandbox \"297cd010cd433e38775d65d68af7b63bf72f574faeb374490372c595c7c8b761\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"c7beb488c21e9427b2cee1a8a7f7f0e8cba245904c47711d35485d3284b9e22c\"" Sep 12 17:56:58.653579 containerd[1635]: time="2025-09-12T17:56:58.653547247Z" level=info msg="StartContainer for \"c7beb488c21e9427b2cee1a8a7f7f0e8cba245904c47711d35485d3284b9e22c\"" Sep 12 17:56:58.661656 containerd[1635]: time="2025-09-12T17:56:58.655154704Z" level=info msg="connecting to shim c7beb488c21e9427b2cee1a8a7f7f0e8cba245904c47711d35485d3284b9e22c" address="unix:///run/containerd/s/66878caf678110cadb57c9164ded91b7340938f272673755cf5220e728c646a1" protocol=ttrpc version=3 Sep 12 17:56:58.679155 systemd[1]: Started cri-containerd-c7beb488c21e9427b2cee1a8a7f7f0e8cba245904c47711d35485d3284b9e22c.scope - libcontainer container c7beb488c21e9427b2cee1a8a7f7f0e8cba245904c47711d35485d3284b9e22c. 
Sep 12 17:56:58.738320 containerd[1635]: time="2025-09-12T17:56:58.738246848Z" level=info msg="StartContainer for \"c7beb488c21e9427b2cee1a8a7f7f0e8cba245904c47711d35485d3284b9e22c\" returns successfully" Sep 12 17:56:58.870664 kubelet[2937]: E0912 17:56:58.870410 2937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-wgskv" podUID="990867cb-2d26-443c-9869-c2147978654b" Sep 12 17:56:58.990585 kubelet[2937]: E0912 17:56:58.990438 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:58.990585 kubelet[2937]: W0912 17:56:58.990452 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:58.990585 kubelet[2937]: E0912 17:56:58.990464 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:58.990585 kubelet[2937]: E0912 17:56:58.990558 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:58.990585 kubelet[2937]: W0912 17:56:58.990572 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:58.990585 kubelet[2937]: E0912 17:56:58.990581 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:58.991106 kubelet[2937]: E0912 17:56:58.990659 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:58.991106 kubelet[2937]: W0912 17:56:58.990663 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:58.991106 kubelet[2937]: E0912 17:56:58.990668 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:58.991106 kubelet[2937]: E0912 17:56:58.990772 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:58.991106 kubelet[2937]: W0912 17:56:58.990777 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:58.991106 kubelet[2937]: E0912 17:56:58.990782 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:56:58.991106 kubelet[2937]: E0912 17:56:58.990859 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:58.991106 kubelet[2937]: W0912 17:56:58.990864 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:58.991106 kubelet[2937]: E0912 17:56:58.990870 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:58.991106 kubelet[2937]: E0912 17:56:58.990945 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:58.991309 kubelet[2937]: W0912 17:56:58.990960 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:58.991309 kubelet[2937]: E0912 17:56:58.990968 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:58.991309 kubelet[2937]: E0912 17:56:58.991053 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:58.991309 kubelet[2937]: W0912 17:56:58.991058 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:58.991309 kubelet[2937]: E0912 17:56:58.991062 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:58.991309 kubelet[2937]: E0912 17:56:58.991138 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:58.991309 kubelet[2937]: W0912 17:56:58.991143 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:58.991309 kubelet[2937]: E0912 17:56:58.991148 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:58.991309 kubelet[2937]: E0912 17:56:58.991225 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:58.991309 kubelet[2937]: W0912 17:56:58.991229 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:58.991503 kubelet[2937]: E0912 17:56:58.991234 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:56:58.991503 kubelet[2937]: E0912 17:56:58.991308 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:58.991503 kubelet[2937]: W0912 17:56:58.991313 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:58.991503 kubelet[2937]: E0912 17:56:58.991317 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:58.991503 kubelet[2937]: E0912 17:56:58.991406 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:58.991503 kubelet[2937]: W0912 17:56:58.991410 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:58.991503 kubelet[2937]: E0912 17:56:58.991415 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:58.991503 kubelet[2937]: E0912 17:56:58.991487 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:58.991503 kubelet[2937]: W0912 17:56:58.991492 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:58.991503 kubelet[2937]: E0912 17:56:58.991496 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:58.991701 kubelet[2937]: E0912 17:56:58.991567 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:58.991701 kubelet[2937]: W0912 17:56:58.991571 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:58.991701 kubelet[2937]: E0912 17:56:58.991576 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:58.991701 kubelet[2937]: E0912 17:56:58.991652 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:58.991701 kubelet[2937]: W0912 17:56:58.991657 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:58.991701 kubelet[2937]: E0912 17:56:58.991661 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:56:58.991919 kubelet[2937]: E0912 17:56:58.991738 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:58.991919 kubelet[2937]: W0912 17:56:58.991742 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:58.991919 kubelet[2937]: E0912 17:56:58.991749 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:59.014550 kubelet[2937]: E0912 17:56:59.014472 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:59.014550 kubelet[2937]: W0912 17:56:59.014486 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:59.014550 kubelet[2937]: E0912 17:56:59.014498 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:59.014774 kubelet[2937]: E0912 17:56:59.014717 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:59.014774 kubelet[2937]: W0912 17:56:59.014723 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:59.014774 kubelet[2937]: E0912 17:56:59.014729 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:59.014881 kubelet[2937]: E0912 17:56:59.014875 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:59.014978 kubelet[2937]: W0912 17:56:59.014911 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:59.014978 kubelet[2937]: E0912 17:56:59.014918 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:59.015138 kubelet[2937]: E0912 17:56:59.015081 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:59.015138 kubelet[2937]: W0912 17:56:59.015091 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:59.015138 kubelet[2937]: E0912 17:56:59.015097 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:56:59.015406 kubelet[2937]: E0912 17:56:59.015238 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:59.015406 kubelet[2937]: W0912 17:56:59.015244 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:59.015406 kubelet[2937]: E0912 17:56:59.015249 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:59.015524 kubelet[2937]: E0912 17:56:59.015509 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:59.015524 kubelet[2937]: W0912 17:56:59.015516 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:59.015697 kubelet[2937]: E0912 17:56:59.015577 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:59.037497 kubelet[2937]: E0912 17:56:59.037436 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:59.037497 kubelet[2937]: W0912 17:56:59.037455 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:59.037497 kubelet[2937]: E0912 17:56:59.037474 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:59.037820 kubelet[2937]: E0912 17:56:59.037749 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:59.037820 kubelet[2937]: W0912 17:56:59.037759 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:59.037820 kubelet[2937]: E0912 17:56:59.037774 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:59.052174 kubelet[2937]: E0912 17:56:59.037986 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:59.052174 kubelet[2937]: W0912 17:56:59.037991 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:59.052174 kubelet[2937]: E0912 17:56:59.038019 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:56:59.052174 kubelet[2937]: E0912 17:56:59.038135 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:59.052174 kubelet[2937]: W0912 17:56:59.038140 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:59.052174 kubelet[2937]: E0912 17:56:59.038154 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:59.052174 kubelet[2937]: E0912 17:56:59.038240 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:59.052174 kubelet[2937]: W0912 17:56:59.038246 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:59.052174 kubelet[2937]: E0912 17:56:59.038257 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:59.052174 kubelet[2937]: E0912 17:56:59.038398 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:59.052340 kubelet[2937]: W0912 17:56:59.038404 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:59.052340 kubelet[2937]: E0912 17:56:59.038415 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:59.052340 kubelet[2937]: E0912 17:56:59.038533 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:59.052340 kubelet[2937]: W0912 17:56:59.038538 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:59.052340 kubelet[2937]: E0912 17:56:59.038546 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:59.052340 kubelet[2937]: E0912 17:56:59.038643 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:59.052340 kubelet[2937]: W0912 17:56:59.038649 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:59.052340 kubelet[2937]: E0912 17:56:59.038665 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:56:59.052340 kubelet[2937]: E0912 17:56:59.038751 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:59.052340 kubelet[2937]: W0912 17:56:59.038756 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:59.054368 kubelet[2937]: E0912 17:56:59.038765 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:59.054368 kubelet[2937]: E0912 17:56:59.038875 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:59.054368 kubelet[2937]: W0912 17:56:59.038892 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:59.054368 kubelet[2937]: E0912 17:56:59.038904 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:59.054368 kubelet[2937]: E0912 17:56:59.039051 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:59.054368 kubelet[2937]: W0912 17:56:59.039057 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:59.054368 kubelet[2937]: E0912 17:56:59.039063 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:59.054368 kubelet[2937]: E0912 17:56:59.039228 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:59.054368 kubelet[2937]: W0912 17:56:59.039233 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:59.054368 kubelet[2937]: E0912 17:56:59.039238 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:56:59.054527 kubelet[2937]: I0912 17:56:59.050236 2937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-59f8cc8fcb-vkdbx" podStartSLOduration=1.823376265 podStartE2EDuration="5.050226063s" podCreationTimestamp="2025-09-12 17:56:54 +0000 UTC" firstStartedPulling="2025-09-12 17:56:55.222813896 +0000 UTC m=+17.493948242" lastFinishedPulling="2025-09-12 17:56:58.449663696 +0000 UTC m=+20.720798040" observedRunningTime="2025-09-12 17:56:59.049605255 +0000 UTC m=+21.320739599" watchObservedRunningTime="2025-09-12 17:56:59.050226063 +0000 UTC m=+21.321360410" Sep 12 17:56:59.958545 kubelet[2937]: I0912 17:56:59.958317 2937 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 17:56:59.997788 kubelet[2937]: E0912 17:56:59.997767 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:59.997880 kubelet[2937]: W0912 17:56:59.997798 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:59.997880 kubelet[2937]: E0912 17:56:59.997814 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:59.997961 kubelet[2937]: E0912 17:56:59.997906 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:59.997961 kubelet[2937]: W0912 17:56:59.997911 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:59.997961 kubelet[2937]: E0912 17:56:59.997916 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:59.998143 kubelet[2937]: E0912 17:56:59.998001 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:59.998143 kubelet[2937]: W0912 17:56:59.998031 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:59.998143 kubelet[2937]: E0912 17:56:59.998038 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:59.998143 kubelet[2937]: E0912 17:56:59.998140 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:59.998143 kubelet[2937]: W0912 17:56:59.998144 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:59.998258 kubelet[2937]: E0912 17:56:59.998149 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
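(Annotation, not part of the journal.) The pod_startup_latency_tracker entry above is internally consistent: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that figure minus the time spent pulling images (lastFinishedPulling minus firstStartedPulling). A small sketch reproducing the arithmetic, with the logged nanosecond timestamps truncated to microseconds:

```python
# Reproduce the calico-typha startup-latency figures from the log timestamps.
from datetime import datetime, timezone

created    = datetime(2025, 9, 12, 17, 56, 54, 0,      tzinfo=timezone.utc)  # podCreationTimestamp
first_pull = datetime(2025, 9, 12, 17, 56, 55, 222813, tzinfo=timezone.utc)  # firstStartedPulling
last_pull  = datetime(2025, 9, 12, 17, 56, 58, 449663, tzinfo=timezone.utc)  # lastFinishedPulling
running    = datetime(2025, 9, 12, 17, 56, 59, 50226,  tzinfo=timezone.utc)  # observedRunningTime

e2e = (running - created).total_seconds()              # ~5.050226 s (podStartE2EDuration)
slo = e2e - (last_pull - first_pull).total_seconds()   # ~1.823376 s (podStartSLOduration)
print(f"E2E: {e2e:.6f}s  SLO: {slo:.6f}s")
```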
Error: unexpected end of JSON input" Sep 12 17:56:59.998258 kubelet[2937]: E0912 17:56:59.998228 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:59.998258 kubelet[2937]: W0912 17:56:59.998233 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:59.998258 kubelet[2937]: E0912 17:56:59.998237 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:59.998343 kubelet[2937]: E0912 17:56:59.998315 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:59.998343 kubelet[2937]: W0912 17:56:59.998328 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:59.998343 kubelet[2937]: E0912 17:56:59.998333 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:59.998417 kubelet[2937]: E0912 17:56:59.998411 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:59.998417 kubelet[2937]: W0912 17:56:59.998417 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:59.998459 kubelet[2937]: E0912 17:56:59.998422 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:59.998506 kubelet[2937]: E0912 17:56:59.998499 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:59.998506 kubelet[2937]: W0912 17:56:59.998505 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:59.998552 kubelet[2937]: E0912 17:56:59.998510 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:59.998602 kubelet[2937]: E0912 17:56:59.998594 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:59.998602 kubelet[2937]: W0912 17:56:59.998601 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:59.998643 kubelet[2937]: E0912 17:56:59.998606 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:56:59.998690 kubelet[2937]: E0912 17:56:59.998681 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:59.998690 kubelet[2937]: W0912 17:56:59.998687 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:59.998736 kubelet[2937]: E0912 17:56:59.998692 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:59.998784 kubelet[2937]: E0912 17:56:59.998775 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:59.998808 kubelet[2937]: W0912 17:56:59.998789 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:59.998808 kubelet[2937]: E0912 17:56:59.998794 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:59.998886 kubelet[2937]: E0912 17:56:59.998877 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:59.998886 kubelet[2937]: W0912 17:56:59.998885 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:59.998931 kubelet[2937]: E0912 17:56:59.998890 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:59.998972 kubelet[2937]: E0912 17:56:59.998967 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:59.998972 kubelet[2937]: W0912 17:56:59.998972 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:59.999040 kubelet[2937]: E0912 17:56:59.998977 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:56:59.999076 kubelet[2937]: E0912 17:56:59.999066 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:59.999076 kubelet[2937]: W0912 17:56:59.999071 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:59.999076 kubelet[2937]: E0912 17:56:59.999075 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:56:59.999161 kubelet[2937]: E0912 17:56:59.999153 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:56:59.999161 kubelet[2937]: W0912 17:56:59.999160 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:56:59.999198 kubelet[2937]: E0912 17:56:59.999166 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:57:00.020656 kubelet[2937]: E0912 17:57:00.020634 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:57:00.020656 kubelet[2937]: W0912 17:57:00.020651 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:57:00.020838 kubelet[2937]: E0912 17:57:00.020664 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:57:00.020838 kubelet[2937]: E0912 17:57:00.020779 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:57:00.020838 kubelet[2937]: W0912 17:57:00.020784 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:57:00.020838 kubelet[2937]: E0912 17:57:00.020789 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:57:00.020961 kubelet[2937]: E0912 17:57:00.020872 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:57:00.020961 kubelet[2937]: W0912 17:57:00.020877 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:57:00.020961 kubelet[2937]: E0912 17:57:00.020884 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:57:00.021046 kubelet[2937]: E0912 17:57:00.020990 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:57:00.021046 kubelet[2937]: W0912 17:57:00.020995 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:57:00.021046 kubelet[2937]: E0912 17:57:00.021002 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:57:00.021124 kubelet[2937]: E0912 17:57:00.021115 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:57:00.021124 kubelet[2937]: W0912 17:57:00.021120 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:57:00.021183 kubelet[2937]: E0912 17:57:00.021128 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:57:00.021218 kubelet[2937]: E0912 17:57:00.021204 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:57:00.021218 kubelet[2937]: W0912 17:57:00.021209 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:57:00.021331 kubelet[2937]: E0912 17:57:00.021224 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:57:00.021391 kubelet[2937]: E0912 17:57:00.021384 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:57:00.021424 kubelet[2937]: W0912 17:57:00.021418 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:57:00.021465 kubelet[2937]: E0912 17:57:00.021458 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:57:00.021549 kubelet[2937]: E0912 17:57:00.021540 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:57:00.021549 kubelet[2937]: W0912 17:57:00.021549 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:57:00.021628 kubelet[2937]: E0912 17:57:00.021557 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:57:00.021628 kubelet[2937]: E0912 17:57:00.021623 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:57:00.021628 kubelet[2937]: W0912 17:57:00.021627 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:57:00.021690 kubelet[2937]: E0912 17:57:00.021632 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:57:00.021768 kubelet[2937]: E0912 17:57:00.021701 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:57:00.021768 kubelet[2937]: W0912 17:57:00.021706 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:57:00.021768 kubelet[2937]: E0912 17:57:00.021713 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:57:00.021950 kubelet[2937]: E0912 17:57:00.021856 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:57:00.021950 kubelet[2937]: W0912 17:57:00.021862 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:57:00.021950 kubelet[2937]: E0912 17:57:00.021872 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:57:00.022021 kubelet[2937]: E0912 17:57:00.021993 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:57:00.022021 kubelet[2937]: W0912 17:57:00.022003 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:57:00.022021 kubelet[2937]: E0912 17:57:00.022019 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:57:00.022112 kubelet[2937]: E0912 17:57:00.022104 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:57:00.022112 kubelet[2937]: W0912 17:57:00.022110 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:57:00.022152 kubelet[2937]: E0912 17:57:00.022115 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:57:00.022236 kubelet[2937]: E0912 17:57:00.022226 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:57:00.022236 kubelet[2937]: W0912 17:57:00.022233 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:57:00.022273 kubelet[2937]: E0912 17:57:00.022247 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:57:00.022414 kubelet[2937]: E0912 17:57:00.022408 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:57:00.022464 kubelet[2937]: W0912 17:57:00.022457 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:57:00.022507 kubelet[2937]: E0912 17:57:00.022501 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:57:00.022689 kubelet[2937]: E0912 17:57:00.022641 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:57:00.022689 kubelet[2937]: W0912 17:57:00.022648 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:57:00.022689 kubelet[2937]: E0912 17:57:00.022657 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:57:00.022856 kubelet[2937]: E0912 17:57:00.022849 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:57:00.022997 kubelet[2937]: W0912 17:57:00.022897 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:57:00.022997 kubelet[2937]: E0912 17:57:00.022906 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:57:00.023143 kubelet[2937]: E0912 17:57:00.023137 2937 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:57:00.023181 kubelet[2937]: W0912 17:57:00.023175 2937 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:57:00.023212 kubelet[2937]: E0912 17:57:00.023207 2937 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:57:00.526017 containerd[1635]: time="2025-09-12T17:57:00.525902833Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:57:00.536997 containerd[1635]: time="2025-09-12T17:57:00.536962110Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 12 17:57:00.561762 containerd[1635]: time="2025-09-12T17:57:00.561649414Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:57:00.592551 containerd[1635]: time="2025-09-12T17:57:00.592511499Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:57:00.594852 containerd[1635]: time="2025-09-12T17:57:00.593328059Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 2.143201807s" Sep 12 17:57:00.594852 containerd[1635]: time="2025-09-12T17:57:00.593355856Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 12 17:57:00.595196 containerd[1635]: time="2025-09-12T17:57:00.595103559Z" level=info msg="CreateContainer within sandbox \"8820dae51513f73306e1ce146bd602db24631d39263e270d0375da015de608a5\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 12 17:57:00.633226 containerd[1635]: time="2025-09-12T17:57:00.633185786Z" level=info msg="Container 9ac08f894b38e07ba94c0aa66fb047225e463a12b99601effae1553d832c2d14: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:57:00.640579 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3439503655.mount: Deactivated successfully. Sep 12 17:57:00.646737 containerd[1635]: time="2025-09-12T17:57:00.646690792Z" level=info msg="CreateContainer within sandbox \"8820dae51513f73306e1ce146bd602db24631d39263e270d0375da015de608a5\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"9ac08f894b38e07ba94c0aa66fb047225e463a12b99601effae1553d832c2d14\"" Sep 12 17:57:00.648189 containerd[1635]: time="2025-09-12T17:57:00.648166109Z" level=info msg="StartContainer for \"9ac08f894b38e07ba94c0aa66fb047225e463a12b99601effae1553d832c2d14\"" Sep 12 17:57:00.649882 containerd[1635]: time="2025-09-12T17:57:00.649829240Z" level=info msg="connecting to shim 9ac08f894b38e07ba94c0aa66fb047225e463a12b99601effae1553d832c2d14" address="unix:///run/containerd/s/c04841d09263d34082bec4a11233ef6ef28ab428b7c2a788987935338b0d5946" protocol=ttrpc version=3 Sep 12 17:57:00.674189 systemd[1]: Started cri-containerd-9ac08f894b38e07ba94c0aa66fb047225e463a12b99601effae1553d832c2d14.scope - libcontainer container 9ac08f894b38e07ba94c0aa66fb047225e463a12b99601effae1553d832c2d14. 
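The repeated driver-call.go / plugins.go messages above all describe one condition: the kubelet's FlexVolume prober looks for an executable named uds under /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/, cannot run it ("executable file not found in $PATH"), and the resulting empty output then fails to parse as JSON. The flexvol-driver container created just above (from the calico/pod2daemon-flexvol image) is presumably what installs that driver, which is why the warnings stop later in the log. Below is a minimal Go sketch of the existence/executability check implied by these warnings; the driver path is taken verbatim from the log, and this is purely illustrative, not the kubelet's actual prober code.

package main

import (
	"fmt"
	"os"
)

// Driver path copied verbatim from the kubelet warnings above.
const driverPath = "/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds"

func main() {
	info, err := os.Stat(driverPath)
	if err != nil {
		// Matches the situation in the log: the plugin directory exists,
		// but the driver binary has not been installed yet.
		fmt.Printf("flexvolume driver missing: %v\n", err)
		return
	}
	if info.Mode()&0o111 == 0 {
		fmt.Printf("flexvolume driver present but not executable: %s\n", driverPath)
		return
	}
	fmt.Printf("flexvolume driver ready: %s (%d bytes)\n", driverPath, info.Size())
}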
Sep 12 17:57:00.716474 containerd[1635]: time="2025-09-12T17:57:00.716402573Z" level=info msg="StartContainer for \"9ac08f894b38e07ba94c0aa66fb047225e463a12b99601effae1553d832c2d14\" returns successfully" Sep 12 17:57:00.728839 systemd[1]: cri-containerd-9ac08f894b38e07ba94c0aa66fb047225e463a12b99601effae1553d832c2d14.scope: Deactivated successfully. Sep 12 17:57:00.748279 containerd[1635]: time="2025-09-12T17:57:00.748166331Z" level=info msg="received exit event container_id:\"9ac08f894b38e07ba94c0aa66fb047225e463a12b99601effae1553d832c2d14\" id:\"9ac08f894b38e07ba94c0aa66fb047225e463a12b99601effae1553d832c2d14\" pid:3624 exited_at:{seconds:1757699820 nanos:731595251}" Sep 12 17:57:00.754304 containerd[1635]: time="2025-09-12T17:57:00.754266958Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9ac08f894b38e07ba94c0aa66fb047225e463a12b99601effae1553d832c2d14\" id:\"9ac08f894b38e07ba94c0aa66fb047225e463a12b99601effae1553d832c2d14\" pid:3624 exited_at:{seconds:1757699820 nanos:731595251}" Sep 12 17:57:00.778298 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9ac08f894b38e07ba94c0aa66fb047225e463a12b99601effae1553d832c2d14-rootfs.mount: Deactivated successfully. Sep 12 17:57:00.870406 kubelet[2937]: E0912 17:57:00.870376 2937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-wgskv" podUID="990867cb-2d26-443c-9869-c2147978654b" Sep 12 17:57:01.969646 containerd[1635]: time="2025-09-12T17:57:01.969527794Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 12 17:57:02.870780 kubelet[2937]: E0912 17:57:02.870510 2937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-wgskv" podUID="990867cb-2d26-443c-9869-c2147978654b" Sep 12 17:57:04.870972 kubelet[2937]: E0912 17:57:04.870941 2937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-wgskv" podUID="990867cb-2d26-443c-9869-c2147978654b" Sep 12 17:57:05.656938 containerd[1635]: time="2025-09-12T17:57:05.656897628Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:57:05.657437 containerd[1635]: time="2025-09-12T17:57:05.657392647Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 12 17:57:05.658020 containerd[1635]: time="2025-09-12T17:57:05.657647125Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:57:05.658627 containerd[1635]: time="2025-09-12T17:57:05.658613515Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:57:05.659075 containerd[1635]: time="2025-09-12T17:57:05.659062398Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 3.688841268s" Sep 12 17:57:05.659477 containerd[1635]: time="2025-09-12T17:57:05.659169814Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 12 17:57:05.661553 containerd[1635]: time="2025-09-12T17:57:05.661529080Z" level=info msg="CreateContainer within sandbox \"8820dae51513f73306e1ce146bd602db24631d39263e270d0375da015de608a5\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 12 17:57:05.668135 containerd[1635]: time="2025-09-12T17:57:05.668101042Z" level=info msg="Container 1f85e6d31b518d96461deb3bc4089e8dcbb6506e3e65a52c09df7a35411e34fb: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:57:05.709962 containerd[1635]: time="2025-09-12T17:57:05.709924279Z" level=info msg="CreateContainer within sandbox \"8820dae51513f73306e1ce146bd602db24631d39263e270d0375da015de608a5\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"1f85e6d31b518d96461deb3bc4089e8dcbb6506e3e65a52c09df7a35411e34fb\"" Sep 12 17:57:05.710449 containerd[1635]: time="2025-09-12T17:57:05.710358098Z" level=info msg="StartContainer for \"1f85e6d31b518d96461deb3bc4089e8dcbb6506e3e65a52c09df7a35411e34fb\"" Sep 12 17:57:05.712142 containerd[1635]: time="2025-09-12T17:57:05.712095481Z" level=info msg="connecting to shim 1f85e6d31b518d96461deb3bc4089e8dcbb6506e3e65a52c09df7a35411e34fb" address="unix:///run/containerd/s/c04841d09263d34082bec4a11233ef6ef28ab428b7c2a788987935338b0d5946" protocol=ttrpc version=3 Sep 12 17:57:05.744371 systemd[1]: Started cri-containerd-1f85e6d31b518d96461deb3bc4089e8dcbb6506e3e65a52c09df7a35411e34fb.scope - libcontainer container 1f85e6d31b518d96461deb3bc4089e8dcbb6506e3e65a52c09df7a35411e34fb. Sep 12 17:57:05.784092 containerd[1635]: time="2025-09-12T17:57:05.784034604Z" level=info msg="StartContainer for \"1f85e6d31b518d96461deb3bc4089e8dcbb6506e3e65a52c09df7a35411e34fb\" returns successfully" Sep 12 17:57:06.871102 kubelet[2937]: E0912 17:57:06.871061 2937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-wgskv" podUID="990867cb-2d26-443c-9869-c2147978654b" Sep 12 17:57:08.039726 kubelet[2937]: I0912 17:57:08.039436 2937 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 17:57:08.634176 systemd[1]: cri-containerd-1f85e6d31b518d96461deb3bc4089e8dcbb6506e3e65a52c09df7a35411e34fb.scope: Deactivated successfully. Sep 12 17:57:08.634535 systemd[1]: cri-containerd-1f85e6d31b518d96461deb3bc4089e8dcbb6506e3e65a52c09df7a35411e34fb.scope: Consumed 335ms CPU time, 165M memory peak, 2.2M read from disk, 171.3M written to disk. 
Sep 12 17:57:08.650091 containerd[1635]: time="2025-09-12T17:57:08.650066271Z" level=info msg="received exit event container_id:\"1f85e6d31b518d96461deb3bc4089e8dcbb6506e3e65a52c09df7a35411e34fb\" id:\"1f85e6d31b518d96461deb3bc4089e8dcbb6506e3e65a52c09df7a35411e34fb\" pid:3682 exited_at:{seconds:1757699828 nanos:649897824}" Sep 12 17:57:08.650601 containerd[1635]: time="2025-09-12T17:57:08.650586602Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1f85e6d31b518d96461deb3bc4089e8dcbb6506e3e65a52c09df7a35411e34fb\" id:\"1f85e6d31b518d96461deb3bc4089e8dcbb6506e3e65a52c09df7a35411e34fb\" pid:3682 exited_at:{seconds:1757699828 nanos:649897824}" Sep 12 17:57:08.717457 kubelet[2937]: I0912 17:57:08.717109 2937 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 12 17:57:08.721688 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-1f85e6d31b518d96461deb3bc4089e8dcbb6506e3e65a52c09df7a35411e34fb-rootfs.mount: Deactivated successfully. Sep 12 17:57:08.784214 systemd[1]: Created slice kubepods-burstable-pod7d59a3f5_a1b8_4eb0_8ce5_1894e40ce7ce.slice - libcontainer container kubepods-burstable-pod7d59a3f5_a1b8_4eb0_8ce5_1894e40ce7ce.slice. Sep 12 17:57:08.789675 systemd[1]: Created slice kubepods-besteffort-pod4c28333a_ba24_4808_ac79_d81485d8d6a4.slice - libcontainer container kubepods-besteffort-pod4c28333a_ba24_4808_ac79_d81485d8d6a4.slice. Sep 12 17:57:08.822883 kubelet[2937]: I0912 17:57:08.822787 2937 status_manager.go:890] "Failed to get status for pod" podUID="7d59a3f5-a1b8-4eb0-8ce5-1894e40ce7ce" pod="kube-system/coredns-668d6bf9bc-bbhqh" err="pods \"coredns-668d6bf9bc-bbhqh\" is forbidden: User \"system:node:localhost\" cannot get resource \"pods\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'localhost' and this object" Sep 12 17:57:08.825716 systemd[1]: Created slice kubepods-burstable-podfe63115b_e1de_47a4_b916_82af8ab911b5.slice - libcontainer container kubepods-burstable-podfe63115b_e1de_47a4_b916_82af8ab911b5.slice. Sep 12 17:57:08.864729 systemd[1]: Created slice kubepods-besteffort-pod6ffb3411_9b86_4776_866a_3e868e7a8ec5.slice - libcontainer container kubepods-besteffort-pod6ffb3411_9b86_4776_866a_3e868e7a8ec5.slice. Sep 12 17:57:08.870930 systemd[1]: Created slice kubepods-besteffort-pod2caee24f_516f_4681_8138_c9df56293371.slice - libcontainer container kubepods-besteffort-pod2caee24f_516f_4681_8138_c9df56293371.slice. Sep 12 17:57:08.875842 systemd[1]: Created slice kubepods-besteffort-pod393fa2be_7e95_4bab_a23d_441d0fb0527b.slice - libcontainer container kubepods-besteffort-pod393fa2be_7e95_4bab_a23d_441d0fb0527b.slice. Sep 12 17:57:08.879677 systemd[1]: Created slice kubepods-besteffort-pod9b00550e_49fd_4785_8727_5879ef59874c.slice - libcontainer container kubepods-besteffort-pod9b00550e_49fd_4785_8727_5879ef59874c.slice. Sep 12 17:57:08.883369 systemd[1]: Created slice kubepods-besteffort-pod990867cb_2d26_443c_9869_c2147978654b.slice - libcontainer container kubepods-besteffort-pod990867cb_2d26_443c_9869_c2147978654b.slice. 
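The exited_at fields in the containerd TaskExit events are plain Unix epoch timestamps: seconds:1757699828 in the entry above corresponds to 2025-09-12T17:57:08Z, which lines up with the surrounding journal timestamps (just as seconds:1757699820 in the earlier flexvol-driver exit corresponds to 17:57:00Z). A tiny Go illustration of decoding such a value (a standalone example, not containerd code):

package main

import (
	"fmt"
	"time"
)

func main() {
	// seconds/nanos copied from the TaskExit event above.
	exitedAt := time.Unix(1757699828, 649897824).UTC()
	fmt.Println(exitedAt.Format(time.RFC3339Nano)) // prints 2025-09-12T17:57:08.649897824Z
}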
Sep 12 17:57:08.890212 kubelet[2937]: I0912 17:57:08.883749 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd4qn\" (UniqueName: \"kubernetes.io/projected/7d59a3f5-a1b8-4eb0-8ce5-1894e40ce7ce-kube-api-access-xd4qn\") pod \"coredns-668d6bf9bc-bbhqh\" (UID: \"7d59a3f5-a1b8-4eb0-8ce5-1894e40ce7ce\") " pod="kube-system/coredns-668d6bf9bc-bbhqh" Sep 12 17:57:08.890212 kubelet[2937]: I0912 17:57:08.883767 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/393fa2be-7e95-4bab-a23d-441d0fb0527b-calico-apiserver-certs\") pod \"calico-apiserver-86cb648869-l49nt\" (UID: \"393fa2be-7e95-4bab-a23d-441d0fb0527b\") " pod="calico-apiserver/calico-apiserver-86cb648869-l49nt" Sep 12 17:57:08.890212 kubelet[2937]: I0912 17:57:08.883780 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2caee24f-516f-4681-8138-c9df56293371-config\") pod \"goldmane-54d579b49d-wp8q7\" (UID: \"2caee24f-516f-4681-8138-c9df56293371\") " pod="calico-system/goldmane-54d579b49d-wp8q7" Sep 12 17:57:08.890212 kubelet[2937]: I0912 17:57:08.883789 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2caee24f-516f-4681-8138-c9df56293371-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-wp8q7\" (UID: \"2caee24f-516f-4681-8138-c9df56293371\") " pod="calico-system/goldmane-54d579b49d-wp8q7" Sep 12 17:57:08.890212 kubelet[2937]: I0912 17:57:08.883799 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mg89\" (UniqueName: \"kubernetes.io/projected/9b00550e-49fd-4785-8727-5879ef59874c-kube-api-access-8mg89\") pod \"whisker-55664b5c97-5hr8h\" (UID: \"9b00550e-49fd-4785-8727-5879ef59874c\") " pod="calico-system/whisker-55664b5c97-5hr8h" Sep 12 17:57:08.890319 kubelet[2937]: I0912 17:57:08.883810 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c28333a-ba24-4808-ac79-d81485d8d6a4-tigera-ca-bundle\") pod \"calico-kube-controllers-65fb48f58b-tnhtz\" (UID: \"4c28333a-ba24-4808-ac79-d81485d8d6a4\") " pod="calico-system/calico-kube-controllers-65fb48f58b-tnhtz" Sep 12 17:57:08.890319 kubelet[2937]: I0912 17:57:08.883820 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlznr\" (UniqueName: \"kubernetes.io/projected/4c28333a-ba24-4808-ac79-d81485d8d6a4-kube-api-access-jlznr\") pod \"calico-kube-controllers-65fb48f58b-tnhtz\" (UID: \"4c28333a-ba24-4808-ac79-d81485d8d6a4\") " pod="calico-system/calico-kube-controllers-65fb48f58b-tnhtz" Sep 12 17:57:08.890319 kubelet[2937]: I0912 17:57:08.883830 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/9b00550e-49fd-4785-8727-5879ef59874c-whisker-backend-key-pair\") pod \"whisker-55664b5c97-5hr8h\" (UID: \"9b00550e-49fd-4785-8727-5879ef59874c\") " pod="calico-system/whisker-55664b5c97-5hr8h" Sep 12 17:57:08.890319 kubelet[2937]: I0912 17:57:08.883841 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkxzm\" 
(UniqueName: \"kubernetes.io/projected/6ffb3411-9b86-4776-866a-3e868e7a8ec5-kube-api-access-kkxzm\") pod \"calico-apiserver-86cb648869-dcmss\" (UID: \"6ffb3411-9b86-4776-866a-3e868e7a8ec5\") " pod="calico-apiserver/calico-apiserver-86cb648869-dcmss" Sep 12 17:57:08.890319 kubelet[2937]: I0912 17:57:08.883852 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fe63115b-e1de-47a4-b916-82af8ab911b5-config-volume\") pod \"coredns-668d6bf9bc-d6bfv\" (UID: \"fe63115b-e1de-47a4-b916-82af8ab911b5\") " pod="kube-system/coredns-668d6bf9bc-d6bfv" Sep 12 17:57:08.890406 kubelet[2937]: I0912 17:57:08.883863 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d59a3f5-a1b8-4eb0-8ce5-1894e40ce7ce-config-volume\") pod \"coredns-668d6bf9bc-bbhqh\" (UID: \"7d59a3f5-a1b8-4eb0-8ce5-1894e40ce7ce\") " pod="kube-system/coredns-668d6bf9bc-bbhqh" Sep 12 17:57:08.890406 kubelet[2937]: I0912 17:57:08.883875 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnvcm\" (UniqueName: \"kubernetes.io/projected/fe63115b-e1de-47a4-b916-82af8ab911b5-kube-api-access-dnvcm\") pod \"coredns-668d6bf9bc-d6bfv\" (UID: \"fe63115b-e1de-47a4-b916-82af8ab911b5\") " pod="kube-system/coredns-668d6bf9bc-d6bfv" Sep 12 17:57:08.890406 kubelet[2937]: I0912 17:57:08.883885 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhjqg\" (UniqueName: \"kubernetes.io/projected/393fa2be-7e95-4bab-a23d-441d0fb0527b-kube-api-access-xhjqg\") pod \"calico-apiserver-86cb648869-l49nt\" (UID: \"393fa2be-7e95-4bab-a23d-441d0fb0527b\") " pod="calico-apiserver/calico-apiserver-86cb648869-l49nt" Sep 12 17:57:08.890406 kubelet[2937]: I0912 17:57:08.883894 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b00550e-49fd-4785-8727-5879ef59874c-whisker-ca-bundle\") pod \"whisker-55664b5c97-5hr8h\" (UID: \"9b00550e-49fd-4785-8727-5879ef59874c\") " pod="calico-system/whisker-55664b5c97-5hr8h" Sep 12 17:57:08.890406 kubelet[2937]: I0912 17:57:08.883905 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/6ffb3411-9b86-4776-866a-3e868e7a8ec5-calico-apiserver-certs\") pod \"calico-apiserver-86cb648869-dcmss\" (UID: \"6ffb3411-9b86-4776-866a-3e868e7a8ec5\") " pod="calico-apiserver/calico-apiserver-86cb648869-dcmss" Sep 12 17:57:08.890492 kubelet[2937]: I0912 17:57:08.883914 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/2caee24f-516f-4681-8138-c9df56293371-goldmane-key-pair\") pod \"goldmane-54d579b49d-wp8q7\" (UID: \"2caee24f-516f-4681-8138-c9df56293371\") " pod="calico-system/goldmane-54d579b49d-wp8q7" Sep 12 17:57:08.890492 kubelet[2937]: I0912 17:57:08.883922 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl7qr\" (UniqueName: \"kubernetes.io/projected/2caee24f-516f-4681-8138-c9df56293371-kube-api-access-zl7qr\") pod \"goldmane-54d579b49d-wp8q7\" (UID: \"2caee24f-516f-4681-8138-c9df56293371\") " 
pod="calico-system/goldmane-54d579b49d-wp8q7" Sep 12 17:57:08.901073 containerd[1635]: time="2025-09-12T17:57:08.901035498Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-wgskv,Uid:990867cb-2d26-443c-9869-c2147978654b,Namespace:calico-system,Attempt:0,}" Sep 12 17:57:09.027456 containerd[1635]: time="2025-09-12T17:57:09.027427293Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 12 17:57:09.173864 containerd[1635]: time="2025-09-12T17:57:09.173722263Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-wp8q7,Uid:2caee24f-516f-4681-8138-c9df56293371,Namespace:calico-system,Attempt:0,}" Sep 12 17:57:09.177937 containerd[1635]: time="2025-09-12T17:57:09.177901558Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-86cb648869-l49nt,Uid:393fa2be-7e95-4bab-a23d-441d0fb0527b,Namespace:calico-apiserver,Attempt:0,}" Sep 12 17:57:09.182484 containerd[1635]: time="2025-09-12T17:57:09.182383759Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-55664b5c97-5hr8h,Uid:9b00550e-49fd-4785-8727-5879ef59874c,Namespace:calico-system,Attempt:0,}" Sep 12 17:57:09.387472 containerd[1635]: time="2025-09-12T17:57:09.387437458Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-bbhqh,Uid:7d59a3f5-a1b8-4eb0-8ce5-1894e40ce7ce,Namespace:kube-system,Attempt:0,}" Sep 12 17:57:09.423300 containerd[1635]: time="2025-09-12T17:57:09.423258187Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-65fb48f58b-tnhtz,Uid:4c28333a-ba24-4808-ac79-d81485d8d6a4,Namespace:calico-system,Attempt:0,}" Sep 12 17:57:09.462651 containerd[1635]: time="2025-09-12T17:57:09.462579815Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-d6bfv,Uid:fe63115b-e1de-47a4-b916-82af8ab911b5,Namespace:kube-system,Attempt:0,}" Sep 12 17:57:09.467677 containerd[1635]: time="2025-09-12T17:57:09.467640898Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-86cb648869-dcmss,Uid:6ffb3411-9b86-4776-866a-3e868e7a8ec5,Namespace:calico-apiserver,Attempt:0,}" Sep 12 17:57:09.851292 containerd[1635]: time="2025-09-12T17:57:09.849148361Z" level=error msg="Failed to destroy network for sandbox \"325d88333f37a9ae65ebf1aa16ff31987c2365cccd8f2093453fc5f2860c19f4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:57:09.850705 systemd[1]: run-netns-cni\x2d282cdd4a\x2d1fa3\x2d56ee\x2d6392\x2d95f944b21895.mount: Deactivated successfully. 
Sep 12 17:57:09.864048 containerd[1635]: time="2025-09-12T17:57:09.854407939Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-bbhqh,Uid:7d59a3f5-a1b8-4eb0-8ce5-1894e40ce7ce,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"325d88333f37a9ae65ebf1aa16ff31987c2365cccd8f2093453fc5f2860c19f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:57:09.882241 kubelet[2937]: E0912 17:57:09.882123 2937 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"325d88333f37a9ae65ebf1aa16ff31987c2365cccd8f2093453fc5f2860c19f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:57:09.885874 kubelet[2937]: E0912 17:57:09.885492 2937 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"325d88333f37a9ae65ebf1aa16ff31987c2365cccd8f2093453fc5f2860c19f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-bbhqh" Sep 12 17:57:09.885874 kubelet[2937]: E0912 17:57:09.885518 2937 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"325d88333f37a9ae65ebf1aa16ff31987c2365cccd8f2093453fc5f2860c19f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-bbhqh" Sep 12 17:57:09.893507 containerd[1635]: time="2025-09-12T17:57:09.893475772Z" level=error msg="Failed to destroy network for sandbox \"c6fe77d5b042ffb97a7f440d133e7b04e0fb2043a7cec82784af38b31a29ac59\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:57:09.896315 systemd[1]: run-netns-cni\x2d2ef16e95\x2d4f13\x2dd885\x2d7618\x2d1e4882189349.mount: Deactivated successfully. 
Sep 12 17:57:09.899953 containerd[1635]: time="2025-09-12T17:57:09.896678607Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-86cb648869-dcmss,Uid:6ffb3411-9b86-4776-866a-3e868e7a8ec5,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c6fe77d5b042ffb97a7f440d133e7b04e0fb2043a7cec82784af38b31a29ac59\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:57:09.899953 containerd[1635]: time="2025-09-12T17:57:09.896809378Z" level=error msg="Failed to destroy network for sandbox \"2321b93092fb245ac165f49f811ca0e074a72f18b272fe4821e32b9c60ee8976\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:57:09.898299 systemd[1]: run-netns-cni\x2d858bfaa7\x2dc057\x2d73fa\x2d4cdb\x2dfe8180da4c8b.mount: Deactivated successfully. Sep 12 17:57:09.900337 kubelet[2937]: E0912 17:57:09.900310 2937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-bbhqh_kube-system(7d59a3f5-a1b8-4eb0-8ce5-1894e40ce7ce)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-bbhqh_kube-system(7d59a3f5-a1b8-4eb0-8ce5-1894e40ce7ce)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"325d88333f37a9ae65ebf1aa16ff31987c2365cccd8f2093453fc5f2860c19f4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-bbhqh" podUID="7d59a3f5-a1b8-4eb0-8ce5-1894e40ce7ce" Sep 12 17:57:09.901550 containerd[1635]: time="2025-09-12T17:57:09.901522273Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-65fb48f58b-tnhtz,Uid:4c28333a-ba24-4808-ac79-d81485d8d6a4,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2321b93092fb245ac165f49f811ca0e074a72f18b272fe4821e32b9c60ee8976\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:57:09.901625 containerd[1635]: time="2025-09-12T17:57:09.901608533Z" level=error msg="Failed to destroy network for sandbox \"84116e663bba110121ea6e4c18274376c9f1f2c2fd0f00dfc6c7d0d5b9978509\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:57:09.902747 systemd[1]: run-netns-cni\x2da57d89d0\x2de734\x2d8fcf\x2d08be\x2d553916abd7af.mount: Deactivated successfully. 
Sep 12 17:57:09.906856 containerd[1635]: time="2025-09-12T17:57:09.906830238Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-wp8q7,Uid:2caee24f-516f-4681-8138-c9df56293371,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"84116e663bba110121ea6e4c18274376c9f1f2c2fd0f00dfc6c7d0d5b9978509\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:57:09.906923 containerd[1635]: time="2025-09-12T17:57:09.906897464Z" level=error msg="Failed to destroy network for sandbox \"69eb8fcb99b7acfc1d709f447f960c07dc22f49377054bde2d8b12787a1eb29d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:57:09.908150 containerd[1635]: time="2025-09-12T17:57:09.907829105Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-86cb648869-l49nt,Uid:393fa2be-7e95-4bab-a23d-441d0fb0527b,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"69eb8fcb99b7acfc1d709f447f960c07dc22f49377054bde2d8b12787a1eb29d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:57:09.908150 containerd[1635]: time="2025-09-12T17:57:09.907921812Z" level=error msg="Failed to destroy network for sandbox \"c58e138354500eb3e55a008da9096852c829718fb7a04de9e4ffaf2d0490940d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:57:09.908671 containerd[1635]: time="2025-09-12T17:57:09.908395840Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-d6bfv,Uid:fe63115b-e1de-47a4-b916-82af8ab911b5,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c58e138354500eb3e55a008da9096852c829718fb7a04de9e4ffaf2d0490940d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:57:09.908671 containerd[1635]: time="2025-09-12T17:57:09.908450574Z" level=error msg="Failed to destroy network for sandbox \"acd443c03cfa5190391b50b2b43dde604965b72716bfc83d951ba74aa4c7e414\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:57:09.909517 containerd[1635]: time="2025-09-12T17:57:09.909479620Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-wgskv,Uid:990867cb-2d26-443c-9869-c2147978654b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"acd443c03cfa5190391b50b2b43dde604965b72716bfc83d951ba74aa4c7e414\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:57:09.909565 containerd[1635]: 
time="2025-09-12T17:57:09.909543667Z" level=error msg="Failed to destroy network for sandbox \"ebc4bed04f9a77233034781fa0a71b7c2e99a70841494989bd7b65a56d55d262\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:57:09.909711 kubelet[2937]: E0912 17:57:09.909685 2937 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"69eb8fcb99b7acfc1d709f447f960c07dc22f49377054bde2d8b12787a1eb29d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:57:09.910247 kubelet[2937]: E0912 17:57:09.909782 2937 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c6fe77d5b042ffb97a7f440d133e7b04e0fb2043a7cec82784af38b31a29ac59\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:57:09.910247 kubelet[2937]: E0912 17:57:09.909961 2937 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c6fe77d5b042ffb97a7f440d133e7b04e0fb2043a7cec82784af38b31a29ac59\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-86cb648869-dcmss" Sep 12 17:57:09.910247 kubelet[2937]: E0912 17:57:09.909979 2937 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c6fe77d5b042ffb97a7f440d133e7b04e0fb2043a7cec82784af38b31a29ac59\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-86cb648869-dcmss" Sep 12 17:57:09.910247 kubelet[2937]: E0912 17:57:09.910093 2937 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"69eb8fcb99b7acfc1d709f447f960c07dc22f49377054bde2d8b12787a1eb29d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-86cb648869-l49nt" Sep 12 17:57:09.910358 kubelet[2937]: E0912 17:57:09.910106 2937 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"69eb8fcb99b7acfc1d709f447f960c07dc22f49377054bde2d8b12787a1eb29d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-86cb648869-l49nt" Sep 12 17:57:09.910358 kubelet[2937]: E0912 17:57:09.910131 2937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-86cb648869-l49nt_calico-apiserver(393fa2be-7e95-4bab-a23d-441d0fb0527b)\" with CreatePodSandboxError: \"Failed to create 
sandbox for pod \\\"calico-apiserver-86cb648869-l49nt_calico-apiserver(393fa2be-7e95-4bab-a23d-441d0fb0527b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"69eb8fcb99b7acfc1d709f447f960c07dc22f49377054bde2d8b12787a1eb29d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-86cb648869-l49nt" podUID="393fa2be-7e95-4bab-a23d-441d0fb0527b" Sep 12 17:57:09.910358 kubelet[2937]: E0912 17:57:09.909803 2937 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2321b93092fb245ac165f49f811ca0e074a72f18b272fe4821e32b9c60ee8976\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:57:09.910452 kubelet[2937]: E0912 17:57:09.910157 2937 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2321b93092fb245ac165f49f811ca0e074a72f18b272fe4821e32b9c60ee8976\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-65fb48f58b-tnhtz" Sep 12 17:57:09.910452 kubelet[2937]: E0912 17:57:09.910165 2937 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2321b93092fb245ac165f49f811ca0e074a72f18b272fe4821e32b9c60ee8976\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-65fb48f58b-tnhtz" Sep 12 17:57:09.910452 kubelet[2937]: E0912 17:57:09.910182 2937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-65fb48f58b-tnhtz_calico-system(4c28333a-ba24-4808-ac79-d81485d8d6a4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-65fb48f58b-tnhtz_calico-system(4c28333a-ba24-4808-ac79-d81485d8d6a4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2321b93092fb245ac165f49f811ca0e074a72f18b272fe4821e32b9c60ee8976\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-65fb48f58b-tnhtz" podUID="4c28333a-ba24-4808-ac79-d81485d8d6a4" Sep 12 17:57:09.910525 kubelet[2937]: E0912 17:57:09.909812 2937 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"84116e663bba110121ea6e4c18274376c9f1f2c2fd0f00dfc6c7d0d5b9978509\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:57:09.910525 kubelet[2937]: E0912 17:57:09.910208 2937 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"84116e663bba110121ea6e4c18274376c9f1f2c2fd0f00dfc6c7d0d5b9978509\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-wp8q7" Sep 12 17:57:09.910525 kubelet[2937]: E0912 17:57:09.910218 2937 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"84116e663bba110121ea6e4c18274376c9f1f2c2fd0f00dfc6c7d0d5b9978509\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-wp8q7" Sep 12 17:57:09.910579 kubelet[2937]: E0912 17:57:09.910231 2937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-wp8q7_calico-system(2caee24f-516f-4681-8138-c9df56293371)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-wp8q7_calico-system(2caee24f-516f-4681-8138-c9df56293371)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"84116e663bba110121ea6e4c18274376c9f1f2c2fd0f00dfc6c7d0d5b9978509\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-wp8q7" podUID="2caee24f-516f-4681-8138-c9df56293371" Sep 12 17:57:09.911067 kubelet[2937]: E0912 17:57:09.911050 2937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-86cb648869-dcmss_calico-apiserver(6ffb3411-9b86-4776-866a-3e868e7a8ec5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-86cb648869-dcmss_calico-apiserver(6ffb3411-9b86-4776-866a-3e868e7a8ec5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c6fe77d5b042ffb97a7f440d133e7b04e0fb2043a7cec82784af38b31a29ac59\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-86cb648869-dcmss" podUID="6ffb3411-9b86-4776-866a-3e868e7a8ec5" Sep 12 17:57:09.911338 kubelet[2937]: E0912 17:57:09.911146 2937 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c58e138354500eb3e55a008da9096852c829718fb7a04de9e4ffaf2d0490940d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:57:09.911338 kubelet[2937]: E0912 17:57:09.911169 2937 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c58e138354500eb3e55a008da9096852c829718fb7a04de9e4ffaf2d0490940d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-d6bfv" Sep 12 17:57:09.911338 kubelet[2937]: E0912 17:57:09.911184 2937 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network 
for sandbox \"c58e138354500eb3e55a008da9096852c829718fb7a04de9e4ffaf2d0490940d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-d6bfv" Sep 12 17:57:09.911429 kubelet[2937]: E0912 17:57:09.911202 2937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-d6bfv_kube-system(fe63115b-e1de-47a4-b916-82af8ab911b5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-d6bfv_kube-system(fe63115b-e1de-47a4-b916-82af8ab911b5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c58e138354500eb3e55a008da9096852c829718fb7a04de9e4ffaf2d0490940d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-d6bfv" podUID="fe63115b-e1de-47a4-b916-82af8ab911b5" Sep 12 17:57:09.911429 kubelet[2937]: E0912 17:57:09.911225 2937 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"acd443c03cfa5190391b50b2b43dde604965b72716bfc83d951ba74aa4c7e414\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:57:09.911429 kubelet[2937]: E0912 17:57:09.911235 2937 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"acd443c03cfa5190391b50b2b43dde604965b72716bfc83d951ba74aa4c7e414\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-wgskv" Sep 12 17:57:09.911499 kubelet[2937]: E0912 17:57:09.911243 2937 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"acd443c03cfa5190391b50b2b43dde604965b72716bfc83d951ba74aa4c7e414\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-wgskv" Sep 12 17:57:09.912634 containerd[1635]: time="2025-09-12T17:57:09.912611219Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-55664b5c97-5hr8h,Uid:9b00550e-49fd-4785-8727-5879ef59874c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ebc4bed04f9a77233034781fa0a71b7c2e99a70841494989bd7b65a56d55d262\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:57:09.917907 kubelet[2937]: E0912 17:57:09.911259 2937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-wgskv_calico-system(990867cb-2d26-443c-9869-c2147978654b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-wgskv_calico-system(990867cb-2d26-443c-9869-c2147978654b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"acd443c03cfa5190391b50b2b43dde604965b72716bfc83d951ba74aa4c7e414\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-wgskv" podUID="990867cb-2d26-443c-9869-c2147978654b" Sep 12 17:57:09.919023 kubelet[2937]: E0912 17:57:09.918947 2937 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ebc4bed04f9a77233034781fa0a71b7c2e99a70841494989bd7b65a56d55d262\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:57:09.919023 kubelet[2937]: E0912 17:57:09.918975 2937 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ebc4bed04f9a77233034781fa0a71b7c2e99a70841494989bd7b65a56d55d262\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-55664b5c97-5hr8h" Sep 12 17:57:09.919023 kubelet[2937]: E0912 17:57:09.918986 2937 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ebc4bed04f9a77233034781fa0a71b7c2e99a70841494989bd7b65a56d55d262\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-55664b5c97-5hr8h" Sep 12 17:57:09.919152 kubelet[2937]: E0912 17:57:09.919129 2937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-55664b5c97-5hr8h_calico-system(9b00550e-49fd-4785-8727-5879ef59874c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-55664b5c97-5hr8h_calico-system(9b00550e-49fd-4785-8727-5879ef59874c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ebc4bed04f9a77233034781fa0a71b7c2e99a70841494989bd7b65a56d55d262\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-55664b5c97-5hr8h" podUID="9b00550e-49fd-4785-8727-5879ef59874c" Sep 12 17:57:10.721559 systemd[1]: run-netns-cni\x2d34783fb2\x2d253e\x2d6210\x2d303a\x2dd3aea9e2bb50.mount: Deactivated successfully. Sep 12 17:57:10.721614 systemd[1]: run-netns-cni\x2ddfeb9f31\x2d1e7b\x2da2ee\x2dbe5a\x2d30973972ea31.mount: Deactivated successfully. Sep 12 17:57:10.721649 systemd[1]: run-netns-cni\x2d8ad22ab5\x2dc08c\x2d8d13\x2d829f\x2d862589834b13.mount: Deactivated successfully. Sep 12 17:57:10.721686 systemd[1]: run-netns-cni\x2dbb3b6460\x2d92a4\x2d06c7\x2d21fa\x2dd7bcd9a62506.mount: Deactivated successfully. Sep 12 17:57:14.243562 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4135830820.mount: Deactivated successfully. 
Sep 12 17:57:15.197069 containerd[1635]: time="2025-09-12T17:57:15.165710816Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:57:15.233447 containerd[1635]: time="2025-09-12T17:57:15.233318782Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 12 17:57:15.258038 containerd[1635]: time="2025-09-12T17:57:15.257989148Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:57:15.286474 containerd[1635]: time="2025-09-12T17:57:15.286418673Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:57:15.290598 containerd[1635]: time="2025-09-12T17:57:15.290574397Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 6.261909773s" Sep 12 17:57:15.290743 containerd[1635]: time="2025-09-12T17:57:15.290599230Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 12 17:57:15.377548 containerd[1635]: time="2025-09-12T17:57:15.377515791Z" level=info msg="CreateContainer within sandbox \"8820dae51513f73306e1ce146bd602db24631d39263e270d0375da015de608a5\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 12 17:57:15.423521 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1654886536.mount: Deactivated successfully. Sep 12 17:57:15.425620 containerd[1635]: time="2025-09-12T17:57:15.424060365Z" level=info msg="Container 7601de731eeb035c4b0ab0315bbcac4ac5b39953250c40f92e5eaaa269cf1fda: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:57:15.513634 containerd[1635]: time="2025-09-12T17:57:15.513343514Z" level=info msg="CreateContainer within sandbox \"8820dae51513f73306e1ce146bd602db24631d39263e270d0375da015de608a5\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"7601de731eeb035c4b0ab0315bbcac4ac5b39953250c40f92e5eaaa269cf1fda\"" Sep 12 17:57:15.514274 containerd[1635]: time="2025-09-12T17:57:15.514257085Z" level=info msg="StartContainer for \"7601de731eeb035c4b0ab0315bbcac4ac5b39953250c40f92e5eaaa269cf1fda\"" Sep 12 17:57:15.516813 containerd[1635]: time="2025-09-12T17:57:15.516779046Z" level=info msg="connecting to shim 7601de731eeb035c4b0ab0315bbcac4ac5b39953250c40f92e5eaaa269cf1fda" address="unix:///run/containerd/s/c04841d09263d34082bec4a11233ef6ef28ab428b7c2a788987935338b0d5946" protocol=ttrpc version=3 Sep 12 17:57:15.709159 systemd[1]: Started cri-containerd-7601de731eeb035c4b0ab0315bbcac4ac5b39953250c40f92e5eaaa269cf1fda.scope - libcontainer container 7601de731eeb035c4b0ab0315bbcac4ac5b39953250c40f92e5eaaa269cf1fda. Sep 12 17:57:15.761627 containerd[1635]: time="2025-09-12T17:57:15.761223456Z" level=info msg="StartContainer for \"7601de731eeb035c4b0ab0315bbcac4ac5b39953250c40f92e5eaaa269cf1fda\" returns successfully" Sep 12 17:57:15.941620 kernel: wireguard: WireGuard 1.0.0 loaded. 
See www.wireguard.com for information. Sep 12 17:57:15.943672 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Sep 12 17:57:16.156198 kubelet[2937]: I0912 17:57:16.155072 2937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-m5kxk" podStartSLOduration=2.098423457 podStartE2EDuration="22.155061181s" podCreationTimestamp="2025-09-12 17:56:54 +0000 UTC" firstStartedPulling="2025-09-12 17:56:55.236086507 +0000 UTC m=+17.507220853" lastFinishedPulling="2025-09-12 17:57:15.292724235 +0000 UTC m=+37.563858577" observedRunningTime="2025-09-12 17:57:16.15491012 +0000 UTC m=+38.426044474" watchObservedRunningTime="2025-09-12 17:57:16.155061181 +0000 UTC m=+38.426195533" Sep 12 17:57:16.272846 containerd[1635]: time="2025-09-12T17:57:16.272658310Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7601de731eeb035c4b0ab0315bbcac4ac5b39953250c40f92e5eaaa269cf1fda\" id:\"b05842f8cb1badb78f875bbc3125728f55b63b33f380954b02c834c4fba73104\" pid:4018 exit_status:1 exited_at:{seconds:1757699836 nanos:272310994}" Sep 12 17:57:16.549075 kubelet[2937]: I0912 17:57:16.548939 2937 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mg89\" (UniqueName: \"kubernetes.io/projected/9b00550e-49fd-4785-8727-5879ef59874c-kube-api-access-8mg89\") pod \"9b00550e-49fd-4785-8727-5879ef59874c\" (UID: \"9b00550e-49fd-4785-8727-5879ef59874c\") " Sep 12 17:57:16.549075 kubelet[2937]: I0912 17:57:16.548970 2937 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/9b00550e-49fd-4785-8727-5879ef59874c-whisker-backend-key-pair\") pod \"9b00550e-49fd-4785-8727-5879ef59874c\" (UID: \"9b00550e-49fd-4785-8727-5879ef59874c\") " Sep 12 17:57:16.549543 kubelet[2937]: I0912 17:57:16.549534 2937 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b00550e-49fd-4785-8727-5879ef59874c-whisker-ca-bundle\") pod \"9b00550e-49fd-4785-8727-5879ef59874c\" (UID: \"9b00550e-49fd-4785-8727-5879ef59874c\") " Sep 12 17:57:16.549773 kubelet[2937]: I0912 17:57:16.549762 2937 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b00550e-49fd-4785-8727-5879ef59874c-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "9b00550e-49fd-4785-8727-5879ef59874c" (UID: "9b00550e-49fd-4785-8727-5879ef59874c"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 12 17:57:16.561456 systemd[1]: var-lib-kubelet-pods-9b00550e\x2d49fd\x2d4785\x2d8727\x2d5879ef59874c-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 12 17:57:16.561712 systemd[1]: var-lib-kubelet-pods-9b00550e\x2d49fd\x2d4785\x2d8727\x2d5879ef59874c-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d8mg89.mount: Deactivated successfully. Sep 12 17:57:16.562916 kubelet[2937]: I0912 17:57:16.562879 2937 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b00550e-49fd-4785-8727-5879ef59874c-kube-api-access-8mg89" (OuterVolumeSpecName: "kube-api-access-8mg89") pod "9b00550e-49fd-4785-8727-5879ef59874c" (UID: "9b00550e-49fd-4785-8727-5879ef59874c"). InnerVolumeSpecName "kube-api-access-8mg89". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 12 17:57:16.563016 kubelet[2937]: I0912 17:57:16.562996 2937 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b00550e-49fd-4785-8727-5879ef59874c-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "9b00550e-49fd-4785-8727-5879ef59874c" (UID: "9b00550e-49fd-4785-8727-5879ef59874c"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 12 17:57:16.650232 kubelet[2937]: I0912 17:57:16.650191 2937 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8mg89\" (UniqueName: \"kubernetes.io/projected/9b00550e-49fd-4785-8727-5879ef59874c-kube-api-access-8mg89\") on node \"localhost\" DevicePath \"\"" Sep 12 17:57:16.650232 kubelet[2937]: I0912 17:57:16.650210 2937 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/9b00550e-49fd-4785-8727-5879ef59874c-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Sep 12 17:57:16.650232 kubelet[2937]: I0912 17:57:16.650217 2937 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b00550e-49fd-4785-8727-5879ef59874c-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Sep 12 17:57:17.080269 systemd[1]: Removed slice kubepods-besteffort-pod9b00550e_49fd_4785_8727_5879ef59874c.slice - libcontainer container kubepods-besteffort-pod9b00550e_49fd_4785_8727_5879ef59874c.slice. Sep 12 17:57:17.142639 systemd[1]: Created slice kubepods-besteffort-pod87b9cb46_9eee_42ae_b6d3_d92bdbec68ab.slice - libcontainer container kubepods-besteffort-pod87b9cb46_9eee_42ae_b6d3_d92bdbec68ab.slice. 
Sep 12 17:57:17.175494 containerd[1635]: time="2025-09-12T17:57:17.175391363Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7601de731eeb035c4b0ab0315bbcac4ac5b39953250c40f92e5eaaa269cf1fda\" id:\"1754f28b62c3260eb9083e1593817ae3c4346b254a5178c6360ae883a791a170\" pid:4055 exit_status:1 exited_at:{seconds:1757699837 nanos:171091990}" Sep 12 17:57:17.254484 kubelet[2937]: I0912 17:57:17.254240 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f5sw\" (UniqueName: \"kubernetes.io/projected/87b9cb46-9eee-42ae-b6d3-d92bdbec68ab-kube-api-access-7f5sw\") pod \"whisker-7558f9cdf6-hpjpl\" (UID: \"87b9cb46-9eee-42ae-b6d3-d92bdbec68ab\") " pod="calico-system/whisker-7558f9cdf6-hpjpl" Sep 12 17:57:17.254484 kubelet[2937]: I0912 17:57:17.254271 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87b9cb46-9eee-42ae-b6d3-d92bdbec68ab-whisker-ca-bundle\") pod \"whisker-7558f9cdf6-hpjpl\" (UID: \"87b9cb46-9eee-42ae-b6d3-d92bdbec68ab\") " pod="calico-system/whisker-7558f9cdf6-hpjpl" Sep 12 17:57:17.254484 kubelet[2937]: I0912 17:57:17.254303 2937 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/87b9cb46-9eee-42ae-b6d3-d92bdbec68ab-whisker-backend-key-pair\") pod \"whisker-7558f9cdf6-hpjpl\" (UID: \"87b9cb46-9eee-42ae-b6d3-d92bdbec68ab\") " pod="calico-system/whisker-7558f9cdf6-hpjpl" Sep 12 17:57:17.447087 containerd[1635]: time="2025-09-12T17:57:17.446437770Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7558f9cdf6-hpjpl,Uid:87b9cb46-9eee-42ae-b6d3-d92bdbec68ab,Namespace:calico-system,Attempt:0,}" Sep 12 17:57:17.875471 kubelet[2937]: I0912 17:57:17.875346 2937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b00550e-49fd-4785-8727-5879ef59874c" path="/var/lib/kubelet/pods/9b00550e-49fd-4785-8727-5879ef59874c/volumes" Sep 12 17:57:17.909614 systemd-networkd[1522]: vxlan.calico: Link UP Sep 12 17:57:17.909619 systemd-networkd[1522]: vxlan.calico: Gained carrier Sep 12 17:57:18.038033 systemd-networkd[1522]: cali7209bc223c3: Link UP Sep 12 17:57:18.039533 systemd-networkd[1522]: cali7209bc223c3: Gained carrier Sep 12 17:57:18.056759 containerd[1635]: 2025-09-12 17:57:17.481 [INFO][4089] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 17:57:18.056759 containerd[1635]: 2025-09-12 17:57:17.552 [INFO][4089] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--7558f9cdf6--hpjpl-eth0 whisker-7558f9cdf6- calico-system 87b9cb46-9eee-42ae-b6d3-d92bdbec68ab 867 0 2025-09-12 17:57:17 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:7558f9cdf6 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-7558f9cdf6-hpjpl eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali7209bc223c3 [] [] }} ContainerID="c9f437774323192751b789324ddf0d712051047dd348f376c4c452e882febb2f" Namespace="calico-system" Pod="whisker-7558f9cdf6-hpjpl" WorkloadEndpoint="localhost-k8s-whisker--7558f9cdf6--hpjpl-" Sep 12 17:57:18.056759 containerd[1635]: 2025-09-12 17:57:17.552 [INFO][4089] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="c9f437774323192751b789324ddf0d712051047dd348f376c4c452e882febb2f" Namespace="calico-system" Pod="whisker-7558f9cdf6-hpjpl" WorkloadEndpoint="localhost-k8s-whisker--7558f9cdf6--hpjpl-eth0" Sep 12 17:57:18.056759 containerd[1635]: 2025-09-12 17:57:17.932 [INFO][4166] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c9f437774323192751b789324ddf0d712051047dd348f376c4c452e882febb2f" HandleID="k8s-pod-network.c9f437774323192751b789324ddf0d712051047dd348f376c4c452e882febb2f" Workload="localhost-k8s-whisker--7558f9cdf6--hpjpl-eth0" Sep 12 17:57:18.058578 containerd[1635]: 2025-09-12 17:57:17.938 [INFO][4166] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c9f437774323192751b789324ddf0d712051047dd348f376c4c452e882febb2f" HandleID="k8s-pod-network.c9f437774323192751b789324ddf0d712051047dd348f376c4c452e882febb2f" Workload="localhost-k8s-whisker--7558f9cdf6--hpjpl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000604160), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-7558f9cdf6-hpjpl", "timestamp":"2025-09-12 17:57:17.932712223 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:57:18.058578 containerd[1635]: 2025-09-12 17:57:17.938 [INFO][4166] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:57:18.058578 containerd[1635]: 2025-09-12 17:57:17.939 [INFO][4166] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:57:18.058578 containerd[1635]: 2025-09-12 17:57:17.939 [INFO][4166] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 17:57:18.058578 containerd[1635]: 2025-09-12 17:57:17.977 [INFO][4166] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c9f437774323192751b789324ddf0d712051047dd348f376c4c452e882febb2f" host="localhost" Sep 12 17:57:18.058578 containerd[1635]: 2025-09-12 17:57:17.994 [INFO][4166] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 17:57:18.058578 containerd[1635]: 2025-09-12 17:57:17.999 [INFO][4166] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 17:57:18.058578 containerd[1635]: 2025-09-12 17:57:18.001 [INFO][4166] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 17:57:18.058578 containerd[1635]: 2025-09-12 17:57:18.004 [INFO][4166] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 17:57:18.058578 containerd[1635]: 2025-09-12 17:57:18.004 [INFO][4166] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.c9f437774323192751b789324ddf0d712051047dd348f376c4c452e882febb2f" host="localhost" Sep 12 17:57:18.059289 containerd[1635]: 2025-09-12 17:57:18.007 [INFO][4166] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c9f437774323192751b789324ddf0d712051047dd348f376c4c452e882febb2f Sep 12 17:57:18.059289 containerd[1635]: 2025-09-12 17:57:18.012 [INFO][4166] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.c9f437774323192751b789324ddf0d712051047dd348f376c4c452e882febb2f" host="localhost" Sep 12 17:57:18.059289 containerd[1635]: 2025-09-12 17:57:18.018 [INFO][4166] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.c9f437774323192751b789324ddf0d712051047dd348f376c4c452e882febb2f" host="localhost" Sep 12 17:57:18.059289 containerd[1635]: 2025-09-12 17:57:18.018 [INFO][4166] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.c9f437774323192751b789324ddf0d712051047dd348f376c4c452e882febb2f" host="localhost" Sep 12 17:57:18.059289 containerd[1635]: 2025-09-12 17:57:18.018 [INFO][4166] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:57:18.059289 containerd[1635]: 2025-09-12 17:57:18.018 [INFO][4166] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="c9f437774323192751b789324ddf0d712051047dd348f376c4c452e882febb2f" HandleID="k8s-pod-network.c9f437774323192751b789324ddf0d712051047dd348f376c4c452e882febb2f" Workload="localhost-k8s-whisker--7558f9cdf6--hpjpl-eth0" Sep 12 17:57:18.059399 containerd[1635]: 2025-09-12 17:57:18.025 [INFO][4089] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c9f437774323192751b789324ddf0d712051047dd348f376c4c452e882febb2f" Namespace="calico-system" Pod="whisker-7558f9cdf6-hpjpl" WorkloadEndpoint="localhost-k8s-whisker--7558f9cdf6--hpjpl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--7558f9cdf6--hpjpl-eth0", GenerateName:"whisker-7558f9cdf6-", Namespace:"calico-system", SelfLink:"", UID:"87b9cb46-9eee-42ae-b6d3-d92bdbec68ab", ResourceVersion:"867", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 57, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7558f9cdf6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-7558f9cdf6-hpjpl", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali7209bc223c3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:57:18.059399 containerd[1635]: 2025-09-12 17:57:18.025 [INFO][4089] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="c9f437774323192751b789324ddf0d712051047dd348f376c4c452e882febb2f" Namespace="calico-system" Pod="whisker-7558f9cdf6-hpjpl" WorkloadEndpoint="localhost-k8s-whisker--7558f9cdf6--hpjpl-eth0" Sep 12 17:57:18.059484 containerd[1635]: 2025-09-12 17:57:18.025 [INFO][4089] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7209bc223c3 ContainerID="c9f437774323192751b789324ddf0d712051047dd348f376c4c452e882febb2f" Namespace="calico-system" Pod="whisker-7558f9cdf6-hpjpl" WorkloadEndpoint="localhost-k8s-whisker--7558f9cdf6--hpjpl-eth0" Sep 12 17:57:18.059484 containerd[1635]: 2025-09-12 17:57:18.042 [INFO][4089] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="c9f437774323192751b789324ddf0d712051047dd348f376c4c452e882febb2f" Namespace="calico-system" Pod="whisker-7558f9cdf6-hpjpl" WorkloadEndpoint="localhost-k8s-whisker--7558f9cdf6--hpjpl-eth0" Sep 12 17:57:18.059634 containerd[1635]: 2025-09-12 17:57:18.044 [INFO][4089] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c9f437774323192751b789324ddf0d712051047dd348f376c4c452e882febb2f" Namespace="calico-system" Pod="whisker-7558f9cdf6-hpjpl" WorkloadEndpoint="localhost-k8s-whisker--7558f9cdf6--hpjpl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--7558f9cdf6--hpjpl-eth0", GenerateName:"whisker-7558f9cdf6-", Namespace:"calico-system", SelfLink:"", UID:"87b9cb46-9eee-42ae-b6d3-d92bdbec68ab", ResourceVersion:"867", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 57, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7558f9cdf6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"c9f437774323192751b789324ddf0d712051047dd348f376c4c452e882febb2f", Pod:"whisker-7558f9cdf6-hpjpl", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali7209bc223c3", MAC:"5e:c0:a8:80:7c:cc", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:57:18.060083 containerd[1635]: 2025-09-12 17:57:18.052 [INFO][4089] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c9f437774323192751b789324ddf0d712051047dd348f376c4c452e882febb2f" Namespace="calico-system" Pod="whisker-7558f9cdf6-hpjpl" WorkloadEndpoint="localhost-k8s-whisker--7558f9cdf6--hpjpl-eth0" Sep 12 17:57:18.150571 containerd[1635]: time="2025-09-12T17:57:18.150432451Z" level=info msg="connecting to shim c9f437774323192751b789324ddf0d712051047dd348f376c4c452e882febb2f" address="unix:///run/containerd/s/b2f6c2b6aeadfcb8f3e07f2b1a1c9d777e6efd1b9a84d8f34dfb5fd39aea9eb5" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:57:18.174207 systemd[1]: Started cri-containerd-c9f437774323192751b789324ddf0d712051047dd348f376c4c452e882febb2f.scope - libcontainer container c9f437774323192751b789324ddf0d712051047dd348f376c4c452e882febb2f. 
Sep 12 17:57:18.198818 systemd-resolved[1523]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 17:57:18.259794 containerd[1635]: time="2025-09-12T17:57:18.259700146Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7558f9cdf6-hpjpl,Uid:87b9cb46-9eee-42ae-b6d3-d92bdbec68ab,Namespace:calico-system,Attempt:0,} returns sandbox id \"c9f437774323192751b789324ddf0d712051047dd348f376c4c452e882febb2f\"" Sep 12 17:57:18.276657 containerd[1635]: time="2025-09-12T17:57:18.276557896Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 12 17:57:19.019148 systemd-networkd[1522]: vxlan.calico: Gained IPv6LL Sep 12 17:57:19.676524 containerd[1635]: time="2025-09-12T17:57:19.676490281Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:57:19.676981 containerd[1635]: time="2025-09-12T17:57:19.676966179Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 12 17:57:19.677545 containerd[1635]: time="2025-09-12T17:57:19.677528104Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:57:19.678746 containerd[1635]: time="2025-09-12T17:57:19.678723762Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:57:19.679291 containerd[1635]: time="2025-09-12T17:57:19.679087665Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 1.402307396s" Sep 12 17:57:19.679316 containerd[1635]: time="2025-09-12T17:57:19.679294333Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 12 17:57:19.686275 containerd[1635]: time="2025-09-12T17:57:19.686253803Z" level=info msg="CreateContainer within sandbox \"c9f437774323192751b789324ddf0d712051047dd348f376c4c452e882febb2f\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 12 17:57:19.690544 containerd[1635]: time="2025-09-12T17:57:19.690488307Z" level=info msg="Container d873099a1fef8d2a16ffcfd6a98d291e8b565e3c714d6949a712bd5bffed9f39: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:57:19.692171 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount630011185.mount: Deactivated successfully. 
Sep 12 17:57:19.694883 containerd[1635]: time="2025-09-12T17:57:19.694827783Z" level=info msg="CreateContainer within sandbox \"c9f437774323192751b789324ddf0d712051047dd348f376c4c452e882febb2f\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"d873099a1fef8d2a16ffcfd6a98d291e8b565e3c714d6949a712bd5bffed9f39\"" Sep 12 17:57:19.695214 containerd[1635]: time="2025-09-12T17:57:19.695203784Z" level=info msg="StartContainer for \"d873099a1fef8d2a16ffcfd6a98d291e8b565e3c714d6949a712bd5bffed9f39\"" Sep 12 17:57:19.696813 containerd[1635]: time="2025-09-12T17:57:19.696792809Z" level=info msg="connecting to shim d873099a1fef8d2a16ffcfd6a98d291e8b565e3c714d6949a712bd5bffed9f39" address="unix:///run/containerd/s/b2f6c2b6aeadfcb8f3e07f2b1a1c9d777e6efd1b9a84d8f34dfb5fd39aea9eb5" protocol=ttrpc version=3 Sep 12 17:57:19.716098 systemd[1]: Started cri-containerd-d873099a1fef8d2a16ffcfd6a98d291e8b565e3c714d6949a712bd5bffed9f39.scope - libcontainer container d873099a1fef8d2a16ffcfd6a98d291e8b565e3c714d6949a712bd5bffed9f39. Sep 12 17:57:19.748189 containerd[1635]: time="2025-09-12T17:57:19.748120797Z" level=info msg="StartContainer for \"d873099a1fef8d2a16ffcfd6a98d291e8b565e3c714d6949a712bd5bffed9f39\" returns successfully" Sep 12 17:57:19.749323 containerd[1635]: time="2025-09-12T17:57:19.749311894Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 12 17:57:19.979139 systemd-networkd[1522]: cali7209bc223c3: Gained IPv6LL Sep 12 17:57:20.871403 containerd[1635]: time="2025-09-12T17:57:20.871214948Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-bbhqh,Uid:7d59a3f5-a1b8-4eb0-8ce5-1894e40ce7ce,Namespace:kube-system,Attempt:0,}" Sep 12 17:57:20.873641 containerd[1635]: time="2025-09-12T17:57:20.873622418Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-wp8q7,Uid:2caee24f-516f-4681-8138-c9df56293371,Namespace:calico-system,Attempt:0,}" Sep 12 17:57:20.970418 systemd-networkd[1522]: cali88c5647bf46: Link UP Sep 12 17:57:20.970515 systemd-networkd[1522]: cali88c5647bf46: Gained carrier Sep 12 17:57:20.990676 containerd[1635]: 2025-09-12 17:57:20.909 [INFO][4378] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--bbhqh-eth0 coredns-668d6bf9bc- kube-system 7d59a3f5-a1b8-4eb0-8ce5-1894e40ce7ce 803 0 2025-09-12 17:56:44 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-bbhqh eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali88c5647bf46 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="c02a53ec54bdaf9071ae7c8b9b9d1a5ec0f3641d89e0455703d6e071e0710606" Namespace="kube-system" Pod="coredns-668d6bf9bc-bbhqh" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--bbhqh-" Sep 12 17:57:20.990676 containerd[1635]: 2025-09-12 17:57:20.909 [INFO][4378] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c02a53ec54bdaf9071ae7c8b9b9d1a5ec0f3641d89e0455703d6e071e0710606" Namespace="kube-system" Pod="coredns-668d6bf9bc-bbhqh" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--bbhqh-eth0" Sep 12 17:57:20.990676 containerd[1635]: 2025-09-12 17:57:20.931 [INFO][4390] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="c02a53ec54bdaf9071ae7c8b9b9d1a5ec0f3641d89e0455703d6e071e0710606" HandleID="k8s-pod-network.c02a53ec54bdaf9071ae7c8b9b9d1a5ec0f3641d89e0455703d6e071e0710606" Workload="localhost-k8s-coredns--668d6bf9bc--bbhqh-eth0" Sep 12 17:57:20.997285 containerd[1635]: 2025-09-12 17:57:20.931 [INFO][4390] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c02a53ec54bdaf9071ae7c8b9b9d1a5ec0f3641d89e0455703d6e071e0710606" HandleID="k8s-pod-network.c02a53ec54bdaf9071ae7c8b9b9d1a5ec0f3641d89e0455703d6e071e0710606" Workload="localhost-k8s-coredns--668d6bf9bc--bbhqh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5950), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-bbhqh", "timestamp":"2025-09-12 17:57:20.930991717 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:57:20.997285 containerd[1635]: 2025-09-12 17:57:20.931 [INFO][4390] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:57:20.997285 containerd[1635]: 2025-09-12 17:57:20.931 [INFO][4390] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:57:20.997285 containerd[1635]: 2025-09-12 17:57:20.931 [INFO][4390] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 17:57:20.997285 containerd[1635]: 2025-09-12 17:57:20.935 [INFO][4390] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c02a53ec54bdaf9071ae7c8b9b9d1a5ec0f3641d89e0455703d6e071e0710606" host="localhost" Sep 12 17:57:20.997285 containerd[1635]: 2025-09-12 17:57:20.940 [INFO][4390] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 17:57:20.997285 containerd[1635]: 2025-09-12 17:57:20.944 [INFO][4390] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 17:57:20.997285 containerd[1635]: 2025-09-12 17:57:20.945 [INFO][4390] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 17:57:20.997285 containerd[1635]: 2025-09-12 17:57:20.946 [INFO][4390] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 17:57:20.997285 containerd[1635]: 2025-09-12 17:57:20.946 [INFO][4390] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.c02a53ec54bdaf9071ae7c8b9b9d1a5ec0f3641d89e0455703d6e071e0710606" host="localhost" Sep 12 17:57:21.000343 containerd[1635]: 2025-09-12 17:57:20.947 [INFO][4390] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c02a53ec54bdaf9071ae7c8b9b9d1a5ec0f3641d89e0455703d6e071e0710606 Sep 12 17:57:21.000343 containerd[1635]: 2025-09-12 17:57:20.953 [INFO][4390] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.c02a53ec54bdaf9071ae7c8b9b9d1a5ec0f3641d89e0455703d6e071e0710606" host="localhost" Sep 12 17:57:21.000343 containerd[1635]: 2025-09-12 17:57:20.966 [INFO][4390] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.c02a53ec54bdaf9071ae7c8b9b9d1a5ec0f3641d89e0455703d6e071e0710606" host="localhost" Sep 12 17:57:21.000343 containerd[1635]: 2025-09-12 17:57:20.966 [INFO][4390] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] 
handle="k8s-pod-network.c02a53ec54bdaf9071ae7c8b9b9d1a5ec0f3641d89e0455703d6e071e0710606" host="localhost" Sep 12 17:57:21.000343 containerd[1635]: 2025-09-12 17:57:20.966 [INFO][4390] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:57:21.000343 containerd[1635]: 2025-09-12 17:57:20.966 [INFO][4390] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="c02a53ec54bdaf9071ae7c8b9b9d1a5ec0f3641d89e0455703d6e071e0710606" HandleID="k8s-pod-network.c02a53ec54bdaf9071ae7c8b9b9d1a5ec0f3641d89e0455703d6e071e0710606" Workload="localhost-k8s-coredns--668d6bf9bc--bbhqh-eth0" Sep 12 17:57:21.000441 containerd[1635]: 2025-09-12 17:57:20.968 [INFO][4378] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c02a53ec54bdaf9071ae7c8b9b9d1a5ec0f3641d89e0455703d6e071e0710606" Namespace="kube-system" Pod="coredns-668d6bf9bc-bbhqh" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--bbhqh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--bbhqh-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"7d59a3f5-a1b8-4eb0-8ce5-1894e40ce7ce", ResourceVersion:"803", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 56, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-bbhqh", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali88c5647bf46", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:57:21.000504 containerd[1635]: 2025-09-12 17:57:20.968 [INFO][4378] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="c02a53ec54bdaf9071ae7c8b9b9d1a5ec0f3641d89e0455703d6e071e0710606" Namespace="kube-system" Pod="coredns-668d6bf9bc-bbhqh" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--bbhqh-eth0" Sep 12 17:57:21.000504 containerd[1635]: 2025-09-12 17:57:20.968 [INFO][4378] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali88c5647bf46 ContainerID="c02a53ec54bdaf9071ae7c8b9b9d1a5ec0f3641d89e0455703d6e071e0710606" Namespace="kube-system" Pod="coredns-668d6bf9bc-bbhqh" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--bbhqh-eth0" Sep 12 17:57:21.000504 containerd[1635]: 2025-09-12 17:57:20.970 [INFO][4378] cni-plugin/dataplane_linux.go 508: Disabling 
IPv4 forwarding ContainerID="c02a53ec54bdaf9071ae7c8b9b9d1a5ec0f3641d89e0455703d6e071e0710606" Namespace="kube-system" Pod="coredns-668d6bf9bc-bbhqh" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--bbhqh-eth0" Sep 12 17:57:21.009255 containerd[1635]: 2025-09-12 17:57:20.971 [INFO][4378] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c02a53ec54bdaf9071ae7c8b9b9d1a5ec0f3641d89e0455703d6e071e0710606" Namespace="kube-system" Pod="coredns-668d6bf9bc-bbhqh" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--bbhqh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--bbhqh-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"7d59a3f5-a1b8-4eb0-8ce5-1894e40ce7ce", ResourceVersion:"803", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 56, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"c02a53ec54bdaf9071ae7c8b9b9d1a5ec0f3641d89e0455703d6e071e0710606", Pod:"coredns-668d6bf9bc-bbhqh", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali88c5647bf46", MAC:"d6:59:42:6d:b9:e4", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:57:21.009255 containerd[1635]: 2025-09-12 17:57:20.988 [INFO][4378] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c02a53ec54bdaf9071ae7c8b9b9d1a5ec0f3641d89e0455703d6e071e0710606" Namespace="kube-system" Pod="coredns-668d6bf9bc-bbhqh" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--bbhqh-eth0" Sep 12 17:57:21.073196 containerd[1635]: time="2025-09-12T17:57:21.073092670Z" level=info msg="connecting to shim c02a53ec54bdaf9071ae7c8b9b9d1a5ec0f3641d89e0455703d6e071e0710606" address="unix:///run/containerd/s/9bc1f75fe5cc0e07b11810ea0de2385cef31ee410ea86ed1044e7d93105f4d16" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:57:21.088642 systemd-networkd[1522]: cali29ab1fc5892: Link UP Sep 12 17:57:21.090075 systemd-networkd[1522]: cali29ab1fc5892: Gained carrier Sep 12 17:57:21.102190 systemd[1]: Started cri-containerd-c02a53ec54bdaf9071ae7c8b9b9d1a5ec0f3641d89e0455703d6e071e0710606.scope - libcontainer container c02a53ec54bdaf9071ae7c8b9b9d1a5ec0f3641d89e0455703d6e071e0710606. 
Sep 12 17:57:21.114694 containerd[1635]: 2025-09-12 17:57:21.000 [INFO][4397] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--54d579b49d--wp8q7-eth0 goldmane-54d579b49d- calico-system 2caee24f-516f-4681-8138-c9df56293371 799 0 2025-09-12 17:56:54 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-54d579b49d-wp8q7 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali29ab1fc5892 [] [] }} ContainerID="5fa2503c63fffade32615e50feaf4e0fbbdd5b2d61934ef5077efe19f930f393" Namespace="calico-system" Pod="goldmane-54d579b49d-wp8q7" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--wp8q7-" Sep 12 17:57:21.114694 containerd[1635]: 2025-09-12 17:57:21.000 [INFO][4397] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5fa2503c63fffade32615e50feaf4e0fbbdd5b2d61934ef5077efe19f930f393" Namespace="calico-system" Pod="goldmane-54d579b49d-wp8q7" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--wp8q7-eth0" Sep 12 17:57:21.114694 containerd[1635]: 2025-09-12 17:57:21.045 [INFO][4414] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5fa2503c63fffade32615e50feaf4e0fbbdd5b2d61934ef5077efe19f930f393" HandleID="k8s-pod-network.5fa2503c63fffade32615e50feaf4e0fbbdd5b2d61934ef5077efe19f930f393" Workload="localhost-k8s-goldmane--54d579b49d--wp8q7-eth0" Sep 12 17:57:21.114694 containerd[1635]: 2025-09-12 17:57:21.045 [INFO][4414] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5fa2503c63fffade32615e50feaf4e0fbbdd5b2d61934ef5077efe19f930f393" HandleID="k8s-pod-network.5fa2503c63fffade32615e50feaf4e0fbbdd5b2d61934ef5077efe19f930f393" Workload="localhost-k8s-goldmane--54d579b49d--wp8q7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cdb40), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-54d579b49d-wp8q7", "timestamp":"2025-09-12 17:57:21.045467436 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:57:21.114694 containerd[1635]: 2025-09-12 17:57:21.045 [INFO][4414] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:57:21.114694 containerd[1635]: 2025-09-12 17:57:21.045 [INFO][4414] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:57:21.114694 containerd[1635]: 2025-09-12 17:57:21.045 [INFO][4414] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 17:57:21.114694 containerd[1635]: 2025-09-12 17:57:21.050 [INFO][4414] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5fa2503c63fffade32615e50feaf4e0fbbdd5b2d61934ef5077efe19f930f393" host="localhost" Sep 12 17:57:21.114694 containerd[1635]: 2025-09-12 17:57:21.053 [INFO][4414] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 17:57:21.114694 containerd[1635]: 2025-09-12 17:57:21.056 [INFO][4414] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 17:57:21.114694 containerd[1635]: 2025-09-12 17:57:21.057 [INFO][4414] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 17:57:21.114694 containerd[1635]: 2025-09-12 17:57:21.059 [INFO][4414] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 17:57:21.114694 containerd[1635]: 2025-09-12 17:57:21.059 [INFO][4414] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.5fa2503c63fffade32615e50feaf4e0fbbdd5b2d61934ef5077efe19f930f393" host="localhost" Sep 12 17:57:21.114694 containerd[1635]: 2025-09-12 17:57:21.060 [INFO][4414] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5fa2503c63fffade32615e50feaf4e0fbbdd5b2d61934ef5077efe19f930f393 Sep 12 17:57:21.114694 containerd[1635]: 2025-09-12 17:57:21.073 [INFO][4414] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.5fa2503c63fffade32615e50feaf4e0fbbdd5b2d61934ef5077efe19f930f393" host="localhost" Sep 12 17:57:21.114694 containerd[1635]: 2025-09-12 17:57:21.080 [INFO][4414] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.5fa2503c63fffade32615e50feaf4e0fbbdd5b2d61934ef5077efe19f930f393" host="localhost" Sep 12 17:57:21.114694 containerd[1635]: 2025-09-12 17:57:21.080 [INFO][4414] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.5fa2503c63fffade32615e50feaf4e0fbbdd5b2d61934ef5077efe19f930f393" host="localhost" Sep 12 17:57:21.114694 containerd[1635]: 2025-09-12 17:57:21.080 [INFO][4414] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 17:57:21.114694 containerd[1635]: 2025-09-12 17:57:21.080 [INFO][4414] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="5fa2503c63fffade32615e50feaf4e0fbbdd5b2d61934ef5077efe19f930f393" HandleID="k8s-pod-network.5fa2503c63fffade32615e50feaf4e0fbbdd5b2d61934ef5077efe19f930f393" Workload="localhost-k8s-goldmane--54d579b49d--wp8q7-eth0" Sep 12 17:57:21.115345 containerd[1635]: 2025-09-12 17:57:21.083 [INFO][4397] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5fa2503c63fffade32615e50feaf4e0fbbdd5b2d61934ef5077efe19f930f393" Namespace="calico-system" Pod="goldmane-54d579b49d-wp8q7" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--wp8q7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--wp8q7-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"2caee24f-516f-4681-8138-c9df56293371", ResourceVersion:"799", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 56, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-54d579b49d-wp8q7", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali29ab1fc5892", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:57:21.115345 containerd[1635]: 2025-09-12 17:57:21.083 [INFO][4397] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="5fa2503c63fffade32615e50feaf4e0fbbdd5b2d61934ef5077efe19f930f393" Namespace="calico-system" Pod="goldmane-54d579b49d-wp8q7" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--wp8q7-eth0" Sep 12 17:57:21.115345 containerd[1635]: 2025-09-12 17:57:21.083 [INFO][4397] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali29ab1fc5892 ContainerID="5fa2503c63fffade32615e50feaf4e0fbbdd5b2d61934ef5077efe19f930f393" Namespace="calico-system" Pod="goldmane-54d579b49d-wp8q7" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--wp8q7-eth0" Sep 12 17:57:21.115345 containerd[1635]: 2025-09-12 17:57:21.090 [INFO][4397] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5fa2503c63fffade32615e50feaf4e0fbbdd5b2d61934ef5077efe19f930f393" Namespace="calico-system" Pod="goldmane-54d579b49d-wp8q7" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--wp8q7-eth0" Sep 12 17:57:21.115345 containerd[1635]: 2025-09-12 17:57:21.090 [INFO][4397] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5fa2503c63fffade32615e50feaf4e0fbbdd5b2d61934ef5077efe19f930f393" Namespace="calico-system" Pod="goldmane-54d579b49d-wp8q7" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--wp8q7-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--wp8q7-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"2caee24f-516f-4681-8138-c9df56293371", ResourceVersion:"799", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 56, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"5fa2503c63fffade32615e50feaf4e0fbbdd5b2d61934ef5077efe19f930f393", Pod:"goldmane-54d579b49d-wp8q7", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali29ab1fc5892", MAC:"ee:18:fd:d6:8d:73", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:57:21.115345 containerd[1635]: 2025-09-12 17:57:21.111 [INFO][4397] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5fa2503c63fffade32615e50feaf4e0fbbdd5b2d61934ef5077efe19f930f393" Namespace="calico-system" Pod="goldmane-54d579b49d-wp8q7" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--wp8q7-eth0" Sep 12 17:57:21.124706 systemd-resolved[1523]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 17:57:21.188781 containerd[1635]: time="2025-09-12T17:57:21.188719274Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-bbhqh,Uid:7d59a3f5-a1b8-4eb0-8ce5-1894e40ce7ce,Namespace:kube-system,Attempt:0,} returns sandbox id \"c02a53ec54bdaf9071ae7c8b9b9d1a5ec0f3641d89e0455703d6e071e0710606\"" Sep 12 17:57:21.196982 containerd[1635]: time="2025-09-12T17:57:21.196956699Z" level=info msg="CreateContainer within sandbox \"c02a53ec54bdaf9071ae7c8b9b9d1a5ec0f3641d89e0455703d6e071e0710606\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 17:57:21.239686 containerd[1635]: time="2025-09-12T17:57:21.239657184Z" level=info msg="connecting to shim 5fa2503c63fffade32615e50feaf4e0fbbdd5b2d61934ef5077efe19f930f393" address="unix:///run/containerd/s/8b112b8916c56bbcfb890826377a61c719ef0c895a378823f249cc810b90881b" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:57:21.260174 systemd[1]: Started cri-containerd-5fa2503c63fffade32615e50feaf4e0fbbdd5b2d61934ef5077efe19f930f393.scope - libcontainer container 5fa2503c63fffade32615e50feaf4e0fbbdd5b2d61934ef5077efe19f930f393. 
Sep 12 17:57:21.271747 systemd-resolved[1523]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 17:57:21.366623 containerd[1635]: time="2025-09-12T17:57:21.366595411Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-wp8q7,Uid:2caee24f-516f-4681-8138-c9df56293371,Namespace:calico-system,Attempt:0,} returns sandbox id \"5fa2503c63fffade32615e50feaf4e0fbbdd5b2d61934ef5077efe19f930f393\"" Sep 12 17:57:21.496926 containerd[1635]: time="2025-09-12T17:57:21.496846064Z" level=info msg="Container 7dcad22c6ddddeeaccf4cb03621172282a14c91c44bfcbab6708dcc5667753ac: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:57:21.499669 containerd[1635]: time="2025-09-12T17:57:21.499648623Z" level=info msg="CreateContainer within sandbox \"c02a53ec54bdaf9071ae7c8b9b9d1a5ec0f3641d89e0455703d6e071e0710606\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"7dcad22c6ddddeeaccf4cb03621172282a14c91c44bfcbab6708dcc5667753ac\"" Sep 12 17:57:21.501776 containerd[1635]: time="2025-09-12T17:57:21.501714807Z" level=info msg="StartContainer for \"7dcad22c6ddddeeaccf4cb03621172282a14c91c44bfcbab6708dcc5667753ac\"" Sep 12 17:57:21.503353 containerd[1635]: time="2025-09-12T17:57:21.503321144Z" level=info msg="connecting to shim 7dcad22c6ddddeeaccf4cb03621172282a14c91c44bfcbab6708dcc5667753ac" address="unix:///run/containerd/s/9bc1f75fe5cc0e07b11810ea0de2385cef31ee410ea86ed1044e7d93105f4d16" protocol=ttrpc version=3 Sep 12 17:57:21.517123 systemd[1]: Started cri-containerd-7dcad22c6ddddeeaccf4cb03621172282a14c91c44bfcbab6708dcc5667753ac.scope - libcontainer container 7dcad22c6ddddeeaccf4cb03621172282a14c91c44bfcbab6708dcc5667753ac. Sep 12 17:57:21.549174 containerd[1635]: time="2025-09-12T17:57:21.549146614Z" level=info msg="StartContainer for \"7dcad22c6ddddeeaccf4cb03621172282a14c91c44bfcbab6708dcc5667753ac\" returns successfully" Sep 12 17:57:21.870829 containerd[1635]: time="2025-09-12T17:57:21.870627705Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-86cb648869-dcmss,Uid:6ffb3411-9b86-4776-866a-3e868e7a8ec5,Namespace:calico-apiserver,Attempt:0,}" Sep 12 17:57:21.895874 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4151715240.mount: Deactivated successfully. 
Sep 12 17:57:21.937118 containerd[1635]: time="2025-09-12T17:57:21.937065114Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:57:21.949155 containerd[1635]: time="2025-09-12T17:57:21.949124455Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 12 17:57:21.974182 containerd[1635]: time="2025-09-12T17:57:21.974099374Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:57:21.990360 systemd-networkd[1522]: cali79a4a003f4c: Link UP Sep 12 17:57:21.991069 systemd-networkd[1522]: cali79a4a003f4c: Gained carrier Sep 12 17:57:22.007303 containerd[1635]: time="2025-09-12T17:57:22.007151088Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:57:22.013619 containerd[1635]: time="2025-09-12T17:57:22.007997317Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 2.258573674s" Sep 12 17:57:22.013619 containerd[1635]: time="2025-09-12T17:57:22.008043401Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 12 17:57:22.013619 containerd[1635]: time="2025-09-12T17:57:22.010827238Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 12 17:57:22.013619 containerd[1635]: time="2025-09-12T17:57:22.010948029Z" level=info msg="CreateContainer within sandbox \"c9f437774323192751b789324ddf0d712051047dd348f376c4c452e882febb2f\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 12 17:57:22.114943 containerd[1635]: 2025-09-12 17:57:21.910 [INFO][4563] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--86cb648869--dcmss-eth0 calico-apiserver-86cb648869- calico-apiserver 6ffb3411-9b86-4776-866a-3e868e7a8ec5 804 0 2025-09-12 17:56:52 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:86cb648869 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-86cb648869-dcmss eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali79a4a003f4c [] [] }} ContainerID="d9695de0c97dd306ff853ae616844bdb31a2c9e62baf5b94352ad8830932ad2c" Namespace="calico-apiserver" Pod="calico-apiserver-86cb648869-dcmss" WorkloadEndpoint="localhost-k8s-calico--apiserver--86cb648869--dcmss-" Sep 12 17:57:22.114943 containerd[1635]: 2025-09-12 17:57:21.910 [INFO][4563] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d9695de0c97dd306ff853ae616844bdb31a2c9e62baf5b94352ad8830932ad2c" Namespace="calico-apiserver" 
Pod="calico-apiserver-86cb648869-dcmss" WorkloadEndpoint="localhost-k8s-calico--apiserver--86cb648869--dcmss-eth0" Sep 12 17:57:22.114943 containerd[1635]: 2025-09-12 17:57:21.955 [INFO][4581] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d9695de0c97dd306ff853ae616844bdb31a2c9e62baf5b94352ad8830932ad2c" HandleID="k8s-pod-network.d9695de0c97dd306ff853ae616844bdb31a2c9e62baf5b94352ad8830932ad2c" Workload="localhost-k8s-calico--apiserver--86cb648869--dcmss-eth0" Sep 12 17:57:22.114943 containerd[1635]: 2025-09-12 17:57:21.955 [INFO][4581] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d9695de0c97dd306ff853ae616844bdb31a2c9e62baf5b94352ad8830932ad2c" HandleID="k8s-pod-network.d9695de0c97dd306ff853ae616844bdb31a2c9e62baf5b94352ad8830932ad2c" Workload="localhost-k8s-calico--apiserver--86cb648869--dcmss-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cf610), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-86cb648869-dcmss", "timestamp":"2025-09-12 17:57:21.955162517 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:57:22.114943 containerd[1635]: 2025-09-12 17:57:21.955 [INFO][4581] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:57:22.114943 containerd[1635]: 2025-09-12 17:57:21.955 [INFO][4581] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:57:22.114943 containerd[1635]: 2025-09-12 17:57:21.955 [INFO][4581] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 17:57:22.114943 containerd[1635]: 2025-09-12 17:57:21.959 [INFO][4581] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d9695de0c97dd306ff853ae616844bdb31a2c9e62baf5b94352ad8830932ad2c" host="localhost" Sep 12 17:57:22.114943 containerd[1635]: 2025-09-12 17:57:21.964 [INFO][4581] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 17:57:22.114943 containerd[1635]: 2025-09-12 17:57:21.966 [INFO][4581] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 17:57:22.114943 containerd[1635]: 2025-09-12 17:57:21.967 [INFO][4581] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 17:57:22.114943 containerd[1635]: 2025-09-12 17:57:21.969 [INFO][4581] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 17:57:22.114943 containerd[1635]: 2025-09-12 17:57:21.969 [INFO][4581] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.d9695de0c97dd306ff853ae616844bdb31a2c9e62baf5b94352ad8830932ad2c" host="localhost" Sep 12 17:57:22.114943 containerd[1635]: 2025-09-12 17:57:21.971 [INFO][4581] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d9695de0c97dd306ff853ae616844bdb31a2c9e62baf5b94352ad8830932ad2c Sep 12 17:57:22.114943 containerd[1635]: 2025-09-12 17:57:21.975 [INFO][4581] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.d9695de0c97dd306ff853ae616844bdb31a2c9e62baf5b94352ad8830932ad2c" host="localhost" Sep 12 17:57:22.114943 containerd[1635]: 2025-09-12 17:57:21.985 [INFO][4581] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 
handle="k8s-pod-network.d9695de0c97dd306ff853ae616844bdb31a2c9e62baf5b94352ad8830932ad2c" host="localhost" Sep 12 17:57:22.114943 containerd[1635]: 2025-09-12 17:57:21.985 [INFO][4581] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.d9695de0c97dd306ff853ae616844bdb31a2c9e62baf5b94352ad8830932ad2c" host="localhost" Sep 12 17:57:22.114943 containerd[1635]: 2025-09-12 17:57:21.985 [INFO][4581] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:57:22.114943 containerd[1635]: 2025-09-12 17:57:21.985 [INFO][4581] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="d9695de0c97dd306ff853ae616844bdb31a2c9e62baf5b94352ad8830932ad2c" HandleID="k8s-pod-network.d9695de0c97dd306ff853ae616844bdb31a2c9e62baf5b94352ad8830932ad2c" Workload="localhost-k8s-calico--apiserver--86cb648869--dcmss-eth0" Sep 12 17:57:22.130304 containerd[1635]: 2025-09-12 17:57:21.987 [INFO][4563] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d9695de0c97dd306ff853ae616844bdb31a2c9e62baf5b94352ad8830932ad2c" Namespace="calico-apiserver" Pod="calico-apiserver-86cb648869-dcmss" WorkloadEndpoint="localhost-k8s-calico--apiserver--86cb648869--dcmss-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--86cb648869--dcmss-eth0", GenerateName:"calico-apiserver-86cb648869-", Namespace:"calico-apiserver", SelfLink:"", UID:"6ffb3411-9b86-4776-866a-3e868e7a8ec5", ResourceVersion:"804", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 56, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"86cb648869", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-86cb648869-dcmss", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali79a4a003f4c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:57:22.130304 containerd[1635]: 2025-09-12 17:57:21.987 [INFO][4563] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="d9695de0c97dd306ff853ae616844bdb31a2c9e62baf5b94352ad8830932ad2c" Namespace="calico-apiserver" Pod="calico-apiserver-86cb648869-dcmss" WorkloadEndpoint="localhost-k8s-calico--apiserver--86cb648869--dcmss-eth0" Sep 12 17:57:22.130304 containerd[1635]: 2025-09-12 17:57:21.987 [INFO][4563] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali79a4a003f4c ContainerID="d9695de0c97dd306ff853ae616844bdb31a2c9e62baf5b94352ad8830932ad2c" Namespace="calico-apiserver" Pod="calico-apiserver-86cb648869-dcmss" WorkloadEndpoint="localhost-k8s-calico--apiserver--86cb648869--dcmss-eth0" Sep 12 17:57:22.130304 containerd[1635]: 2025-09-12 
17:57:21.991 [INFO][4563] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d9695de0c97dd306ff853ae616844bdb31a2c9e62baf5b94352ad8830932ad2c" Namespace="calico-apiserver" Pod="calico-apiserver-86cb648869-dcmss" WorkloadEndpoint="localhost-k8s-calico--apiserver--86cb648869--dcmss-eth0" Sep 12 17:57:22.130304 containerd[1635]: 2025-09-12 17:57:21.991 [INFO][4563] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d9695de0c97dd306ff853ae616844bdb31a2c9e62baf5b94352ad8830932ad2c" Namespace="calico-apiserver" Pod="calico-apiserver-86cb648869-dcmss" WorkloadEndpoint="localhost-k8s-calico--apiserver--86cb648869--dcmss-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--86cb648869--dcmss-eth0", GenerateName:"calico-apiserver-86cb648869-", Namespace:"calico-apiserver", SelfLink:"", UID:"6ffb3411-9b86-4776-866a-3e868e7a8ec5", ResourceVersion:"804", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 56, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"86cb648869", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"d9695de0c97dd306ff853ae616844bdb31a2c9e62baf5b94352ad8830932ad2c", Pod:"calico-apiserver-86cb648869-dcmss", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali79a4a003f4c", MAC:"da:ff:49:c6:38:e4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:57:22.130304 containerd[1635]: 2025-09-12 17:57:22.112 [INFO][4563] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d9695de0c97dd306ff853ae616844bdb31a2c9e62baf5b94352ad8830932ad2c" Namespace="calico-apiserver" Pod="calico-apiserver-86cb648869-dcmss" WorkloadEndpoint="localhost-k8s-calico--apiserver--86cb648869--dcmss-eth0" Sep 12 17:57:22.130304 containerd[1635]: time="2025-09-12T17:57:22.116507025Z" level=info msg="Container 315ce79620ea370d2a2b071d19f2a49e7975253d053a03b6c03a8c4c0ff2b110: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:57:22.218191 containerd[1635]: time="2025-09-12T17:57:22.218159138Z" level=info msg="CreateContainer within sandbox \"c9f437774323192751b789324ddf0d712051047dd348f376c4c452e882febb2f\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"315ce79620ea370d2a2b071d19f2a49e7975253d053a03b6c03a8c4c0ff2b110\"" Sep 12 17:57:22.226343 containerd[1635]: time="2025-09-12T17:57:22.224941680Z" level=info msg="StartContainer for \"315ce79620ea370d2a2b071d19f2a49e7975253d053a03b6c03a8c4c0ff2b110\"" Sep 12 17:57:22.226343 containerd[1635]: time="2025-09-12T17:57:22.225669429Z" level=info msg="connecting to shim 315ce79620ea370d2a2b071d19f2a49e7975253d053a03b6c03a8c4c0ff2b110" 
address="unix:///run/containerd/s/b2f6c2b6aeadfcb8f3e07f2b1a1c9d777e6efd1b9a84d8f34dfb5fd39aea9eb5" protocol=ttrpc version=3 Sep 12 17:57:22.248582 containerd[1635]: time="2025-09-12T17:57:22.248424014Z" level=info msg="connecting to shim d9695de0c97dd306ff853ae616844bdb31a2c9e62baf5b94352ad8830932ad2c" address="unix:///run/containerd/s/d051d6a96d94d12e2e4b13ce989ea30088f652cf7e467d8367b083299e4b3e82" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:57:22.265031 kubelet[2937]: I0912 17:57:22.262172 2937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-bbhqh" podStartSLOduration=38.256286477 podStartE2EDuration="38.256286477s" podCreationTimestamp="2025-09-12 17:56:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:57:22.240397902 +0000 UTC m=+44.511532249" watchObservedRunningTime="2025-09-12 17:57:22.256286477 +0000 UTC m=+44.527420828" Sep 12 17:57:22.273782 systemd[1]: Started cri-containerd-315ce79620ea370d2a2b071d19f2a49e7975253d053a03b6c03a8c4c0ff2b110.scope - libcontainer container 315ce79620ea370d2a2b071d19f2a49e7975253d053a03b6c03a8c4c0ff2b110. Sep 12 17:57:22.290134 systemd[1]: Started cri-containerd-d9695de0c97dd306ff853ae616844bdb31a2c9e62baf5b94352ad8830932ad2c.scope - libcontainer container d9695de0c97dd306ff853ae616844bdb31a2c9e62baf5b94352ad8830932ad2c. Sep 12 17:57:22.308724 systemd-resolved[1523]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 17:57:22.341544 containerd[1635]: time="2025-09-12T17:57:22.341459513Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-86cb648869-dcmss,Uid:6ffb3411-9b86-4776-866a-3e868e7a8ec5,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"d9695de0c97dd306ff853ae616844bdb31a2c9e62baf5b94352ad8830932ad2c\"" Sep 12 17:57:22.362325 containerd[1635]: time="2025-09-12T17:57:22.362250557Z" level=info msg="StartContainer for \"315ce79620ea370d2a2b071d19f2a49e7975253d053a03b6c03a8c4c0ff2b110\" returns successfully" Sep 12 17:57:22.731198 systemd-networkd[1522]: cali88c5647bf46: Gained IPv6LL Sep 12 17:57:22.795193 systemd-networkd[1522]: cali29ab1fc5892: Gained IPv6LL Sep 12 17:57:22.871711 containerd[1635]: time="2025-09-12T17:57:22.871150195Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-86cb648869-l49nt,Uid:393fa2be-7e95-4bab-a23d-441d0fb0527b,Namespace:calico-apiserver,Attempt:0,}" Sep 12 17:57:22.871791 containerd[1635]: time="2025-09-12T17:57:22.871761118Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-d6bfv,Uid:fe63115b-e1de-47a4-b916-82af8ab911b5,Namespace:kube-system,Attempt:0,}" Sep 12 17:57:22.986288 systemd-networkd[1522]: calic0a7dc93606: Link UP Sep 12 17:57:22.988066 systemd-networkd[1522]: calic0a7dc93606: Gained carrier Sep 12 17:57:22.995496 containerd[1635]: 2025-09-12 17:57:22.911 [INFO][4681] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--86cb648869--l49nt-eth0 calico-apiserver-86cb648869- calico-apiserver 393fa2be-7e95-4bab-a23d-441d0fb0527b 801 0 2025-09-12 17:56:52 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:86cb648869 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] 
[]} {k8s localhost calico-apiserver-86cb648869-l49nt eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calic0a7dc93606 [] [] }} ContainerID="0350f7355345e583b5ef973610e9cebef78836c9c8adc4c07d0b95fa5599f3ff" Namespace="calico-apiserver" Pod="calico-apiserver-86cb648869-l49nt" WorkloadEndpoint="localhost-k8s-calico--apiserver--86cb648869--l49nt-" Sep 12 17:57:22.995496 containerd[1635]: 2025-09-12 17:57:22.912 [INFO][4681] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0350f7355345e583b5ef973610e9cebef78836c9c8adc4c07d0b95fa5599f3ff" Namespace="calico-apiserver" Pod="calico-apiserver-86cb648869-l49nt" WorkloadEndpoint="localhost-k8s-calico--apiserver--86cb648869--l49nt-eth0" Sep 12 17:57:22.995496 containerd[1635]: 2025-09-12 17:57:22.949 [INFO][4705] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0350f7355345e583b5ef973610e9cebef78836c9c8adc4c07d0b95fa5599f3ff" HandleID="k8s-pod-network.0350f7355345e583b5ef973610e9cebef78836c9c8adc4c07d0b95fa5599f3ff" Workload="localhost-k8s-calico--apiserver--86cb648869--l49nt-eth0" Sep 12 17:57:22.995496 containerd[1635]: 2025-09-12 17:57:22.950 [INFO][4705] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0350f7355345e583b5ef973610e9cebef78836c9c8adc4c07d0b95fa5599f3ff" HandleID="k8s-pod-network.0350f7355345e583b5ef973610e9cebef78836c9c8adc4c07d0b95fa5599f3ff" Workload="localhost-k8s-calico--apiserver--86cb648869--l49nt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d4ff0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-86cb648869-l49nt", "timestamp":"2025-09-12 17:57:22.949854457 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:57:22.995496 containerd[1635]: 2025-09-12 17:57:22.950 [INFO][4705] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:57:22.995496 containerd[1635]: 2025-09-12 17:57:22.950 [INFO][4705] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:57:22.995496 containerd[1635]: 2025-09-12 17:57:22.950 [INFO][4705] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 17:57:22.995496 containerd[1635]: 2025-09-12 17:57:22.954 [INFO][4705] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0350f7355345e583b5ef973610e9cebef78836c9c8adc4c07d0b95fa5599f3ff" host="localhost" Sep 12 17:57:22.995496 containerd[1635]: 2025-09-12 17:57:22.966 [INFO][4705] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 17:57:22.995496 containerd[1635]: 2025-09-12 17:57:22.969 [INFO][4705] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 17:57:22.995496 containerd[1635]: 2025-09-12 17:57:22.969 [INFO][4705] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 17:57:22.995496 containerd[1635]: 2025-09-12 17:57:22.971 [INFO][4705] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 17:57:22.995496 containerd[1635]: 2025-09-12 17:57:22.971 [INFO][4705] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.0350f7355345e583b5ef973610e9cebef78836c9c8adc4c07d0b95fa5599f3ff" host="localhost" Sep 12 17:57:22.995496 containerd[1635]: 2025-09-12 17:57:22.973 [INFO][4705] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.0350f7355345e583b5ef973610e9cebef78836c9c8adc4c07d0b95fa5599f3ff Sep 12 17:57:22.995496 containerd[1635]: 2025-09-12 17:57:22.976 [INFO][4705] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.0350f7355345e583b5ef973610e9cebef78836c9c8adc4c07d0b95fa5599f3ff" host="localhost" Sep 12 17:57:22.995496 containerd[1635]: 2025-09-12 17:57:22.979 [INFO][4705] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.0350f7355345e583b5ef973610e9cebef78836c9c8adc4c07d0b95fa5599f3ff" host="localhost" Sep 12 17:57:22.995496 containerd[1635]: 2025-09-12 17:57:22.979 [INFO][4705] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.0350f7355345e583b5ef973610e9cebef78836c9c8adc4c07d0b95fa5599f3ff" host="localhost" Sep 12 17:57:22.995496 containerd[1635]: 2025-09-12 17:57:22.979 [INFO][4705] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 17:57:22.995496 containerd[1635]: 2025-09-12 17:57:22.979 [INFO][4705] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="0350f7355345e583b5ef973610e9cebef78836c9c8adc4c07d0b95fa5599f3ff" HandleID="k8s-pod-network.0350f7355345e583b5ef973610e9cebef78836c9c8adc4c07d0b95fa5599f3ff" Workload="localhost-k8s-calico--apiserver--86cb648869--l49nt-eth0" Sep 12 17:57:23.002080 containerd[1635]: 2025-09-12 17:57:22.982 [INFO][4681] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0350f7355345e583b5ef973610e9cebef78836c9c8adc4c07d0b95fa5599f3ff" Namespace="calico-apiserver" Pod="calico-apiserver-86cb648869-l49nt" WorkloadEndpoint="localhost-k8s-calico--apiserver--86cb648869--l49nt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--86cb648869--l49nt-eth0", GenerateName:"calico-apiserver-86cb648869-", Namespace:"calico-apiserver", SelfLink:"", UID:"393fa2be-7e95-4bab-a23d-441d0fb0527b", ResourceVersion:"801", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 56, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"86cb648869", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-86cb648869-l49nt", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic0a7dc93606", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:57:23.002080 containerd[1635]: 2025-09-12 17:57:22.982 [INFO][4681] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="0350f7355345e583b5ef973610e9cebef78836c9c8adc4c07d0b95fa5599f3ff" Namespace="calico-apiserver" Pod="calico-apiserver-86cb648869-l49nt" WorkloadEndpoint="localhost-k8s-calico--apiserver--86cb648869--l49nt-eth0" Sep 12 17:57:23.002080 containerd[1635]: 2025-09-12 17:57:22.982 [INFO][4681] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic0a7dc93606 ContainerID="0350f7355345e583b5ef973610e9cebef78836c9c8adc4c07d0b95fa5599f3ff" Namespace="calico-apiserver" Pod="calico-apiserver-86cb648869-l49nt" WorkloadEndpoint="localhost-k8s-calico--apiserver--86cb648869--l49nt-eth0" Sep 12 17:57:23.002080 containerd[1635]: 2025-09-12 17:57:22.986 [INFO][4681] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0350f7355345e583b5ef973610e9cebef78836c9c8adc4c07d0b95fa5599f3ff" Namespace="calico-apiserver" Pod="calico-apiserver-86cb648869-l49nt" WorkloadEndpoint="localhost-k8s-calico--apiserver--86cb648869--l49nt-eth0" Sep 12 17:57:23.002080 containerd[1635]: 2025-09-12 17:57:22.986 [INFO][4681] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="0350f7355345e583b5ef973610e9cebef78836c9c8adc4c07d0b95fa5599f3ff" Namespace="calico-apiserver" Pod="calico-apiserver-86cb648869-l49nt" WorkloadEndpoint="localhost-k8s-calico--apiserver--86cb648869--l49nt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--86cb648869--l49nt-eth0", GenerateName:"calico-apiserver-86cb648869-", Namespace:"calico-apiserver", SelfLink:"", UID:"393fa2be-7e95-4bab-a23d-441d0fb0527b", ResourceVersion:"801", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 56, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"86cb648869", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"0350f7355345e583b5ef973610e9cebef78836c9c8adc4c07d0b95fa5599f3ff", Pod:"calico-apiserver-86cb648869-l49nt", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic0a7dc93606", MAC:"e6:52:06:4b:ec:4a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:57:23.002080 containerd[1635]: 2025-09-12 17:57:22.992 [INFO][4681] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0350f7355345e583b5ef973610e9cebef78836c9c8adc4c07d0b95fa5599f3ff" Namespace="calico-apiserver" Pod="calico-apiserver-86cb648869-l49nt" WorkloadEndpoint="localhost-k8s-calico--apiserver--86cb648869--l49nt-eth0" Sep 12 17:57:23.025940 containerd[1635]: time="2025-09-12T17:57:23.025891215Z" level=info msg="connecting to shim 0350f7355345e583b5ef973610e9cebef78836c9c8adc4c07d0b95fa5599f3ff" address="unix:///run/containerd/s/e18fa63fd750a42922e5e6dfb682ac434a306b111f6ab94d22154edd55761851" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:57:23.052100 systemd[1]: Started cri-containerd-0350f7355345e583b5ef973610e9cebef78836c9c8adc4c07d0b95fa5599f3ff.scope - libcontainer container 0350f7355345e583b5ef973610e9cebef78836c9c8adc4c07d0b95fa5599f3ff. 
Sep 12 17:57:23.063166 systemd-resolved[1523]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 17:57:23.091341 containerd[1635]: time="2025-09-12T17:57:23.091264706Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-86cb648869-l49nt,Uid:393fa2be-7e95-4bab-a23d-441d0fb0527b,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"0350f7355345e583b5ef973610e9cebef78836c9c8adc4c07d0b95fa5599f3ff\"" Sep 12 17:57:23.105003 systemd-networkd[1522]: cali5e10978e271: Link UP Sep 12 17:57:23.105649 systemd-networkd[1522]: cali5e10978e271: Gained carrier Sep 12 17:57:23.117387 containerd[1635]: 2025-09-12 17:57:22.923 [INFO][4682] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--d6bfv-eth0 coredns-668d6bf9bc- kube-system fe63115b-e1de-47a4-b916-82af8ab911b5 802 0 2025-09-12 17:56:44 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-d6bfv eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali5e10978e271 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="9809e3317351cd88018b9e6534087c6a71b9ab6bcd3798a49bb07d19b3fc56cd" Namespace="kube-system" Pod="coredns-668d6bf9bc-d6bfv" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--d6bfv-" Sep 12 17:57:23.117387 containerd[1635]: 2025-09-12 17:57:22.923 [INFO][4682] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9809e3317351cd88018b9e6534087c6a71b9ab6bcd3798a49bb07d19b3fc56cd" Namespace="kube-system" Pod="coredns-668d6bf9bc-d6bfv" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--d6bfv-eth0" Sep 12 17:57:23.117387 containerd[1635]: 2025-09-12 17:57:22.950 [INFO][4707] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9809e3317351cd88018b9e6534087c6a71b9ab6bcd3798a49bb07d19b3fc56cd" HandleID="k8s-pod-network.9809e3317351cd88018b9e6534087c6a71b9ab6bcd3798a49bb07d19b3fc56cd" Workload="localhost-k8s-coredns--668d6bf9bc--d6bfv-eth0" Sep 12 17:57:23.117387 containerd[1635]: 2025-09-12 17:57:22.950 [INFO][4707] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9809e3317351cd88018b9e6534087c6a71b9ab6bcd3798a49bb07d19b3fc56cd" HandleID="k8s-pod-network.9809e3317351cd88018b9e6534087c6a71b9ab6bcd3798a49bb07d19b3fc56cd" Workload="localhost-k8s-coredns--668d6bf9bc--d6bfv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cf020), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-d6bfv", "timestamp":"2025-09-12 17:57:22.950057397 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:57:23.117387 containerd[1635]: 2025-09-12 17:57:22.950 [INFO][4707] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:57:23.117387 containerd[1635]: 2025-09-12 17:57:22.979 [INFO][4707] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:57:23.117387 containerd[1635]: 2025-09-12 17:57:22.979 [INFO][4707] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 17:57:23.117387 containerd[1635]: 2025-09-12 17:57:23.056 [INFO][4707] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9809e3317351cd88018b9e6534087c6a71b9ab6bcd3798a49bb07d19b3fc56cd" host="localhost" Sep 12 17:57:23.117387 containerd[1635]: 2025-09-12 17:57:23.074 [INFO][4707] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 17:57:23.117387 containerd[1635]: 2025-09-12 17:57:23.086 [INFO][4707] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 17:57:23.117387 containerd[1635]: 2025-09-12 17:57:23.088 [INFO][4707] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 17:57:23.117387 containerd[1635]: 2025-09-12 17:57:23.092 [INFO][4707] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 17:57:23.117387 containerd[1635]: 2025-09-12 17:57:23.092 [INFO][4707] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.9809e3317351cd88018b9e6534087c6a71b9ab6bcd3798a49bb07d19b3fc56cd" host="localhost" Sep 12 17:57:23.117387 containerd[1635]: 2025-09-12 17:57:23.094 [INFO][4707] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.9809e3317351cd88018b9e6534087c6a71b9ab6bcd3798a49bb07d19b3fc56cd Sep 12 17:57:23.117387 containerd[1635]: 2025-09-12 17:57:23.097 [INFO][4707] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.9809e3317351cd88018b9e6534087c6a71b9ab6bcd3798a49bb07d19b3fc56cd" host="localhost" Sep 12 17:57:23.117387 containerd[1635]: 2025-09-12 17:57:23.101 [INFO][4707] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.9809e3317351cd88018b9e6534087c6a71b9ab6bcd3798a49bb07d19b3fc56cd" host="localhost" Sep 12 17:57:23.117387 containerd[1635]: 2025-09-12 17:57:23.101 [INFO][4707] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.9809e3317351cd88018b9e6534087c6a71b9ab6bcd3798a49bb07d19b3fc56cd" host="localhost" Sep 12 17:57:23.117387 containerd[1635]: 2025-09-12 17:57:23.101 [INFO][4707] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 17:57:23.117387 containerd[1635]: 2025-09-12 17:57:23.101 [INFO][4707] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="9809e3317351cd88018b9e6534087c6a71b9ab6bcd3798a49bb07d19b3fc56cd" HandleID="k8s-pod-network.9809e3317351cd88018b9e6534087c6a71b9ab6bcd3798a49bb07d19b3fc56cd" Workload="localhost-k8s-coredns--668d6bf9bc--d6bfv-eth0" Sep 12 17:57:23.118446 containerd[1635]: 2025-09-12 17:57:23.102 [INFO][4682] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9809e3317351cd88018b9e6534087c6a71b9ab6bcd3798a49bb07d19b3fc56cd" Namespace="kube-system" Pod="coredns-668d6bf9bc-d6bfv" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--d6bfv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--d6bfv-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"fe63115b-e1de-47a4-b916-82af8ab911b5", ResourceVersion:"802", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 56, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-d6bfv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali5e10978e271", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:57:23.118446 containerd[1635]: 2025-09-12 17:57:23.102 [INFO][4682] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="9809e3317351cd88018b9e6534087c6a71b9ab6bcd3798a49bb07d19b3fc56cd" Namespace="kube-system" Pod="coredns-668d6bf9bc-d6bfv" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--d6bfv-eth0" Sep 12 17:57:23.118446 containerd[1635]: 2025-09-12 17:57:23.102 [INFO][4682] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5e10978e271 ContainerID="9809e3317351cd88018b9e6534087c6a71b9ab6bcd3798a49bb07d19b3fc56cd" Namespace="kube-system" Pod="coredns-668d6bf9bc-d6bfv" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--d6bfv-eth0" Sep 12 17:57:23.118446 containerd[1635]: 2025-09-12 17:57:23.105 [INFO][4682] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9809e3317351cd88018b9e6534087c6a71b9ab6bcd3798a49bb07d19b3fc56cd" Namespace="kube-system" Pod="coredns-668d6bf9bc-d6bfv" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--d6bfv-eth0" Sep 12 17:57:23.118446 
containerd[1635]: 2025-09-12 17:57:23.106 [INFO][4682] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9809e3317351cd88018b9e6534087c6a71b9ab6bcd3798a49bb07d19b3fc56cd" Namespace="kube-system" Pod="coredns-668d6bf9bc-d6bfv" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--d6bfv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--d6bfv-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"fe63115b-e1de-47a4-b916-82af8ab911b5", ResourceVersion:"802", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 56, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"9809e3317351cd88018b9e6534087c6a71b9ab6bcd3798a49bb07d19b3fc56cd", Pod:"coredns-668d6bf9bc-d6bfv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali5e10978e271", MAC:"f6:31:fa:36:ed:c4", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:57:23.118446 containerd[1635]: 2025-09-12 17:57:23.115 [INFO][4682] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9809e3317351cd88018b9e6534087c6a71b9ab6bcd3798a49bb07d19b3fc56cd" Namespace="kube-system" Pod="coredns-668d6bf9bc-d6bfv" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--d6bfv-eth0" Sep 12 17:57:23.130426 containerd[1635]: time="2025-09-12T17:57:23.130380923Z" level=info msg="connecting to shim 9809e3317351cd88018b9e6534087c6a71b9ab6bcd3798a49bb07d19b3fc56cd" address="unix:///run/containerd/s/180ed2890e4c536752e7424528179bffe7a16990df416577eb482e0c12d98aec" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:57:23.152118 systemd[1]: Started cri-containerd-9809e3317351cd88018b9e6534087c6a71b9ab6bcd3798a49bb07d19b3fc56cd.scope - libcontainer container 9809e3317351cd88018b9e6534087c6a71b9ab6bcd3798a49bb07d19b3fc56cd. 
Sep 12 17:57:23.165726 systemd-resolved[1523]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 17:57:23.198071 containerd[1635]: time="2025-09-12T17:57:23.196300145Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-d6bfv,Uid:fe63115b-e1de-47a4-b916-82af8ab911b5,Namespace:kube-system,Attempt:0,} returns sandbox id \"9809e3317351cd88018b9e6534087c6a71b9ab6bcd3798a49bb07d19b3fc56cd\"" Sep 12 17:57:23.199559 containerd[1635]: time="2025-09-12T17:57:23.199540280Z" level=info msg="CreateContainer within sandbox \"9809e3317351cd88018b9e6534087c6a71b9ab6bcd3798a49bb07d19b3fc56cd\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 17:57:23.201140 kubelet[2937]: I0912 17:57:23.201108 2937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-7558f9cdf6-hpjpl" podStartSLOduration=2.459051768 podStartE2EDuration="6.201095111s" podCreationTimestamp="2025-09-12 17:57:17 +0000 UTC" firstStartedPulling="2025-09-12 17:57:18.266434517 +0000 UTC m=+40.537568860" lastFinishedPulling="2025-09-12 17:57:22.008477857 +0000 UTC m=+44.279612203" observedRunningTime="2025-09-12 17:57:23.200568753 +0000 UTC m=+45.471703098" watchObservedRunningTime="2025-09-12 17:57:23.201095111 +0000 UTC m=+45.472229451" Sep 12 17:57:23.213492 containerd[1635]: time="2025-09-12T17:57:23.213464221Z" level=info msg="Container 95d591d52fc644cc2768626e520d3a8e8343664a462374445b7e0abedd0ac7dc: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:57:23.215720 containerd[1635]: time="2025-09-12T17:57:23.215701555Z" level=info msg="CreateContainer within sandbox \"9809e3317351cd88018b9e6534087c6a71b9ab6bcd3798a49bb07d19b3fc56cd\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"95d591d52fc644cc2768626e520d3a8e8343664a462374445b7e0abedd0ac7dc\"" Sep 12 17:57:23.216377 containerd[1635]: time="2025-09-12T17:57:23.216269705Z" level=info msg="StartContainer for \"95d591d52fc644cc2768626e520d3a8e8343664a462374445b7e0abedd0ac7dc\"" Sep 12 17:57:23.216970 containerd[1635]: time="2025-09-12T17:57:23.216954127Z" level=info msg="connecting to shim 95d591d52fc644cc2768626e520d3a8e8343664a462374445b7e0abedd0ac7dc" address="unix:///run/containerd/s/180ed2890e4c536752e7424528179bffe7a16990df416577eb482e0c12d98aec" protocol=ttrpc version=3 Sep 12 17:57:23.233251 systemd[1]: Started cri-containerd-95d591d52fc644cc2768626e520d3a8e8343664a462374445b7e0abedd0ac7dc.scope - libcontainer container 95d591d52fc644cc2768626e520d3a8e8343664a462374445b7e0abedd0ac7dc. Sep 12 17:57:23.257841 containerd[1635]: time="2025-09-12T17:57:23.257731140Z" level=info msg="StartContainer for \"95d591d52fc644cc2768626e520d3a8e8343664a462374445b7e0abedd0ac7dc\" returns successfully" Sep 12 17:57:23.565261 systemd-networkd[1522]: cali79a4a003f4c: Gained IPv6LL Sep 12 17:57:24.139390 systemd-networkd[1522]: calic0a7dc93606: Gained IPv6LL Sep 12 17:57:24.193106 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount189371980.mount: Deactivated successfully. 
Sep 12 17:57:24.203024 kubelet[2937]: I0912 17:57:24.202972 2937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-d6bfv" podStartSLOduration=40.202959401 podStartE2EDuration="40.202959401s" podCreationTimestamp="2025-09-12 17:56:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:57:24.20271083 +0000 UTC m=+46.473845182" watchObservedRunningTime="2025-09-12 17:57:24.202959401 +0000 UTC m=+46.474093753" Sep 12 17:57:24.524172 systemd-networkd[1522]: cali5e10978e271: Gained IPv6LL Sep 12 17:57:24.769840 containerd[1635]: time="2025-09-12T17:57:24.769805310Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:57:24.900794 containerd[1635]: time="2025-09-12T17:57:24.777912933Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 12 17:57:24.922489 containerd[1635]: time="2025-09-12T17:57:24.921925821Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-wgskv,Uid:990867cb-2d26-443c-9869-c2147978654b,Namespace:calico-system,Attempt:0,}" Sep 12 17:57:24.924621 containerd[1635]: time="2025-09-12T17:57:24.924602784Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-65fb48f58b-tnhtz,Uid:4c28333a-ba24-4808-ac79-d81485d8d6a4,Namespace:calico-system,Attempt:0,}" Sep 12 17:57:24.954671 containerd[1635]: time="2025-09-12T17:57:24.954014541Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:57:25.036686 containerd[1635]: time="2025-09-12T17:57:25.036655835Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:57:25.037170 containerd[1635]: time="2025-09-12T17:57:25.037153482Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 3.026307116s" Sep 12 17:57:25.037210 containerd[1635]: time="2025-09-12T17:57:25.037173429Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 12 17:57:25.037852 containerd[1635]: time="2025-09-12T17:57:25.037834315Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 17:57:25.049138 containerd[1635]: time="2025-09-12T17:57:25.049114166Z" level=info msg="CreateContainer within sandbox \"5fa2503c63fffade32615e50feaf4e0fbbdd5b2d61934ef5077efe19f930f393\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 12 17:57:25.096381 containerd[1635]: time="2025-09-12T17:57:25.096350973Z" level=info msg="Container 9907174a8e919403c478cd83207efa7c3408e450d6962c172605261c4d788df3: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:57:25.109917 containerd[1635]: time="2025-09-12T17:57:25.109687181Z" level=info msg="CreateContainer within sandbox 
\"5fa2503c63fffade32615e50feaf4e0fbbdd5b2d61934ef5077efe19f930f393\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"9907174a8e919403c478cd83207efa7c3408e450d6962c172605261c4d788df3\"" Sep 12 17:57:25.110999 containerd[1635]: time="2025-09-12T17:57:25.110983401Z" level=info msg="StartContainer for \"9907174a8e919403c478cd83207efa7c3408e450d6962c172605261c4d788df3\"" Sep 12 17:57:25.113979 containerd[1635]: time="2025-09-12T17:57:25.112315103Z" level=info msg="connecting to shim 9907174a8e919403c478cd83207efa7c3408e450d6962c172605261c4d788df3" address="unix:///run/containerd/s/8b112b8916c56bbcfb890826377a61c719ef0c895a378823f249cc810b90881b" protocol=ttrpc version=3 Sep 12 17:57:25.140209 systemd[1]: Started cri-containerd-9907174a8e919403c478cd83207efa7c3408e450d6962c172605261c4d788df3.scope - libcontainer container 9907174a8e919403c478cd83207efa7c3408e450d6962c172605261c4d788df3. Sep 12 17:57:25.249981 systemd-networkd[1522]: cali97303ef5bd8: Link UP Sep 12 17:57:25.256922 systemd-networkd[1522]: cali97303ef5bd8: Gained carrier Sep 12 17:57:25.287387 containerd[1635]: 2025-09-12 17:57:25.158 [INFO][4887] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--65fb48f58b--tnhtz-eth0 calico-kube-controllers-65fb48f58b- calico-system 4c28333a-ba24-4808-ac79-d81485d8d6a4 798 0 2025-09-12 17:56:55 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:65fb48f58b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-65fb48f58b-tnhtz eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali97303ef5bd8 [] [] }} ContainerID="5042c1d2f388665c538ad305274fca4446cba8368c23fec3e194ace4134c96a7" Namespace="calico-system" Pod="calico-kube-controllers-65fb48f58b-tnhtz" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--65fb48f58b--tnhtz-" Sep 12 17:57:25.287387 containerd[1635]: 2025-09-12 17:57:25.159 [INFO][4887] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5042c1d2f388665c538ad305274fca4446cba8368c23fec3e194ace4134c96a7" Namespace="calico-system" Pod="calico-kube-controllers-65fb48f58b-tnhtz" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--65fb48f58b--tnhtz-eth0" Sep 12 17:57:25.287387 containerd[1635]: 2025-09-12 17:57:25.199 [INFO][4931] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5042c1d2f388665c538ad305274fca4446cba8368c23fec3e194ace4134c96a7" HandleID="k8s-pod-network.5042c1d2f388665c538ad305274fca4446cba8368c23fec3e194ace4134c96a7" Workload="localhost-k8s-calico--kube--controllers--65fb48f58b--tnhtz-eth0" Sep 12 17:57:25.287387 containerd[1635]: 2025-09-12 17:57:25.199 [INFO][4931] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5042c1d2f388665c538ad305274fca4446cba8368c23fec3e194ace4134c96a7" HandleID="k8s-pod-network.5042c1d2f388665c538ad305274fca4446cba8368c23fec3e194ace4134c96a7" Workload="localhost-k8s-calico--kube--controllers--65fb48f58b--tnhtz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cd660), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-65fb48f58b-tnhtz", "timestamp":"2025-09-12 17:57:25.199169471 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, 
IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:57:25.287387 containerd[1635]: 2025-09-12 17:57:25.199 [INFO][4931] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:57:25.287387 containerd[1635]: 2025-09-12 17:57:25.199 [INFO][4931] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:57:25.287387 containerd[1635]: 2025-09-12 17:57:25.199 [INFO][4931] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 17:57:25.287387 containerd[1635]: 2025-09-12 17:57:25.205 [INFO][4931] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5042c1d2f388665c538ad305274fca4446cba8368c23fec3e194ace4134c96a7" host="localhost" Sep 12 17:57:25.287387 containerd[1635]: 2025-09-12 17:57:25.212 [INFO][4931] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 17:57:25.287387 containerd[1635]: 2025-09-12 17:57:25.216 [INFO][4931] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 17:57:25.287387 containerd[1635]: 2025-09-12 17:57:25.218 [INFO][4931] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 17:57:25.287387 containerd[1635]: 2025-09-12 17:57:25.220 [INFO][4931] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 17:57:25.287387 containerd[1635]: 2025-09-12 17:57:25.220 [INFO][4931] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.5042c1d2f388665c538ad305274fca4446cba8368c23fec3e194ace4134c96a7" host="localhost" Sep 12 17:57:25.287387 containerd[1635]: 2025-09-12 17:57:25.221 [INFO][4931] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5042c1d2f388665c538ad305274fca4446cba8368c23fec3e194ace4134c96a7 Sep 12 17:57:25.287387 containerd[1635]: 2025-09-12 17:57:25.225 [INFO][4931] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.5042c1d2f388665c538ad305274fca4446cba8368c23fec3e194ace4134c96a7" host="localhost" Sep 12 17:57:25.287387 containerd[1635]: 2025-09-12 17:57:25.231 [INFO][4931] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.5042c1d2f388665c538ad305274fca4446cba8368c23fec3e194ace4134c96a7" host="localhost" Sep 12 17:57:25.287387 containerd[1635]: 2025-09-12 17:57:25.231 [INFO][4931] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.5042c1d2f388665c538ad305274fca4446cba8368c23fec3e194ace4134c96a7" host="localhost" Sep 12 17:57:25.287387 containerd[1635]: 2025-09-12 17:57:25.232 [INFO][4931] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 17:57:25.287387 containerd[1635]: 2025-09-12 17:57:25.232 [INFO][4931] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="5042c1d2f388665c538ad305274fca4446cba8368c23fec3e194ace4134c96a7" HandleID="k8s-pod-network.5042c1d2f388665c538ad305274fca4446cba8368c23fec3e194ace4134c96a7" Workload="localhost-k8s-calico--kube--controllers--65fb48f58b--tnhtz-eth0" Sep 12 17:57:25.307526 containerd[1635]: 2025-09-12 17:57:25.241 [INFO][4887] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5042c1d2f388665c538ad305274fca4446cba8368c23fec3e194ace4134c96a7" Namespace="calico-system" Pod="calico-kube-controllers-65fb48f58b-tnhtz" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--65fb48f58b--tnhtz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--65fb48f58b--tnhtz-eth0", GenerateName:"calico-kube-controllers-65fb48f58b-", Namespace:"calico-system", SelfLink:"", UID:"4c28333a-ba24-4808-ac79-d81485d8d6a4", ResourceVersion:"798", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 56, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"65fb48f58b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-65fb48f58b-tnhtz", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali97303ef5bd8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:57:25.307526 containerd[1635]: 2025-09-12 17:57:25.241 [INFO][4887] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="5042c1d2f388665c538ad305274fca4446cba8368c23fec3e194ace4134c96a7" Namespace="calico-system" Pod="calico-kube-controllers-65fb48f58b-tnhtz" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--65fb48f58b--tnhtz-eth0" Sep 12 17:57:25.307526 containerd[1635]: 2025-09-12 17:57:25.241 [INFO][4887] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali97303ef5bd8 ContainerID="5042c1d2f388665c538ad305274fca4446cba8368c23fec3e194ace4134c96a7" Namespace="calico-system" Pod="calico-kube-controllers-65fb48f58b-tnhtz" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--65fb48f58b--tnhtz-eth0" Sep 12 17:57:25.307526 containerd[1635]: 2025-09-12 17:57:25.258 [INFO][4887] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5042c1d2f388665c538ad305274fca4446cba8368c23fec3e194ace4134c96a7" Namespace="calico-system" Pod="calico-kube-controllers-65fb48f58b-tnhtz" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--65fb48f58b--tnhtz-eth0" Sep 12 17:57:25.307526 containerd[1635]: 2025-09-12 17:57:25.259 [INFO][4887] cni-plugin/k8s.go 446: Added Mac, 
interface name, and active container ID to endpoint ContainerID="5042c1d2f388665c538ad305274fca4446cba8368c23fec3e194ace4134c96a7" Namespace="calico-system" Pod="calico-kube-controllers-65fb48f58b-tnhtz" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--65fb48f58b--tnhtz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--65fb48f58b--tnhtz-eth0", GenerateName:"calico-kube-controllers-65fb48f58b-", Namespace:"calico-system", SelfLink:"", UID:"4c28333a-ba24-4808-ac79-d81485d8d6a4", ResourceVersion:"798", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 56, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"65fb48f58b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"5042c1d2f388665c538ad305274fca4446cba8368c23fec3e194ace4134c96a7", Pod:"calico-kube-controllers-65fb48f58b-tnhtz", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali97303ef5bd8", MAC:"0a:9e:2e:13:55:62", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:57:25.307526 containerd[1635]: 2025-09-12 17:57:25.276 [INFO][4887] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5042c1d2f388665c538ad305274fca4446cba8368c23fec3e194ace4134c96a7" Namespace="calico-system" Pod="calico-kube-controllers-65fb48f58b-tnhtz" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--65fb48f58b--tnhtz-eth0" Sep 12 17:57:25.307526 containerd[1635]: time="2025-09-12T17:57:25.305495098Z" level=info msg="StartContainer for \"9907174a8e919403c478cd83207efa7c3408e450d6962c172605261c4d788df3\" returns successfully" Sep 12 17:57:25.424858 systemd-networkd[1522]: califaec800a0de: Link UP Sep 12 17:57:25.425454 systemd-networkd[1522]: califaec800a0de: Gained carrier Sep 12 17:57:25.464094 containerd[1635]: 2025-09-12 17:57:25.161 [INFO][4884] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--wgskv-eth0 csi-node-driver- calico-system 990867cb-2d26-443c-9869-c2147978654b 669 0 2025-09-12 17:56:54 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-wgskv eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] califaec800a0de [] [] }} ContainerID="dd7d4881ad396d1744482d4b2b9bfa14a5e9dbd7d0182e21e143cdf561f717d4" Namespace="calico-system" Pod="csi-node-driver-wgskv" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--wgskv-" Sep 12 17:57:25.464094 containerd[1635]: 2025-09-12 17:57:25.162 [INFO][4884] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="dd7d4881ad396d1744482d4b2b9bfa14a5e9dbd7d0182e21e143cdf561f717d4" Namespace="calico-system" Pod="csi-node-driver-wgskv" WorkloadEndpoint="localhost-k8s-csi--node--driver--wgskv-eth0" Sep 12 17:57:25.464094 containerd[1635]: 2025-09-12 17:57:25.199 [INFO][4937] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="dd7d4881ad396d1744482d4b2b9bfa14a5e9dbd7d0182e21e143cdf561f717d4" HandleID="k8s-pod-network.dd7d4881ad396d1744482d4b2b9bfa14a5e9dbd7d0182e21e143cdf561f717d4" Workload="localhost-k8s-csi--node--driver--wgskv-eth0" Sep 12 17:57:25.464094 containerd[1635]: 2025-09-12 17:57:25.199 [INFO][4937] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="dd7d4881ad396d1744482d4b2b9bfa14a5e9dbd7d0182e21e143cdf561f717d4" HandleID="k8s-pod-network.dd7d4881ad396d1744482d4b2b9bfa14a5e9dbd7d0182e21e143cdf561f717d4" Workload="localhost-k8s-csi--node--driver--wgskv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f610), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-wgskv", "timestamp":"2025-09-12 17:57:25.196594135 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:57:25.464094 containerd[1635]: 2025-09-12 17:57:25.199 [INFO][4937] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:57:25.464094 containerd[1635]: 2025-09-12 17:57:25.231 [INFO][4937] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:57:25.464094 containerd[1635]: 2025-09-12 17:57:25.231 [INFO][4937] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 17:57:25.464094 containerd[1635]: 2025-09-12 17:57:25.308 [INFO][4937] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.dd7d4881ad396d1744482d4b2b9bfa14a5e9dbd7d0182e21e143cdf561f717d4" host="localhost" Sep 12 17:57:25.464094 containerd[1635]: 2025-09-12 17:57:25.316 [INFO][4937] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 17:57:25.464094 containerd[1635]: 2025-09-12 17:57:25.320 [INFO][4937] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 17:57:25.464094 containerd[1635]: 2025-09-12 17:57:25.321 [INFO][4937] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 17:57:25.464094 containerd[1635]: 2025-09-12 17:57:25.323 [INFO][4937] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 17:57:25.464094 containerd[1635]: 2025-09-12 17:57:25.323 [INFO][4937] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.dd7d4881ad396d1744482d4b2b9bfa14a5e9dbd7d0182e21e143cdf561f717d4" host="localhost" Sep 12 17:57:25.464094 containerd[1635]: 2025-09-12 17:57:25.325 [INFO][4937] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.dd7d4881ad396d1744482d4b2b9bfa14a5e9dbd7d0182e21e143cdf561f717d4 Sep 12 17:57:25.464094 containerd[1635]: 2025-09-12 17:57:25.336 [INFO][4937] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.dd7d4881ad396d1744482d4b2b9bfa14a5e9dbd7d0182e21e143cdf561f717d4" host="localhost" Sep 12 17:57:25.464094 containerd[1635]: 2025-09-12 17:57:25.420 [INFO][4937] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 handle="k8s-pod-network.dd7d4881ad396d1744482d4b2b9bfa14a5e9dbd7d0182e21e143cdf561f717d4" host="localhost" Sep 12 17:57:25.464094 containerd[1635]: 2025-09-12 17:57:25.420 [INFO][4937] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.dd7d4881ad396d1744482d4b2b9bfa14a5e9dbd7d0182e21e143cdf561f717d4" host="localhost" Sep 12 17:57:25.464094 containerd[1635]: 2025-09-12 17:57:25.420 [INFO][4937] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 17:57:25.464094 containerd[1635]: 2025-09-12 17:57:25.420 [INFO][4937] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="dd7d4881ad396d1744482d4b2b9bfa14a5e9dbd7d0182e21e143cdf561f717d4" HandleID="k8s-pod-network.dd7d4881ad396d1744482d4b2b9bfa14a5e9dbd7d0182e21e143cdf561f717d4" Workload="localhost-k8s-csi--node--driver--wgskv-eth0" Sep 12 17:57:25.473189 containerd[1635]: 2025-09-12 17:57:25.422 [INFO][4884] cni-plugin/k8s.go 418: Populated endpoint ContainerID="dd7d4881ad396d1744482d4b2b9bfa14a5e9dbd7d0182e21e143cdf561f717d4" Namespace="calico-system" Pod="csi-node-driver-wgskv" WorkloadEndpoint="localhost-k8s-csi--node--driver--wgskv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--wgskv-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"990867cb-2d26-443c-9869-c2147978654b", ResourceVersion:"669", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 56, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-wgskv", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"califaec800a0de", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:57:25.473189 containerd[1635]: 2025-09-12 17:57:25.422 [INFO][4884] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="dd7d4881ad396d1744482d4b2b9bfa14a5e9dbd7d0182e21e143cdf561f717d4" Namespace="calico-system" Pod="csi-node-driver-wgskv" WorkloadEndpoint="localhost-k8s-csi--node--driver--wgskv-eth0" Sep 12 17:57:25.473189 containerd[1635]: 2025-09-12 17:57:25.422 [INFO][4884] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califaec800a0de ContainerID="dd7d4881ad396d1744482d4b2b9bfa14a5e9dbd7d0182e21e143cdf561f717d4" Namespace="calico-system" Pod="csi-node-driver-wgskv" WorkloadEndpoint="localhost-k8s-csi--node--driver--wgskv-eth0" Sep 12 17:57:25.473189 containerd[1635]: 2025-09-12 17:57:25.425 [INFO][4884] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="dd7d4881ad396d1744482d4b2b9bfa14a5e9dbd7d0182e21e143cdf561f717d4" Namespace="calico-system" Pod="csi-node-driver-wgskv" WorkloadEndpoint="localhost-k8s-csi--node--driver--wgskv-eth0" Sep 12 17:57:25.473189 containerd[1635]: 2025-09-12 17:57:25.426 [INFO][4884] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="dd7d4881ad396d1744482d4b2b9bfa14a5e9dbd7d0182e21e143cdf561f717d4" Namespace="calico-system" Pod="csi-node-driver-wgskv" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--wgskv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--wgskv-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"990867cb-2d26-443c-9869-c2147978654b", ResourceVersion:"669", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 56, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"dd7d4881ad396d1744482d4b2b9bfa14a5e9dbd7d0182e21e143cdf561f717d4", Pod:"csi-node-driver-wgskv", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"califaec800a0de", MAC:"e2:5d:37:2b:0a:73", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:57:25.473189 containerd[1635]: 2025-09-12 17:57:25.461 [INFO][4884] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="dd7d4881ad396d1744482d4b2b9bfa14a5e9dbd7d0182e21e143cdf561f717d4" Namespace="calico-system" Pod="csi-node-driver-wgskv" WorkloadEndpoint="localhost-k8s-csi--node--driver--wgskv-eth0" Sep 12 17:57:25.538624 containerd[1635]: time="2025-09-12T17:57:25.538022269Z" level=info msg="connecting to shim dd7d4881ad396d1744482d4b2b9bfa14a5e9dbd7d0182e21e143cdf561f717d4" address="unix:///run/containerd/s/d79f7ad41e3e9a60532fec1a8f652694cbcf46f189058f3e0eef60714e658fa5" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:57:25.541535 containerd[1635]: time="2025-09-12T17:57:25.541503143Z" level=info msg="connecting to shim 5042c1d2f388665c538ad305274fca4446cba8368c23fec3e194ace4134c96a7" address="unix:///run/containerd/s/d19caa481dec65845d97a2711fd5da25d83a61366305f8d5cba56da7ca39dca1" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:57:25.571144 systemd[1]: Started cri-containerd-5042c1d2f388665c538ad305274fca4446cba8368c23fec3e194ace4134c96a7.scope - libcontainer container 5042c1d2f388665c538ad305274fca4446cba8368c23fec3e194ace4134c96a7. Sep 12 17:57:25.586159 systemd[1]: Started cri-containerd-dd7d4881ad396d1744482d4b2b9bfa14a5e9dbd7d0182e21e143cdf561f717d4.scope - libcontainer container dd7d4881ad396d1744482d4b2b9bfa14a5e9dbd7d0182e21e143cdf561f717d4. 
Sep 12 17:57:25.602110 systemd-resolved[1523]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 17:57:25.631223 systemd-resolved[1523]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 17:57:25.649424 containerd[1635]: time="2025-09-12T17:57:25.649397704Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-wgskv,Uid:990867cb-2d26-443c-9869-c2147978654b,Namespace:calico-system,Attempt:0,} returns sandbox id \"dd7d4881ad396d1744482d4b2b9bfa14a5e9dbd7d0182e21e143cdf561f717d4\"" Sep 12 17:57:25.680527 containerd[1635]: time="2025-09-12T17:57:25.680500085Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-65fb48f58b-tnhtz,Uid:4c28333a-ba24-4808-ac79-d81485d8d6a4,Namespace:calico-system,Attempt:0,} returns sandbox id \"5042c1d2f388665c538ad305274fca4446cba8368c23fec3e194ace4134c96a7\"" Sep 12 17:57:26.349828 containerd[1635]: time="2025-09-12T17:57:26.349794752Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9907174a8e919403c478cd83207efa7c3408e450d6962c172605261c4d788df3\" id:\"269781dbf707a2a4a0e984de350e2e288b6028d51dbd478d47fc479457a2d420\" pid:5081 exit_status:1 exited_at:{seconds:1757699846 nanos:342191172}" Sep 12 17:57:26.507201 systemd-networkd[1522]: cali97303ef5bd8: Gained IPv6LL Sep 12 17:57:26.827144 systemd-networkd[1522]: califaec800a0de: Gained IPv6LL Sep 12 17:57:27.370798 containerd[1635]: time="2025-09-12T17:57:27.370761959Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9907174a8e919403c478cd83207efa7c3408e450d6962c172605261c4d788df3\" id:\"7346c9af46a4e52c2069843fdc1038a66a251cf9712ea33788f92b773c88cbd3\" pid:5103 exit_status:1 exited_at:{seconds:1757699847 nanos:370498329}" Sep 12 17:57:28.340796 containerd[1635]: time="2025-09-12T17:57:28.340760124Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9907174a8e919403c478cd83207efa7c3408e450d6962c172605261c4d788df3\" id:\"fa0a12da2ea06fe6020af9b4e3a3f87a130ad857da2ee3e07f05abbc597541dc\" pid:5132 exit_status:1 exited_at:{seconds:1757699848 nanos:339857470}" Sep 12 17:57:30.214751 containerd[1635]: time="2025-09-12T17:57:30.214716907Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:57:30.218635 containerd[1635]: time="2025-09-12T17:57:30.218610824Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 12 17:57:30.220890 containerd[1635]: time="2025-09-12T17:57:30.220853048Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:57:30.225614 containerd[1635]: time="2025-09-12T17:57:30.225573040Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:57:30.226472 containerd[1635]: time="2025-09-12T17:57:30.226226898Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest 
\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 5.188373044s" Sep 12 17:57:30.226472 containerd[1635]: time="2025-09-12T17:57:30.226264600Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 12 17:57:30.227161 containerd[1635]: time="2025-09-12T17:57:30.227146828Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 17:57:30.238898 containerd[1635]: time="2025-09-12T17:57:30.238840370Z" level=info msg="CreateContainer within sandbox \"d9695de0c97dd306ff853ae616844bdb31a2c9e62baf5b94352ad8830932ad2c\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 17:57:30.287315 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2210974319.mount: Deactivated successfully. Sep 12 17:57:30.292363 containerd[1635]: time="2025-09-12T17:57:30.287793763Z" level=info msg="Container e370261bfce4df03cb5818a2b4ed3a38e0dc1fe6cb9f7ad483aebe6c94ea2e7f: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:57:30.317439 containerd[1635]: time="2025-09-12T17:57:30.317341346Z" level=info msg="CreateContainer within sandbox \"d9695de0c97dd306ff853ae616844bdb31a2c9e62baf5b94352ad8830932ad2c\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"e370261bfce4df03cb5818a2b4ed3a38e0dc1fe6cb9f7ad483aebe6c94ea2e7f\"" Sep 12 17:57:30.317839 containerd[1635]: time="2025-09-12T17:57:30.317815511Z" level=info msg="StartContainer for \"e370261bfce4df03cb5818a2b4ed3a38e0dc1fe6cb9f7ad483aebe6c94ea2e7f\"" Sep 12 17:57:30.319721 containerd[1635]: time="2025-09-12T17:57:30.319675906Z" level=info msg="connecting to shim e370261bfce4df03cb5818a2b4ed3a38e0dc1fe6cb9f7ad483aebe6c94ea2e7f" address="unix:///run/containerd/s/d051d6a96d94d12e2e4b13ce989ea30088f652cf7e467d8367b083299e4b3e82" protocol=ttrpc version=3 Sep 12 17:57:30.341182 systemd[1]: Started cri-containerd-e370261bfce4df03cb5818a2b4ed3a38e0dc1fe6cb9f7ad483aebe6c94ea2e7f.scope - libcontainer container e370261bfce4df03cb5818a2b4ed3a38e0dc1fe6cb9f7ad483aebe6c94ea2e7f. 
Sep 12 17:57:30.575465 containerd[1635]: time="2025-09-12T17:57:30.574961184Z" level=info msg="StartContainer for \"e370261bfce4df03cb5818a2b4ed3a38e0dc1fe6cb9f7ad483aebe6c94ea2e7f\" returns successfully" Sep 12 17:57:30.901547 containerd[1635]: time="2025-09-12T17:57:30.901442848Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:57:30.903471 containerd[1635]: time="2025-09-12T17:57:30.903334104Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 12 17:57:30.908414 containerd[1635]: time="2025-09-12T17:57:30.908351851Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 681.100247ms" Sep 12 17:57:30.908414 containerd[1635]: time="2025-09-12T17:57:30.908402891Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 12 17:57:30.910424 containerd[1635]: time="2025-09-12T17:57:30.910307781Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 12 17:57:30.917130 containerd[1635]: time="2025-09-12T17:57:30.915432757Z" level=info msg="CreateContainer within sandbox \"0350f7355345e583b5ef973610e9cebef78836c9c8adc4c07d0b95fa5599f3ff\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 17:57:30.929679 containerd[1635]: time="2025-09-12T17:57:30.929154067Z" level=info msg="Container 5707cba9492ef8f234e7d4bc09d2bb7e3f3bf84918895b85ac1118b0af97da2b: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:57:30.940030 containerd[1635]: time="2025-09-12T17:57:30.939968997Z" level=info msg="CreateContainer within sandbox \"0350f7355345e583b5ef973610e9cebef78836c9c8adc4c07d0b95fa5599f3ff\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"5707cba9492ef8f234e7d4bc09d2bb7e3f3bf84918895b85ac1118b0af97da2b\"" Sep 12 17:57:30.943384 containerd[1635]: time="2025-09-12T17:57:30.943339718Z" level=info msg="StartContainer for \"5707cba9492ef8f234e7d4bc09d2bb7e3f3bf84918895b85ac1118b0af97da2b\"" Sep 12 17:57:30.944474 containerd[1635]: time="2025-09-12T17:57:30.944446620Z" level=info msg="connecting to shim 5707cba9492ef8f234e7d4bc09d2bb7e3f3bf84918895b85ac1118b0af97da2b" address="unix:///run/containerd/s/e18fa63fd750a42922e5e6dfb682ac434a306b111f6ab94d22154edd55761851" protocol=ttrpc version=3 Sep 12 17:57:30.985336 systemd[1]: Started cri-containerd-5707cba9492ef8f234e7d4bc09d2bb7e3f3bf84918895b85ac1118b0af97da2b.scope - libcontainer container 5707cba9492ef8f234e7d4bc09d2bb7e3f3bf84918895b85ac1118b0af97da2b. 
Sep 12 17:57:31.098307 containerd[1635]: time="2025-09-12T17:57:31.098213804Z" level=info msg="StartContainer for \"5707cba9492ef8f234e7d4bc09d2bb7e3f3bf84918895b85ac1118b0af97da2b\" returns successfully" Sep 12 17:57:31.367187 kubelet[2937]: I0912 17:57:31.367138 2937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-86cb648869-dcmss" podStartSLOduration=31.484538906 podStartE2EDuration="39.362423073s" podCreationTimestamp="2025-09-12 17:56:52 +0000 UTC" firstStartedPulling="2025-09-12 17:57:22.349048391 +0000 UTC m=+44.620182737" lastFinishedPulling="2025-09-12 17:57:30.226932549 +0000 UTC m=+52.498066904" observedRunningTime="2025-09-12 17:57:31.329402956 +0000 UTC m=+53.600537303" watchObservedRunningTime="2025-09-12 17:57:31.362423073 +0000 UTC m=+53.633557428" Sep 12 17:57:31.369068 kubelet[2937]: I0912 17:57:31.367235 2937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-wp8q7" podStartSLOduration=33.696843259 podStartE2EDuration="37.367229771s" podCreationTimestamp="2025-09-12 17:56:54 +0000 UTC" firstStartedPulling="2025-09-12 17:57:21.36734214 +0000 UTC m=+43.638476492" lastFinishedPulling="2025-09-12 17:57:25.037728661 +0000 UTC m=+47.308863004" observedRunningTime="2025-09-12 17:57:26.268231631 +0000 UTC m=+48.539365975" watchObservedRunningTime="2025-09-12 17:57:31.367229771 +0000 UTC m=+53.638364119" Sep 12 17:57:31.369068 kubelet[2937]: I0912 17:57:31.367579 2937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-86cb648869-l49nt" podStartSLOduration=31.549837293 podStartE2EDuration="39.367570328s" podCreationTimestamp="2025-09-12 17:56:52 +0000 UTC" firstStartedPulling="2025-09-12 17:57:23.092158599 +0000 UTC m=+45.363292941" lastFinishedPulling="2025-09-12 17:57:30.90989163 +0000 UTC m=+53.181025976" observedRunningTime="2025-09-12 17:57:31.355179025 +0000 UTC m=+53.626313372" watchObservedRunningTime="2025-09-12 17:57:31.367570328 +0000 UTC m=+53.638704675" Sep 12 17:57:32.355301 kubelet[2937]: I0912 17:57:32.355269 2937 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 17:57:32.467934 containerd[1635]: time="2025-09-12T17:57:32.467856417Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:57:32.468717 containerd[1635]: time="2025-09-12T17:57:32.468700942Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 12 17:57:32.469544 containerd[1635]: time="2025-09-12T17:57:32.469510473Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:57:32.473024 containerd[1635]: time="2025-09-12T17:57:32.472318052Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 1.561628723s" Sep 12 17:57:32.473024 containerd[1635]: time="2025-09-12T17:57:32.472343854Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference 
\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 12 17:57:32.474888 containerd[1635]: time="2025-09-12T17:57:32.473535517Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:57:32.475385 containerd[1635]: time="2025-09-12T17:57:32.475367007Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 12 17:57:32.485546 containerd[1635]: time="2025-09-12T17:57:32.485509521Z" level=info msg="CreateContainer within sandbox \"dd7d4881ad396d1744482d4b2b9bfa14a5e9dbd7d0182e21e143cdf561f717d4\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 12 17:57:32.532820 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1136421574.mount: Deactivated successfully. Sep 12 17:57:32.535585 containerd[1635]: time="2025-09-12T17:57:32.534871940Z" level=info msg="Container db21d10e6ab707583261ae90f9fd17474fb1888788a3cd2d3b567c3068ae96fd: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:57:32.545998 containerd[1635]: time="2025-09-12T17:57:32.545970066Z" level=info msg="CreateContainer within sandbox \"dd7d4881ad396d1744482d4b2b9bfa14a5e9dbd7d0182e21e143cdf561f717d4\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"db21d10e6ab707583261ae90f9fd17474fb1888788a3cd2d3b567c3068ae96fd\"" Sep 12 17:57:32.547439 containerd[1635]: time="2025-09-12T17:57:32.547287729Z" level=info msg="StartContainer for \"db21d10e6ab707583261ae90f9fd17474fb1888788a3cd2d3b567c3068ae96fd\"" Sep 12 17:57:32.549439 containerd[1635]: time="2025-09-12T17:57:32.549411275Z" level=info msg="connecting to shim db21d10e6ab707583261ae90f9fd17474fb1888788a3cd2d3b567c3068ae96fd" address="unix:///run/containerd/s/d79f7ad41e3e9a60532fec1a8f652694cbcf46f189058f3e0eef60714e658fa5" protocol=ttrpc version=3 Sep 12 17:57:32.594163 systemd[1]: Started cri-containerd-db21d10e6ab707583261ae90f9fd17474fb1888788a3cd2d3b567c3068ae96fd.scope - libcontainer container db21d10e6ab707583261ae90f9fd17474fb1888788a3cd2d3b567c3068ae96fd. 
Sep 12 17:57:32.726806 containerd[1635]: time="2025-09-12T17:57:32.726656397Z" level=info msg="StartContainer for \"db21d10e6ab707583261ae90f9fd17474fb1888788a3cd2d3b567c3068ae96fd\" returns successfully" Sep 12 17:57:39.476328 containerd[1635]: time="2025-09-12T17:57:39.476208613Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:57:39.512955 containerd[1635]: time="2025-09-12T17:57:39.512923327Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 12 17:57:39.578812 containerd[1635]: time="2025-09-12T17:57:39.578765337Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:57:39.583659 containerd[1635]: time="2025-09-12T17:57:39.583576398Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:57:39.583996 containerd[1635]: time="2025-09-12T17:57:39.583977012Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 7.108485815s" Sep 12 17:57:39.586529 containerd[1635]: time="2025-09-12T17:57:39.584078315Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 12 17:57:39.591697 containerd[1635]: time="2025-09-12T17:57:39.591671539Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 12 17:57:39.902206 containerd[1635]: time="2025-09-12T17:57:39.902171481Z" level=info msg="CreateContainer within sandbox \"5042c1d2f388665c538ad305274fca4446cba8368c23fec3e194ace4134c96a7\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 12 17:57:39.932021 containerd[1635]: time="2025-09-12T17:57:39.931897143Z" level=info msg="Container 525b8174bcc073ecc3cb8ca98af56a18982ccb4523875540f7f7c19449ee2751: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:57:39.956158 containerd[1635]: time="2025-09-12T17:57:39.956116405Z" level=info msg="CreateContainer within sandbox \"5042c1d2f388665c538ad305274fca4446cba8368c23fec3e194ace4134c96a7\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"525b8174bcc073ecc3cb8ca98af56a18982ccb4523875540f7f7c19449ee2751\"" Sep 12 17:57:39.960924 containerd[1635]: time="2025-09-12T17:57:39.960891572Z" level=info msg="StartContainer for \"525b8174bcc073ecc3cb8ca98af56a18982ccb4523875540f7f7c19449ee2751\"" Sep 12 17:57:39.964029 containerd[1635]: time="2025-09-12T17:57:39.963972867Z" level=info msg="connecting to shim 525b8174bcc073ecc3cb8ca98af56a18982ccb4523875540f7f7c19449ee2751" address="unix:///run/containerd/s/d19caa481dec65845d97a2711fd5da25d83a61366305f8d5cba56da7ca39dca1" protocol=ttrpc version=3 Sep 12 17:57:40.334180 systemd[1]: Started cri-containerd-525b8174bcc073ecc3cb8ca98af56a18982ccb4523875540f7f7c19449ee2751.scope - libcontainer 
container 525b8174bcc073ecc3cb8ca98af56a18982ccb4523875540f7f7c19449ee2751. Sep 12 17:57:40.453131 containerd[1635]: time="2025-09-12T17:57:40.453075036Z" level=info msg="StartContainer for \"525b8174bcc073ecc3cb8ca98af56a18982ccb4523875540f7f7c19449ee2751\" returns successfully" Sep 12 17:57:41.527367 kubelet[2937]: I0912 17:57:41.524672 2937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-65fb48f58b-tnhtz" podStartSLOduration=32.598407789 podStartE2EDuration="46.508304659s" podCreationTimestamp="2025-09-12 17:56:55 +0000 UTC" firstStartedPulling="2025-09-12 17:57:25.681534083 +0000 UTC m=+47.952668425" lastFinishedPulling="2025-09-12 17:57:39.591430948 +0000 UTC m=+61.862565295" observedRunningTime="2025-09-12 17:57:41.506125954 +0000 UTC m=+63.777260300" watchObservedRunningTime="2025-09-12 17:57:41.508304659 +0000 UTC m=+63.779439006" Sep 12 17:57:41.552924 containerd[1635]: time="2025-09-12T17:57:41.552893809Z" level=info msg="TaskExit event in podsandbox handler container_id:\"525b8174bcc073ecc3cb8ca98af56a18982ccb4523875540f7f7c19449ee2751\" id:\"9e7eafb904a549983a85f2e97bee8b490b149fb85706f9d30c61ff78c63b985c\" pid:5335 exited_at:{seconds:1757699861 nanos:517785182}" Sep 12 17:57:41.765605 containerd[1635]: time="2025-09-12T17:57:41.765567147Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:57:41.766088 containerd[1635]: time="2025-09-12T17:57:41.766053599Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 12 17:57:41.767192 containerd[1635]: time="2025-09-12T17:57:41.766192799Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:57:41.768862 containerd[1635]: time="2025-09-12T17:57:41.768487965Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:57:41.769979 containerd[1635]: time="2025-09-12T17:57:41.769446861Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 2.177619904s" Sep 12 17:57:41.769979 containerd[1635]: time="2025-09-12T17:57:41.769471809Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 12 17:57:42.471824 containerd[1635]: time="2025-09-12T17:57:42.471793913Z" level=info msg="CreateContainer within sandbox \"dd7d4881ad396d1744482d4b2b9bfa14a5e9dbd7d0182e21e143cdf561f717d4\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 12 17:57:42.475933 kubelet[2937]: I0912 17:57:42.475888 2937 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 17:57:42.524276 containerd[1635]: time="2025-09-12T17:57:42.524241754Z" level=info msg="Container 
7f30bd7a8131b5d987f2906202e7dfb002f38f8318190dd4bf9e335502114604: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:57:42.528294 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1003499966.mount: Deactivated successfully. Sep 12 17:57:42.614500 containerd[1635]: time="2025-09-12T17:57:42.614466099Z" level=info msg="CreateContainer within sandbox \"dd7d4881ad396d1744482d4b2b9bfa14a5e9dbd7d0182e21e143cdf561f717d4\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"7f30bd7a8131b5d987f2906202e7dfb002f38f8318190dd4bf9e335502114604\"" Sep 12 17:57:42.623857 containerd[1635]: time="2025-09-12T17:57:42.623836474Z" level=info msg="StartContainer for \"7f30bd7a8131b5d987f2906202e7dfb002f38f8318190dd4bf9e335502114604\"" Sep 12 17:57:42.631546 containerd[1635]: time="2025-09-12T17:57:42.625041930Z" level=info msg="connecting to shim 7f30bd7a8131b5d987f2906202e7dfb002f38f8318190dd4bf9e335502114604" address="unix:///run/containerd/s/d79f7ad41e3e9a60532fec1a8f652694cbcf46f189058f3e0eef60714e658fa5" protocol=ttrpc version=3 Sep 12 17:57:42.651285 systemd[1]: Started cri-containerd-7f30bd7a8131b5d987f2906202e7dfb002f38f8318190dd4bf9e335502114604.scope - libcontainer container 7f30bd7a8131b5d987f2906202e7dfb002f38f8318190dd4bf9e335502114604. Sep 12 17:57:42.741192 containerd[1635]: time="2025-09-12T17:57:42.741042081Z" level=info msg="StartContainer for \"7f30bd7a8131b5d987f2906202e7dfb002f38f8318190dd4bf9e335502114604\" returns successfully" Sep 12 17:57:43.387190 kubelet[2937]: I0912 17:57:43.385980 2937 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 12 17:57:43.387522 kubelet[2937]: I0912 17:57:43.387203 2937 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 12 17:57:47.581564 containerd[1635]: time="2025-09-12T17:57:47.581477342Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9907174a8e919403c478cd83207efa7c3408e450d6962c172605261c4d788df3\" id:\"658e29eaba0b1323a077c22e47eb400b9d88a318c515d0082c7060b6e22a1764\" pid:5425 exited_at:{seconds:1757699867 nanos:581248062}" Sep 12 17:57:47.837530 containerd[1635]: time="2025-09-12T17:57:47.836723953Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7601de731eeb035c4b0ab0315bbcac4ac5b39953250c40f92e5eaaa269cf1fda\" id:\"4d9f63e2379ba287552f60ac55c475ea997abccf23090f9080d142c63939a9fb\" pid:5405 exited_at:{seconds:1757699867 nanos:836481705}" Sep 12 17:57:47.869146 kubelet[2937]: I0912 17:57:47.869077 2937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-wgskv" podStartSLOduration=37.199960263 podStartE2EDuration="53.859540796s" podCreationTimestamp="2025-09-12 17:56:54 +0000 UTC" firstStartedPulling="2025-09-12 17:57:25.65065147 +0000 UTC m=+47.921785813" lastFinishedPulling="2025-09-12 17:57:42.310232006 +0000 UTC m=+64.581366346" observedRunningTime="2025-09-12 17:57:43.487507104 +0000 UTC m=+65.758641456" watchObservedRunningTime="2025-09-12 17:57:47.859540796 +0000 UTC m=+70.130675145" Sep 12 17:57:58.516896 containerd[1635]: time="2025-09-12T17:57:58.516864302Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9907174a8e919403c478cd83207efa7c3408e450d6962c172605261c4d788df3\" id:\"ad3a769750e95f94ab9af82665a71cdd348bfa4966f2d7074ac6b27397eeed55\" pid:5464 
exited_at:{seconds:1757699878 nanos:516660105}" Sep 12 17:58:04.879423 systemd[1]: Started sshd@7-139.178.70.102:22-139.178.89.65:44530.service - OpenSSH per-connection server daemon (139.178.89.65:44530). Sep 12 17:58:05.060886 sshd[5494]: Accepted publickey for core from 139.178.89.65 port 44530 ssh2: RSA SHA256:wDZNpWVsZ98foqhScMScgrvmBt+VkbaXTrAF+eax8o0 Sep 12 17:58:05.065958 sshd-session[5494]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:58:05.077392 systemd-logind[1603]: New session 10 of user core. Sep 12 17:58:05.082154 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 12 17:58:05.840055 sshd[5497]: Connection closed by 139.178.89.65 port 44530 Sep 12 17:58:05.841370 sshd-session[5494]: pam_unix(sshd:session): session closed for user core Sep 12 17:58:05.852721 systemd[1]: sshd@7-139.178.70.102:22-139.178.89.65:44530.service: Deactivated successfully. Sep 12 17:58:05.856133 systemd[1]: session-10.scope: Deactivated successfully. Sep 12 17:58:05.857486 systemd-logind[1603]: Session 10 logged out. Waiting for processes to exit. Sep 12 17:58:05.859395 systemd-logind[1603]: Removed session 10. Sep 12 17:58:10.876678 systemd[1]: Started sshd@8-139.178.70.102:22-139.178.89.65:33860.service - OpenSSH per-connection server daemon (139.178.89.65:33860). Sep 12 17:58:11.366020 sshd[5520]: Accepted publickey for core from 139.178.89.65 port 33860 ssh2: RSA SHA256:wDZNpWVsZ98foqhScMScgrvmBt+VkbaXTrAF+eax8o0 Sep 12 17:58:11.367047 sshd-session[5520]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:58:11.372257 systemd-logind[1603]: New session 11 of user core. Sep 12 17:58:11.379269 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 12 17:58:11.667868 containerd[1635]: time="2025-09-12T17:58:11.667779169Z" level=info msg="TaskExit event in podsandbox handler container_id:\"525b8174bcc073ecc3cb8ca98af56a18982ccb4523875540f7f7c19449ee2751\" id:\"01e53965f54ff871d278bfa7ef48dd913045c940b16cdef2ff351be3c3ef61c2\" pid:5540 exited_at:{seconds:1757699891 nanos:653265511}" Sep 12 17:58:12.651314 sshd[5523]: Connection closed by 139.178.89.65 port 33860 Sep 12 17:58:12.655460 sshd-session[5520]: pam_unix(sshd:session): session closed for user core Sep 12 17:58:12.660019 systemd[1]: sshd@8-139.178.70.102:22-139.178.89.65:33860.service: Deactivated successfully. Sep 12 17:58:12.661832 systemd[1]: session-11.scope: Deactivated successfully. Sep 12 17:58:12.663106 systemd-logind[1603]: Session 11 logged out. Waiting for processes to exit. Sep 12 17:58:12.666447 systemd-logind[1603]: Removed session 11. Sep 12 17:58:17.401764 containerd[1635]: time="2025-09-12T17:58:17.401719623Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7601de731eeb035c4b0ab0315bbcac4ac5b39953250c40f92e5eaaa269cf1fda\" id:\"4feed5a04d4d43735eec26bf18dfa98a5fad3d393f2c8dd7e8a58c35eb8e83d7\" pid:5569 exited_at:{seconds:1757699897 nanos:401459069}" Sep 12 17:58:17.661968 systemd[1]: Started sshd@9-139.178.70.102:22-139.178.89.65:33872.service - OpenSSH per-connection server daemon (139.178.89.65:33872). Sep 12 17:58:17.763470 sshd[5582]: Accepted publickey for core from 139.178.89.65 port 33872 ssh2: RSA SHA256:wDZNpWVsZ98foqhScMScgrvmBt+VkbaXTrAF+eax8o0 Sep 12 17:58:17.765747 sshd-session[5582]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:58:17.770273 systemd-logind[1603]: New session 12 of user core. 
Sep 12 17:58:17.775102 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 12 17:58:18.029045 sshd[5585]: Connection closed by 139.178.89.65 port 33872 Sep 12 17:58:18.029544 sshd-session[5582]: pam_unix(sshd:session): session closed for user core Sep 12 17:58:18.038728 systemd[1]: sshd@9-139.178.70.102:22-139.178.89.65:33872.service: Deactivated successfully. Sep 12 17:58:18.039855 systemd[1]: session-12.scope: Deactivated successfully. Sep 12 17:58:18.043374 systemd-logind[1603]: Session 12 logged out. Waiting for processes to exit. Sep 12 17:58:18.047882 systemd[1]: Started sshd@10-139.178.70.102:22-139.178.89.65:33886.service - OpenSSH per-connection server daemon (139.178.89.65:33886). Sep 12 17:58:18.050465 systemd-logind[1603]: Removed session 12. Sep 12 17:58:18.094751 sshd[5598]: Accepted publickey for core from 139.178.89.65 port 33886 ssh2: RSA SHA256:wDZNpWVsZ98foqhScMScgrvmBt+VkbaXTrAF+eax8o0 Sep 12 17:58:18.096679 sshd-session[5598]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:58:18.100215 systemd-logind[1603]: New session 13 of user core. Sep 12 17:58:18.106166 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 12 17:58:18.271297 sshd[5601]: Connection closed by 139.178.89.65 port 33886 Sep 12 17:58:18.271587 sshd-session[5598]: pam_unix(sshd:session): session closed for user core Sep 12 17:58:18.280590 systemd[1]: sshd@10-139.178.70.102:22-139.178.89.65:33886.service: Deactivated successfully. Sep 12 17:58:18.282484 systemd[1]: session-13.scope: Deactivated successfully. Sep 12 17:58:18.283992 systemd-logind[1603]: Session 13 logged out. Waiting for processes to exit. Sep 12 17:58:18.289280 systemd[1]: Started sshd@11-139.178.70.102:22-139.178.89.65:33890.service - OpenSSH per-connection server daemon (139.178.89.65:33890). Sep 12 17:58:18.291258 systemd-logind[1603]: Removed session 13. Sep 12 17:58:18.382119 sshd[5611]: Accepted publickey for core from 139.178.89.65 port 33890 ssh2: RSA SHA256:wDZNpWVsZ98foqhScMScgrvmBt+VkbaXTrAF+eax8o0 Sep 12 17:58:18.383061 sshd-session[5611]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:58:18.385871 systemd-logind[1603]: New session 14 of user core. Sep 12 17:58:18.391197 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 12 17:58:18.527487 sshd[5618]: Connection closed by 139.178.89.65 port 33890 Sep 12 17:58:18.527869 sshd-session[5611]: pam_unix(sshd:session): session closed for user core Sep 12 17:58:18.530773 systemd[1]: sshd@11-139.178.70.102:22-139.178.89.65:33890.service: Deactivated successfully. Sep 12 17:58:18.531185 systemd-logind[1603]: Session 14 logged out. Waiting for processes to exit. Sep 12 17:58:18.533408 systemd[1]: session-14.scope: Deactivated successfully. Sep 12 17:58:18.536908 systemd-logind[1603]: Removed session 14. Sep 12 17:58:23.541529 systemd[1]: Started sshd@12-139.178.70.102:22-139.178.89.65:38164.service - OpenSSH per-connection server daemon (139.178.89.65:38164). Sep 12 17:58:23.611706 sshd[5633]: Accepted publickey for core from 139.178.89.65 port 38164 ssh2: RSA SHA256:wDZNpWVsZ98foqhScMScgrvmBt+VkbaXTrAF+eax8o0 Sep 12 17:58:23.612419 sshd-session[5633]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:58:23.617044 systemd-logind[1603]: New session 15 of user core. Sep 12 17:58:23.622171 systemd[1]: Started session-15.scope - Session 15 of User core. 
Sep 12 17:58:23.791900 sshd[5636]: Connection closed by 139.178.89.65 port 38164 Sep 12 17:58:23.792864 sshd-session[5633]: pam_unix(sshd:session): session closed for user core Sep 12 17:58:23.800668 systemd[1]: sshd@12-139.178.70.102:22-139.178.89.65:38164.service: Deactivated successfully. Sep 12 17:58:23.803552 systemd[1]: session-15.scope: Deactivated successfully. Sep 12 17:58:23.804367 systemd-logind[1603]: Session 15 logged out. Waiting for processes to exit. Sep 12 17:58:23.805712 systemd-logind[1603]: Removed session 15. Sep 12 17:58:28.802216 systemd[1]: Started sshd@13-139.178.70.102:22-139.178.89.65:38180.service - OpenSSH per-connection server daemon (139.178.89.65:38180). Sep 12 17:58:29.073652 sshd[5668]: Accepted publickey for core from 139.178.89.65 port 38180 ssh2: RSA SHA256:wDZNpWVsZ98foqhScMScgrvmBt+VkbaXTrAF+eax8o0 Sep 12 17:58:29.076462 sshd-session[5668]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:58:29.083399 systemd-logind[1603]: New session 16 of user core. Sep 12 17:58:29.089294 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 12 17:58:29.344273 containerd[1635]: time="2025-09-12T17:58:29.344097741Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9907174a8e919403c478cd83207efa7c3408e450d6962c172605261c4d788df3\" id:\"6757f86d2959d8d79ead4bff5533a99a83c401112bad0935279ce12e1885a79e\" pid:5659 exited_at:{seconds:1757699909 nanos:333741305}" Sep 12 17:58:29.374824 update_engine[1609]: I20250912 17:58:29.374756 1609 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Sep 12 17:58:29.374824 update_engine[1609]: I20250912 17:58:29.374819 1609 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Sep 12 17:58:29.376414 update_engine[1609]: I20250912 17:58:29.376192 1609 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Sep 12 17:58:29.377135 update_engine[1609]: I20250912 17:58:29.377111 1609 omaha_request_params.cc:62] Current group set to beta Sep 12 17:58:29.377348 update_engine[1609]: I20250912 17:58:29.377214 1609 update_attempter.cc:499] Already updated boot flags. Skipping. Sep 12 17:58:29.377348 update_engine[1609]: I20250912 17:58:29.377224 1609 update_attempter.cc:643] Scheduling an action processor start. 
Sep 12 17:58:29.377348 update_engine[1609]: I20250912 17:58:29.377238 1609 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Sep 12 17:58:29.377348 update_engine[1609]: I20250912 17:58:29.377267 1609 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Sep 12 17:58:29.377348 update_engine[1609]: I20250912 17:58:29.377303 1609 omaha_request_action.cc:271] Posting an Omaha request to disabled Sep 12 17:58:29.377348 update_engine[1609]: I20250912 17:58:29.377312 1609 omaha_request_action.cc:272] Request: Sep 12 17:58:29.377348 update_engine[1609]: Sep 12 17:58:29.377348 update_engine[1609]: Sep 12 17:58:29.377348 update_engine[1609]: Sep 12 17:58:29.377348 update_engine[1609]: Sep 12 17:58:29.377348 update_engine[1609]: Sep 12 17:58:29.377348 update_engine[1609]: Sep 12 17:58:29.377348 update_engine[1609]: Sep 12 17:58:29.377348 update_engine[1609]: Sep 12 17:58:29.377348 update_engine[1609]: I20250912 17:58:29.377317 1609 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 12 17:58:29.388050 update_engine[1609]: I20250912 17:58:29.385366 1609 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 12 17:58:29.388050 update_engine[1609]: I20250912 17:58:29.386080 1609 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 12 17:58:29.391052 update_engine[1609]: E20250912 17:58:29.389950 1609 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 12 17:58:29.391238 update_engine[1609]: I20250912 17:58:29.391204 1609 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Sep 12 17:58:29.411021 locksmithd[1644]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Sep 12 17:58:29.815434 sshd[5672]: Connection closed by 139.178.89.65 port 38180 Sep 12 17:58:29.816048 sshd-session[5668]: pam_unix(sshd:session): session closed for user core Sep 12 17:58:29.819054 systemd-logind[1603]: Session 16 logged out. Waiting for processes to exit. Sep 12 17:58:29.819597 systemd[1]: sshd@13-139.178.70.102:22-139.178.89.65:38180.service: Deactivated successfully. Sep 12 17:58:29.821376 systemd[1]: session-16.scope: Deactivated successfully. Sep 12 17:58:29.823382 systemd-logind[1603]: Removed session 16. Sep 12 17:58:34.824549 systemd[1]: Started sshd@14-139.178.70.102:22-139.178.89.65:53368.service - OpenSSH per-connection server daemon (139.178.89.65:53368). Sep 12 17:58:34.912242 sshd[5685]: Accepted publickey for core from 139.178.89.65 port 53368 ssh2: RSA SHA256:wDZNpWVsZ98foqhScMScgrvmBt+VkbaXTrAF+eax8o0 Sep 12 17:58:34.913425 sshd-session[5685]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:58:34.916245 systemd-logind[1603]: New session 17 of user core. Sep 12 17:58:34.921088 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 12 17:58:35.268447 sshd[5692]: Connection closed by 139.178.89.65 port 53368 Sep 12 17:58:35.271612 systemd[1]: sshd@14-139.178.70.102:22-139.178.89.65:53368.service: Deactivated successfully. Sep 12 17:58:35.269168 sshd-session[5685]: pam_unix(sshd:session): session closed for user core Sep 12 17:58:35.273898 systemd[1]: session-17.scope: Deactivated successfully. Sep 12 17:58:35.274894 systemd-logind[1603]: Session 17 logged out. Waiting for processes to exit. Sep 12 17:58:35.277402 systemd-logind[1603]: Removed session 17. 
Sep 12 17:58:39.282377 update_engine[1609]: I20250912 17:58:39.282281 1609 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 12 17:58:39.282377 update_engine[1609]: I20250912 17:58:39.282354 1609 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 12 17:58:39.282705 update_engine[1609]: I20250912 17:58:39.282619 1609 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 12 17:58:39.288155 update_engine[1609]: E20250912 17:58:39.288133 1609 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 12 17:58:39.288207 update_engine[1609]: I20250912 17:58:39.288177 1609 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Sep 12 17:58:40.279326 systemd[1]: Started sshd@15-139.178.70.102:22-139.178.89.65:38190.service - OpenSSH per-connection server daemon (139.178.89.65:38190). Sep 12 17:58:40.451420 sshd[5714]: Accepted publickey for core from 139.178.89.65 port 38190 ssh2: RSA SHA256:wDZNpWVsZ98foqhScMScgrvmBt+VkbaXTrAF+eax8o0 Sep 12 17:58:40.452317 sshd-session[5714]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:58:40.473154 systemd-logind[1603]: New session 18 of user core. Sep 12 17:58:40.477304 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 12 17:58:41.580158 containerd[1635]: time="2025-09-12T17:58:41.580082651Z" level=info msg="TaskExit event in podsandbox handler container_id:\"525b8174bcc073ecc3cb8ca98af56a18982ccb4523875540f7f7c19449ee2751\" id:\"2ab582531fbe580f2fa57807d767c09194a2a23459f0fc9798259e7cab3a92fd\" pid:5737 exited_at:{seconds:1757699921 nanos:564675922}" Sep 12 17:58:41.941650 sshd[5717]: Connection closed by 139.178.89.65 port 38190 Sep 12 17:58:41.945638 sshd-session[5714]: pam_unix(sshd:session): session closed for user core Sep 12 17:58:41.960324 systemd[1]: Started sshd@16-139.178.70.102:22-139.178.89.65:38192.service - OpenSSH per-connection server daemon (139.178.89.65:38192). Sep 12 17:58:41.962693 systemd-logind[1603]: Session 18 logged out. Waiting for processes to exit. Sep 12 17:58:41.963368 systemd[1]: sshd@15-139.178.70.102:22-139.178.89.65:38190.service: Deactivated successfully. Sep 12 17:58:41.972966 systemd[1]: session-18.scope: Deactivated successfully. Sep 12 17:58:41.980127 systemd-logind[1603]: Removed session 18. Sep 12 17:58:42.174625 sshd[5747]: Accepted publickey for core from 139.178.89.65 port 38192 ssh2: RSA SHA256:wDZNpWVsZ98foqhScMScgrvmBt+VkbaXTrAF+eax8o0 Sep 12 17:58:42.175522 sshd-session[5747]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:58:42.181487 systemd-logind[1603]: New session 19 of user core. Sep 12 17:58:42.185093 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 12 17:58:43.207031 sshd[5753]: Connection closed by 139.178.89.65 port 38192 Sep 12 17:58:43.214340 sshd-session[5747]: pam_unix(sshd:session): session closed for user core Sep 12 17:58:43.219595 systemd[1]: Started sshd@17-139.178.70.102:22-139.178.89.65:38208.service - OpenSSH per-connection server daemon (139.178.89.65:38208). Sep 12 17:58:43.252543 systemd-logind[1603]: Session 19 logged out. Waiting for processes to exit. Sep 12 17:58:43.252624 systemd[1]: sshd@16-139.178.70.102:22-139.178.89.65:38192.service: Deactivated successfully. Sep 12 17:58:43.253769 systemd[1]: session-19.scope: Deactivated successfully. Sep 12 17:58:43.254620 systemd-logind[1603]: Removed session 19. 
Sep 12 17:58:43.330735 sshd[5760]: Accepted publickey for core from 139.178.89.65 port 38208 ssh2: RSA SHA256:wDZNpWVsZ98foqhScMScgrvmBt+VkbaXTrAF+eax8o0 Sep 12 17:58:43.337247 sshd-session[5760]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:58:43.341880 systemd-logind[1603]: New session 20 of user core. Sep 12 17:58:43.347742 systemd[1]: Started session-20.scope - Session 20 of User core. Sep 12 17:58:44.406568 sshd[5766]: Connection closed by 139.178.89.65 port 38208 Sep 12 17:58:44.407748 sshd-session[5760]: pam_unix(sshd:session): session closed for user core Sep 12 17:58:44.414762 systemd[1]: sshd@17-139.178.70.102:22-139.178.89.65:38208.service: Deactivated successfully. Sep 12 17:58:44.416136 systemd[1]: session-20.scope: Deactivated successfully. Sep 12 17:58:44.416957 systemd-logind[1603]: Session 20 logged out. Waiting for processes to exit. Sep 12 17:58:44.419451 systemd[1]: Started sshd@18-139.178.70.102:22-139.178.89.65:38224.service - OpenSSH per-connection server daemon (139.178.89.65:38224). Sep 12 17:58:44.420722 systemd-logind[1603]: Removed session 20. Sep 12 17:58:44.486491 sshd[5779]: Accepted publickey for core from 139.178.89.65 port 38224 ssh2: RSA SHA256:wDZNpWVsZ98foqhScMScgrvmBt+VkbaXTrAF+eax8o0 Sep 12 17:58:44.487911 sshd-session[5779]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:58:44.493220 systemd-logind[1603]: New session 21 of user core. Sep 12 17:58:44.498143 systemd[1]: Started session-21.scope - Session 21 of User core. Sep 12 17:58:44.763906 containerd[1635]: time="2025-09-12T17:58:44.763586498Z" level=info msg="TaskExit event in podsandbox handler container_id:\"525b8174bcc073ecc3cb8ca98af56a18982ccb4523875540f7f7c19449ee2751\" id:\"7f19e17f6c72c2e4e18a4628495e0ae0251e3cf8d14ab61555d2c3fd8365b53f\" pid:5796 exited_at:{seconds:1757699924 nanos:763397086}" Sep 12 17:58:47.386357 sshd[5783]: Connection closed by 139.178.89.65 port 38224 Sep 12 17:58:47.434180 systemd[1]: Started sshd@19-139.178.70.102:22-139.178.89.65:38226.service - OpenSSH per-connection server daemon (139.178.89.65:38226). Sep 12 17:58:47.446462 sshd-session[5779]: pam_unix(sshd:session): session closed for user core Sep 12 17:58:47.486363 systemd[1]: sshd@18-139.178.70.102:22-139.178.89.65:38224.service: Deactivated successfully. Sep 12 17:58:47.488364 systemd[1]: session-21.scope: Deactivated successfully. Sep 12 17:58:47.491700 systemd-logind[1603]: Session 21 logged out. Waiting for processes to exit. Sep 12 17:58:47.494286 systemd-logind[1603]: Removed session 21. Sep 12 17:58:48.015688 sshd[5839]: Accepted publickey for core from 139.178.89.65 port 38226 ssh2: RSA SHA256:wDZNpWVsZ98foqhScMScgrvmBt+VkbaXTrAF+eax8o0 Sep 12 17:58:48.018205 sshd-session[5839]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:58:48.025225 systemd-logind[1603]: New session 22 of user core. Sep 12 17:58:48.029104 systemd[1]: Started session-22.scope - Session 22 of User core. 
Sep 12 17:58:48.686495 containerd[1635]: time="2025-09-12T17:58:48.677710741Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9907174a8e919403c478cd83207efa7c3408e450d6962c172605261c4d788df3\" id:\"4664c0df2206a204ef0a425af7e8c38d0beff0d2817c208d03ec6a7ee5a1de39\" pid:5855 exited_at:{seconds:1757699928 nanos:676232201}" Sep 12 17:58:48.844158 containerd[1635]: time="2025-09-12T17:58:48.843910564Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7601de731eeb035c4b0ab0315bbcac4ac5b39953250c40f92e5eaaa269cf1fda\" id:\"30601a13a2f0e609d4cf7855f27f8889191bf76e139ded06b8d0974b30d4510e\" pid:5831 exited_at:{seconds:1757699928 nanos:843568422}" Sep 12 17:58:49.288042 update_engine[1609]: I20250912 17:58:49.280108 1609 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 12 17:58:49.288042 update_engine[1609]: I20250912 17:58:49.287739 1609 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 12 17:58:49.304037 update_engine[1609]: I20250912 17:58:49.303524 1609 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 12 17:58:49.304636 update_engine[1609]: E20250912 17:58:49.304562 1609 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 12 17:58:49.304636 update_engine[1609]: I20250912 17:58:49.304618 1609 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Sep 12 17:58:49.388925 sshd[5870]: Connection closed by 139.178.89.65 port 38226 Sep 12 17:58:49.391307 sshd-session[5839]: pam_unix(sshd:session): session closed for user core Sep 12 17:58:49.403732 systemd-logind[1603]: Session 22 logged out. Waiting for processes to exit. Sep 12 17:58:49.404167 systemd[1]: sshd@19-139.178.70.102:22-139.178.89.65:38226.service: Deactivated successfully. Sep 12 17:58:49.405934 systemd[1]: session-22.scope: Deactivated successfully. Sep 12 17:58:49.406833 systemd-logind[1603]: Removed session 22. Sep 12 17:58:54.406259 systemd[1]: Started sshd@20-139.178.70.102:22-139.178.89.65:59558.service - OpenSSH per-connection server daemon (139.178.89.65:59558). Sep 12 17:58:54.516178 sshd[5907]: Accepted publickey for core from 139.178.89.65 port 59558 ssh2: RSA SHA256:wDZNpWVsZ98foqhScMScgrvmBt+VkbaXTrAF+eax8o0 Sep 12 17:58:54.516889 sshd-session[5907]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:58:54.520224 systemd-logind[1603]: New session 23 of user core. Sep 12 17:58:54.527089 systemd[1]: Started session-23.scope - Session 23 of User core. Sep 12 17:58:55.138591 sshd[5910]: Connection closed by 139.178.89.65 port 59558 Sep 12 17:58:55.138962 sshd-session[5907]: pam_unix(sshd:session): session closed for user core Sep 12 17:58:55.141392 systemd-logind[1603]: Session 23 logged out. Waiting for processes to exit. Sep 12 17:58:55.142615 systemd[1]: sshd@20-139.178.70.102:22-139.178.89.65:59558.service: Deactivated successfully. Sep 12 17:58:55.144227 systemd[1]: session-23.scope: Deactivated successfully. Sep 12 17:58:55.146124 systemd-logind[1603]: Removed session 23. 
Sep 12 17:58:58.802172 containerd[1635]: time="2025-09-12T17:58:58.802129504Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9907174a8e919403c478cd83207efa7c3408e450d6962c172605261c4d788df3\" id:\"9939d1e1ff40a9e145d6d76f494745760b90b02a6f7c763abd2baea1ef056048\" pid:5934 exited_at:{seconds:1757699938 nanos:798444857}" Sep 12 17:58:59.282349 update_engine[1609]: I20250912 17:58:59.282256 1609 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 12 17:58:59.282349 update_engine[1609]: I20250912 17:58:59.282332 1609 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 12 17:58:59.283370 update_engine[1609]: I20250912 17:58:59.283350 1609 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 12 17:58:59.287028 update_engine[1609]: E20250912 17:58:59.286980 1609 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 12 17:58:59.287110 update_engine[1609]: I20250912 17:58:59.287081 1609 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Sep 12 17:58:59.292587 update_engine[1609]: I20250912 17:58:59.292469 1609 omaha_request_action.cc:617] Omaha request response: Sep 12 17:58:59.293054 update_engine[1609]: E20250912 17:58:59.292774 1609 omaha_request_action.cc:636] Omaha request network transfer failed. Sep 12 17:58:59.343294 update_engine[1609]: I20250912 17:58:59.343070 1609 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Sep 12 17:58:59.343404 update_engine[1609]: I20250912 17:58:59.343391 1609 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 12 17:58:59.343753 update_engine[1609]: I20250912 17:58:59.343740 1609 update_attempter.cc:306] Processing Done. Sep 12 17:58:59.344439 update_engine[1609]: E20250912 17:58:59.343796 1609 update_attempter.cc:619] Update failed. Sep 12 17:58:59.344439 update_engine[1609]: I20250912 17:58:59.343803 1609 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Sep 12 17:58:59.344439 update_engine[1609]: I20250912 17:58:59.343806 1609 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Sep 12 17:58:59.344439 update_engine[1609]: I20250912 17:58:59.343809 1609 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. Sep 12 17:58:59.344439 update_engine[1609]: I20250912 17:58:59.343883 1609 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Sep 12 17:58:59.344439 update_engine[1609]: I20250912 17:58:59.343907 1609 omaha_request_action.cc:271] Posting an Omaha request to disabled Sep 12 17:58:59.344439 update_engine[1609]: I20250912 17:58:59.343910 1609 omaha_request_action.cc:272] Request: Sep 12 17:58:59.344439 update_engine[1609]: Sep 12 17:58:59.344439 update_engine[1609]: Sep 12 17:58:59.344439 update_engine[1609]: Sep 12 17:58:59.344439 update_engine[1609]: Sep 12 17:58:59.344439 update_engine[1609]: Sep 12 17:58:59.344439 update_engine[1609]: Sep 12 17:58:59.344439 update_engine[1609]: I20250912 17:58:59.343914 1609 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 12 17:58:59.344439 update_engine[1609]: I20250912 17:58:59.343931 1609 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 12 17:58:59.344439 update_engine[1609]: I20250912 17:58:59.344305 1609 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Sep 12 17:58:59.351021 update_engine[1609]: E20250912 17:58:59.348960 1609 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 12 17:58:59.351254 update_engine[1609]: I20250912 17:58:59.351138 1609 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Sep 12 17:58:59.351254 update_engine[1609]: I20250912 17:58:59.351161 1609 omaha_request_action.cc:617] Omaha request response: Sep 12 17:58:59.351254 update_engine[1609]: I20250912 17:58:59.351167 1609 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 12 17:58:59.351254 update_engine[1609]: I20250912 17:58:59.351171 1609 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 12 17:58:59.351254 update_engine[1609]: I20250912 17:58:59.351174 1609 update_attempter.cc:306] Processing Done. Sep 12 17:58:59.351254 update_engine[1609]: I20250912 17:58:59.351178 1609 update_attempter.cc:310] Error event sent. Sep 12 17:58:59.351254 update_engine[1609]: I20250912 17:58:59.351189 1609 update_check_scheduler.cc:74] Next update check in 41m42s Sep 12 17:58:59.371295 locksmithd[1644]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Sep 12 17:58:59.371295 locksmithd[1644]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Sep 12 17:59:00.151621 systemd[1]: Started sshd@21-139.178.70.102:22-139.178.89.65:43976.service - OpenSSH per-connection server daemon (139.178.89.65:43976). Sep 12 17:59:00.346984 sshd[5946]: Accepted publickey for core from 139.178.89.65 port 43976 ssh2: RSA SHA256:wDZNpWVsZ98foqhScMScgrvmBt+VkbaXTrAF+eax8o0 Sep 12 17:59:00.348804 sshd-session[5946]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:59:00.353131 systemd-logind[1603]: New session 24 of user core. Sep 12 17:59:00.355326 systemd[1]: Started session-24.scope - Session 24 of User core. Sep 12 17:59:01.136090 sshd[5949]: Connection closed by 139.178.89.65 port 43976 Sep 12 17:59:01.136552 sshd-session[5946]: pam_unix(sshd:session): session closed for user core Sep 12 17:59:01.149224 systemd[1]: sshd@21-139.178.70.102:22-139.178.89.65:43976.service: Deactivated successfully. Sep 12 17:59:01.151195 systemd[1]: session-24.scope: Deactivated successfully. Sep 12 17:59:01.165449 systemd-logind[1603]: Session 24 logged out. Waiting for processes to exit. Sep 12 17:59:01.166664 systemd-logind[1603]: Removed session 24.