Sep 12 22:50:41.700544 kernel: Linux version 6.12.47-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Fri Sep 12 20:38:35 -00 2025
Sep 12 22:50:41.700561 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=8e60d6befc710e967d67e9a1d87ced7416895090c99a765b3a00e66a62f49e40
Sep 12 22:50:41.700567 kernel: Disabled fast string operations
Sep 12 22:50:41.700571 kernel: BIOS-provided physical RAM map:
Sep 12 22:50:41.700575 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable
Sep 12 22:50:41.700579 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved
Sep 12 22:50:41.700585 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved
Sep 12 22:50:41.700590 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable
Sep 12 22:50:41.700597 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data
Sep 12 22:50:41.700602 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS
Sep 12 22:50:41.700609 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable
Sep 12 22:50:41.700616 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved
Sep 12 22:50:41.700623 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved
Sep 12 22:50:41.700631 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Sep 12 22:50:41.700652 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved
Sep 12 22:50:41.700658 kernel: NX (Execute Disable) protection: active
Sep 12 22:50:41.700663 kernel: APIC: Static calls initialized
Sep 12 22:50:41.700667 kernel: SMBIOS 2.7 present.
Sep 12 22:50:41.700672 kernel: DMI: VMware, Inc. VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020
Sep 12 22:50:41.700677 kernel: DMI: Memory slots populated: 1/128
Sep 12 22:50:41.700683 kernel: vmware: hypercall mode: 0x00
Sep 12 22:50:41.700688 kernel: Hypervisor detected: VMware
Sep 12 22:50:41.700693 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz
Sep 12 22:50:41.700697 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz
Sep 12 22:50:41.700702 kernel: vmware: using clock offset of 3286904862 ns
Sep 12 22:50:41.700707 kernel: tsc: Detected 3408.000 MHz processor
Sep 12 22:50:41.700712 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Sep 12 22:50:41.700717 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Sep 12 22:50:41.700722 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000
Sep 12 22:50:41.700727 kernel: total RAM covered: 3072M
Sep 12 22:50:41.700733 kernel: Found optimal setting for mtrr clean up
Sep 12 22:50:41.700740 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G
Sep 12 22:50:41.700746 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs
Sep 12 22:50:41.700750 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Sep 12 22:50:41.700755 kernel: Using GB pages for direct mapping
Sep 12 22:50:41.700760 kernel: ACPI: Early table checksum verification disabled
Sep 12 22:50:41.700765 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD )
Sep 12 22:50:41.700770 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272)
Sep 12 22:50:41.700775 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240)
Sep 12 22:50:41.700781 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001)
Sep 12 22:50:41.700788 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Sep 12 22:50:41.700793 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Sep 12 22:50:41.700798 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001)
Sep 12 22:50:41.700803 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? APIC 06040000 LTP 00000000)
Sep 12 22:50:41.700808 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001)
Sep 12 22:50:41.700814 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001)
Sep 12 22:50:41.700819 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001)
Sep 12 22:50:41.700827 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001)
Sep 12 22:50:41.700833 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66]
Sep 12 22:50:41.700839 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72]
Sep 12 22:50:41.700847 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Sep 12 22:50:41.700852 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Sep 12 22:50:41.700857 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54]
Sep 12 22:50:41.700862 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c]
Sep 12 22:50:41.700868 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea]
Sep 12 22:50:41.700873 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe]
Sep 12 22:50:41.700878 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756]
Sep 12 22:50:41.700883 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e]
Sep 12 22:50:41.700888 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Sep 12 22:50:41.700894 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Sep 12 22:50:41.700899 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug
Sep 12 22:50:41.700904 kernel: NUMA: Node 0 [mem 0x00001000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00001000-0x7fffffff]
Sep 12 22:50:41.700909 kernel: NODE_DATA(0) allocated [mem 0x7fff8dc0-0x7fffffff]
Sep 12 22:50:41.700915 kernel: Zone ranges:
Sep 12 22:50:41.700921 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Sep 12 22:50:41.700926 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff]
Sep 12 22:50:41.700931 kernel: Normal empty
Sep 12 22:50:41.700936 kernel: Device empty
Sep 12 22:50:41.700941 kernel: Movable zone start for each node
Sep 12 22:50:41.700946 kernel: Early memory node ranges
Sep 12 22:50:41.700951 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff]
Sep 12 22:50:41.700956 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff]
Sep 12 22:50:41.700961 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff]
Sep 12 22:50:41.700967 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff]
Sep 12 22:50:41.700972 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 12 22:50:41.700978 kernel: On node 0, zone DMA: 98 pages in unavailable ranges
Sep 12 22:50:41.700983 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges
Sep 12 22:50:41.700988 kernel: ACPI: PM-Timer IO Port: 0x1008
Sep 12 22:50:41.700993 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1])
Sep 12 22:50:41.700998 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1])
Sep 12 22:50:41.701003 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1])
Sep 12 22:50:41.701008 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1])
Sep 12 22:50:41.701016 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1])
Sep 12 22:50:41.701022 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1])
Sep 12 22:50:41.701029 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1])
Sep 12 22:50:41.701038 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1])
Sep 12 22:50:41.701046 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1])
Sep 12 22:50:41.701054 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1])
Sep 12 22:50:41.701062 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1])
Sep 12 22:50:41.701067 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1])
Sep 12 22:50:41.701075 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1])
Sep 12 22:50:41.701080 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1])
Sep 12 22:50:41.701087 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1])
Sep 12 22:50:41.701092 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1])
Sep 12 22:50:41.701097 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1])
Sep 12 22:50:41.701102 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1])
Sep 12 22:50:41.701107 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1])
Sep 12 22:50:41.701112 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1])
Sep 12 22:50:41.701117 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1])
Sep 12 22:50:41.701122 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1])
Sep 12 22:50:41.701127 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1])
Sep 12 22:50:41.701132 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1])
Sep 12 22:50:41.701138 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1])
Sep 12 22:50:41.701143 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1])
Sep 12 22:50:41.701148 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1])
Sep 12 22:50:41.701153 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1])
Sep 12 22:50:41.701159 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1])
Sep 12 22:50:41.701163 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1])
Sep 12 22:50:41.701168 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1])
Sep 12 22:50:41.701173 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1])
Sep 12 22:50:41.701178 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1])
Sep 12 22:50:41.701185 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1])
Sep 12 22:50:41.701190 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1])
Sep 12 22:50:41.701195 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1])
Sep 12 22:50:41.701200 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1])
Sep 12 22:50:41.701204 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1])
Sep 12 22:50:41.701210 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1])
Sep 12 22:50:41.701216 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1])
Sep 12 22:50:41.701223 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1])
Sep 12 22:50:41.701229 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1])
Sep 12 22:50:41.701235 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1])
Sep 12 22:50:41.701241 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1])
Sep 12 22:50:41.701247 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1])
Sep 12 22:50:41.701252 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1])
Sep 12 22:50:41.701257 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1])
Sep 12 22:50:41.701263 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1])
Sep 12 22:50:41.701271 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1])
Sep 12 22:50:41.701276 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1])
Sep 12 22:50:41.701281 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1])
Sep 12 22:50:41.701288 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1])
Sep 12 22:50:41.701293 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1])
Sep 12 22:50:41.701298 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1])
Sep 12 22:50:41.701304 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1])
Sep 12 22:50:41.701309 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1])
Sep 12 22:50:41.701316 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1])
Sep 12 22:50:41.701324 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1])
Sep 12 22:50:41.701329 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1])
Sep 12 22:50:41.701335 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1])
Sep 12 22:50:41.701340 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1])
Sep 12 22:50:41.701346 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1])
Sep 12 22:50:41.701352 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1])
Sep 12 22:50:41.701357 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1])
Sep 12 22:50:41.701362 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1])
Sep 12 22:50:41.701368 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1])
Sep 12 22:50:41.701373 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1])
Sep 12 22:50:41.701378 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1])
Sep 12 22:50:41.701384 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1])
Sep 12 22:50:41.701389 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1])
Sep 12 22:50:41.701394 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1])
Sep 12 22:50:41.701400 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1])
Sep 12 22:50:41.701406 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1])
Sep 12 22:50:41.701411 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1])
Sep 12 22:50:41.701417 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1])
Sep 12 22:50:41.701422 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1])
Sep 12 22:50:41.701427 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1])
Sep 12 22:50:41.701432 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1])
Sep 12 22:50:41.701437 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1])
Sep 12 22:50:41.701443 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1])
Sep 12 22:50:41.701448 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1])
Sep 12 22:50:41.701454 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1])
Sep 12 22:50:41.701459 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1])
Sep 12 22:50:41.701465 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1])
Sep 12 22:50:41.701470 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1])
Sep 12 22:50:41.701475 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1])
Sep 12 22:50:41.701481 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1])
Sep 12 22:50:41.701486 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1])
Sep 12 22:50:41.701492 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1])
Sep 12 22:50:41.701497 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1])
Sep 12 22:50:41.701502 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1])
Sep 12 22:50:41.701508 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1])
Sep 12 22:50:41.701513 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1])
Sep 12 22:50:41.701519 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1])
Sep 12 22:50:41.701524 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1])
Sep 12 22:50:41.701530 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1])
Sep 12 22:50:41.701535 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1])
Sep 12 22:50:41.701540 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1])
Sep 12 22:50:41.701545 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1])
Sep 12 22:50:41.701550 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1])
Sep 12 22:50:41.701557 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1])
Sep 12 22:50:41.701562 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1])
Sep 12 22:50:41.701568 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1])
Sep 12 22:50:41.701575 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1])
Sep 12 22:50:41.701581 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1])
Sep 12 22:50:41.701587 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1])
Sep 12 22:50:41.701592 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1])
Sep 12 22:50:41.701597 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1])
Sep 12 22:50:41.701603 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1])
Sep 12 22:50:41.701608 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1])
Sep 12 22:50:41.701615 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1])
Sep 12 22:50:41.701624 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1])
Sep 12 22:50:41.701632 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1])
Sep 12 22:50:41.701647 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1])
Sep 12 22:50:41.701653 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1])
Sep 12 22:50:41.701658 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1])
Sep 12 22:50:41.701663 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1])
Sep 12 22:50:41.701668 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1])
Sep 12 22:50:41.701674 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1])
Sep 12 22:50:41.701679 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1])
Sep 12 22:50:41.701686 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1])
Sep 12 22:50:41.701692 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1])
Sep 12 22:50:41.701697 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1])
Sep 12 22:50:41.701702 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1])
Sep 12 22:50:41.701707 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1])
Sep 12 22:50:41.701713 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1])
Sep 12 22:50:41.701718 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1])
Sep 12 22:50:41.701724 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1])
Sep 12 22:50:41.701729 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23
Sep 12 22:50:41.701736 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge)
Sep 12 22:50:41.701741 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Sep 12 22:50:41.701747 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000
Sep 12 22:50:41.701752 kernel: TSC deadline timer available
Sep 12 22:50:41.701758 kernel: CPU topo: Max. logical packages: 128
Sep 12 22:50:41.701763 kernel: CPU topo: Max. logical dies: 128
Sep 12 22:50:41.701768 kernel: CPU topo: Max. dies per package: 1
Sep 12 22:50:41.701774 kernel: CPU topo: Max. threads per core: 1
Sep 12 22:50:41.701779 kernel: CPU topo: Num. cores per package: 1
Sep 12 22:50:41.701784 kernel: CPU topo: Num. threads per package: 1
Sep 12 22:50:41.701791 kernel: CPU topo: Allowing 2 present CPUs plus 126 hotplug CPUs
Sep 12 22:50:41.701796 kernel: [mem 0x80000000-0xefffffff] available for PCI devices
Sep 12 22:50:41.701802 kernel: Booting paravirtualized kernel on VMware hypervisor
Sep 12 22:50:41.701807 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Sep 12 22:50:41.701813 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1
Sep 12 22:50:41.701818 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144
Sep 12 22:50:41.701824 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152
Sep 12 22:50:41.701829 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007
Sep 12 22:50:41.701835 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015
Sep 12 22:50:41.701841 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023
Sep 12 22:50:41.701847 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031
Sep 12 22:50:41.701852 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039
Sep 12 22:50:41.701857 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047
Sep 12 22:50:41.701862 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055
Sep 12 22:50:41.701868 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063
Sep 12 22:50:41.701873 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071
Sep 12 22:50:41.701878 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079
Sep 12 22:50:41.701884 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087
Sep 12 22:50:41.701890 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095
Sep 12 22:50:41.701896 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103
Sep 12 22:50:41.701901 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111
Sep 12 22:50:41.701906 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119
Sep 12 22:50:41.701911 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127
Sep 12 22:50:41.701918 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=8e60d6befc710e967d67e9a1d87ced7416895090c99a765b3a00e66a62f49e40
Sep 12 22:50:41.701924 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 12 22:50:41.701929 kernel: random: crng init done
Sep 12 22:50:41.702086 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes
Sep 12 22:50:41.702095 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes
Sep 12 22:50:41.702101 kernel: printk: log_buf_len min size: 262144 bytes
Sep 12 22:50:41.702107 kernel: printk: log_buf_len: 1048576 bytes
Sep 12 22:50:41.702112 kernel: printk: early log buf free: 245592(93%)
Sep 12 22:50:41.702117 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 12 22:50:41.702123 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Sep 12 22:50:41.702128 kernel: Fallback order for Node 0: 0
Sep 12 22:50:41.702134 kernel: Built 1 zonelists, mobility grouping on. Total pages: 524157
Sep 12 22:50:41.702141 kernel: Policy zone: DMA32
Sep 12 22:50:41.702147 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 12 22:50:41.702153 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1
Sep 12 22:50:41.702158 kernel: ftrace: allocating 40125 entries in 157 pages
Sep 12 22:50:41.702164 kernel: ftrace: allocated 157 pages with 5 groups
Sep 12 22:50:41.702169 kernel: Dynamic Preempt: voluntary
Sep 12 22:50:41.702175 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 12 22:50:41.702181 kernel: rcu: RCU event tracing is enabled.
Sep 12 22:50:41.702186 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128.
Sep 12 22:50:41.702193 kernel: Trampoline variant of Tasks RCU enabled.
Sep 12 22:50:41.702198 kernel: Rude variant of Tasks RCU enabled.
Sep 12 22:50:41.702204 kernel: Tracing variant of Tasks RCU enabled.
Sep 12 22:50:41.702209 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 12 22:50:41.702215 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128
Sep 12 22:50:41.702220 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Sep 12 22:50:41.702226 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Sep 12 22:50:41.702231 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Sep 12 22:50:41.702240 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16
Sep 12 22:50:41.702250 kernel: rcu: srcu_init: Setting srcu_struct sizes to big.
Sep 12 22:50:41.702259 kernel: Console: colour VGA+ 80x25
Sep 12 22:50:41.702269 kernel: printk: legacy console [tty0] enabled
Sep 12 22:50:41.702275 kernel: printk: legacy console [ttyS0] enabled
Sep 12 22:50:41.702280 kernel: ACPI: Core revision 20240827
Sep 12 22:50:41.702286 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns
Sep 12 22:50:41.702291 kernel: APIC: Switch to symmetric I/O mode setup
Sep 12 22:50:41.702299 kernel: x2apic enabled
Sep 12 22:50:41.702307 kernel: APIC: Switched APIC routing to: physical x2apic
Sep 12 22:50:41.702313 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Sep 12 22:50:41.702320 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns
Sep 12 22:50:41.702326 kernel: Calibrating delay loop (skipped) preset value.. 6816.00 BogoMIPS (lpj=3408000)
Sep 12 22:50:41.702331 kernel: Disabled fast string operations
Sep 12 22:50:41.702337 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Sep 12 22:50:41.702342 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4
Sep 12 22:50:41.702347 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Sep 12 22:50:41.702353 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit
Sep 12 22:50:41.702359 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Sep 12 22:50:41.702365 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Sep 12 22:50:41.702371 kernel: RETBleed: Mitigation: Enhanced IBRS
Sep 12 22:50:41.702377 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Sep 12 22:50:41.702382 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Sep 12 22:50:41.702388 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Sep 12 22:50:41.702393 kernel: SRBDS: Unknown: Dependent on hypervisor status
Sep 12 22:50:41.702398 kernel: GDS: Unknown: Dependent on hypervisor status
Sep 12 22:50:41.702407 kernel: active return thunk: its_return_thunk
Sep 12 22:50:41.702413 kernel: ITS: Mitigation: Aligned branch/return thunks
Sep 12 22:50:41.702420 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Sep 12 22:50:41.702426 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Sep 12 22:50:41.702431 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Sep 12 22:50:41.702437 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Sep 12 22:50:41.702442 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Sep 12 22:50:41.702448 kernel: Freeing SMP alternatives memory: 32K
Sep 12 22:50:41.702454 kernel: pid_max: default: 131072 minimum: 1024
Sep 12 22:50:41.702459 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 12 22:50:41.702465 kernel: landlock: Up and running.
Sep 12 22:50:41.702471 kernel: SELinux: Initializing.
Sep 12 22:50:41.702477 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Sep 12 22:50:41.702482 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Sep 12 22:50:41.702488 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd)
Sep 12 22:50:41.702493 kernel: Performance Events: Skylake events, core PMU driver.
Sep 12 22:50:41.702499 kernel: core: CPUID marked event: 'cpu cycles' unavailable
Sep 12 22:50:41.702504 kernel: core: CPUID marked event: 'instructions' unavailable
Sep 12 22:50:41.702510 kernel: core: CPUID marked event: 'bus cycles' unavailable
Sep 12 22:50:41.702515 kernel: core: CPUID marked event: 'cache references' unavailable
Sep 12 22:50:41.702522 kernel: core: CPUID marked event: 'cache misses' unavailable
Sep 12 22:50:41.702527 kernel: core: CPUID marked event: 'branch instructions' unavailable
Sep 12 22:50:41.702535 kernel: core: CPUID marked event: 'branch misses' unavailable
Sep 12 22:50:41.702541 kernel: ... version: 1
Sep 12 22:50:41.702546 kernel: ... bit width: 48
Sep 12 22:50:41.702552 kernel: ... generic registers: 4
Sep 12 22:50:41.702557 kernel: ... value mask: 0000ffffffffffff
Sep 12 22:50:41.702562 kernel: ... max period: 000000007fffffff
Sep 12 22:50:41.702568 kernel: ... fixed-purpose events: 0
Sep 12 22:50:41.702577 kernel: ... event mask: 000000000000000f
Sep 12 22:50:41.702584 kernel: signal: max sigframe size: 1776
Sep 12 22:50:41.702590 kernel: rcu: Hierarchical SRCU implementation.
Sep 12 22:50:41.702595 kernel: rcu: Max phase no-delay instances is 400.
Sep 12 22:50:41.702601 kernel: Timer migration: 3 hierarchy levels; 8 children per group; 3 crossnode level
Sep 12 22:50:41.702606 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Sep 12 22:50:41.702612 kernel: smp: Bringing up secondary CPUs ...
Sep 12 22:50:41.702617 kernel: smpboot: x86: Booting SMP configuration:
Sep 12 22:50:41.702623 kernel: .... node #0, CPUs: #1
Sep 12 22:50:41.702630 kernel: Disabled fast string operations
Sep 12 22:50:41.702648 kernel: smp: Brought up 1 node, 2 CPUs
Sep 12 22:50:41.702654 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS)
Sep 12 22:50:41.702660 kernel: Memory: 1924244K/2096628K available (14336K kernel code, 2432K rwdata, 9992K rodata, 54084K init, 2880K bss, 161008K reserved, 0K cma-reserved)
Sep 12 22:50:41.702665 kernel: devtmpfs: initialized
Sep 12 22:50:41.702671 kernel: x86/mm: Memory block size: 128MB
Sep 12 22:50:41.702676 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes)
Sep 12 22:50:41.702682 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 12 22:50:41.702688 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear)
Sep 12 22:50:41.702695 kernel: pinctrl core: initialized pinctrl subsystem
Sep 12 22:50:41.702700 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 12 22:50:41.702706 kernel: audit: initializing netlink subsys (disabled)
Sep 12 22:50:41.702711 kernel: audit: type=2000 audit(1757717438.288:1): state=initialized audit_enabled=0 res=1
Sep 12 22:50:41.702717 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 12 22:50:41.702722 kernel: thermal_sys: Registered thermal governor 'user_space'
Sep 12 22:50:41.702728 kernel: cpuidle: using governor menu
Sep 12 22:50:41.702733 kernel: Simple Boot Flag at 0x36 set to 0x80
Sep 12 22:50:41.702739 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 12 22:50:41.702745 kernel: dca service started, version 1.12.1
Sep 12 22:50:41.702758 kernel: PCI: ECAM [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) for domain 0000 [bus 00-7f]
Sep 12 22:50:41.702765 kernel: PCI: Using configuration type 1 for base access
Sep 12 22:50:41.702771 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Sep 12 22:50:41.702776 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 12 22:50:41.702782 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Sep 12 22:50:41.702788 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 12 22:50:41.702794 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Sep 12 22:50:41.702800 kernel: ACPI: Added _OSI(Module Device)
Sep 12 22:50:41.702807 kernel: ACPI: Added _OSI(Processor Device)
Sep 12 22:50:41.702812 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 12 22:50:41.702818 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 12 22:50:41.702824 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored
Sep 12 22:50:41.702830 kernel: ACPI: Interpreter enabled
Sep 12 22:50:41.702836 kernel: ACPI: PM: (supports S0 S1 S5)
Sep 12 22:50:41.702844 kernel: ACPI: Using IOAPIC for interrupt routing
Sep 12 22:50:41.702851 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Sep 12 22:50:41.702856 kernel: PCI: Using E820 reservations for host bridge windows
Sep 12 22:50:41.702863 kernel: ACPI: Enabled 4 GPEs in block 00 to 0F
Sep 12 22:50:41.702869 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f])
Sep 12 22:50:41.702963 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 12 22:50:41.703018 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR]
Sep 12 22:50:41.703067 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability]
Sep 12 22:50:41.703076 kernel: PCI host bridge to bus 0000:00
Sep 12 22:50:41.703126 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Sep 12 22:50:41.703406 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window]
Sep 12 22:50:41.703457 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Sep 12 22:50:41.703507 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Sep 12 22:50:41.703561 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window]
Sep 12 22:50:41.703606 kernel: pci_bus 0000:00: root bus resource [bus 00-7f]
Sep 12 22:50:41.703704 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 conventional PCI endpoint
Sep 12 22:50:41.703769 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 conventional PCI bridge
Sep 12 22:50:41.703821 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Sep 12 22:50:41.703883 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 conventional PCI endpoint
Sep 12 22:50:41.703940 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a conventional PCI endpoint
Sep 12 22:50:41.703995 kernel: pci 0000:00:07.1: BAR 4 [io 0x1060-0x106f]
Sep 12 22:50:41.704054 kernel: pci 0000:00:07.1: BAR 0 [io 0x01f0-0x01f7]: legacy IDE quirk
Sep 12 22:50:41.704119 kernel: pci 0000:00:07.1: BAR 1 [io 0x03f6]: legacy IDE quirk
Sep 12 22:50:41.704170 kernel: pci 0000:00:07.1: BAR 2 [io 0x0170-0x0177]: legacy IDE quirk
Sep 12 22:50:41.704220 kernel: pci 0000:00:07.1: BAR 3 [io 0x0376]: legacy IDE quirk
Sep 12 22:50:41.704279 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Sep 12 22:50:41.704330 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed by PIIX4 ACPI
Sep 12 22:50:41.704384 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB
Sep 12 22:50:41.704444 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000 conventional PCI endpoint
Sep 12 22:50:41.704510 kernel: pci 0000:00:07.7: BAR 0 [io 0x1080-0x10bf]
Sep 12 22:50:41.704560 kernel: pci 0000:00:07.7: BAR 1 [mem 0xfebfe000-0xfebfffff 64bit]
Sep 12 22:50:41.704616 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 conventional PCI endpoint
Sep 12 22:50:41.704719 kernel: pci 0000:00:0f.0: BAR 0 [io 0x1070-0x107f]
Sep 12 22:50:41.706395 kernel: pci 0000:00:0f.0: BAR 1 [mem 0xe8000000-0xefffffff pref]
Sep 12 22:50:41.706457 kernel: pci 0000:00:0f.0: BAR 2 [mem 0xfe000000-0xfe7fffff]
Sep 12 22:50:41.706511 kernel: pci 0000:00:0f.0: ROM [mem 0x00000000-0x00007fff pref]
Sep 12 22:50:41.706562 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Sep 12 22:50:41.706622 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 conventional PCI bridge
Sep 12 22:50:41.706684 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode)
Sep 12 22:50:41.706738 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff]
Sep 12 22:50:41.706799 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff]
Sep 12 22:50:41.706867 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref]
Sep 12 22:50:41.706934 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Sep 12 22:50:41.706988 kernel: pci 0000:00:15.0: PCI bridge to [bus 03]
Sep 12 22:50:41.707039 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff]
Sep 12 22:50:41.707090 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff]
Sep 12 22:50:41.707141 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold
Sep 12 22:50:41.707202 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Sep 12 22:50:41.707254 kernel: pci 0000:00:15.1: PCI bridge to [bus 04]
Sep 12 22:50:41.707305 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff]
Sep 12 22:50:41.707358 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff]
Sep 12 22:50:41.707415 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref]
Sep 12 22:50:41.707466 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold
Sep 12 22:50:41.707527 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Sep 12 22:50:41.707593 kernel: pci 0000:00:15.2: PCI bridge to [bus 05]
Sep 12 22:50:41.707662 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff]
Sep 12 22:50:41.707717 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Sep 12 22:50:41.707768 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Sep 12 22:50:41.707818 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold Sep 12 22:50:41.707874 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 22:50:41.707929 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Sep 12 22:50:41.707984 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Sep 12 22:50:41.708040 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Sep 12 22:50:41.708101 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold Sep 12 22:50:41.708161 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 22:50:41.708219 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Sep 12 22:50:41.708270 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Sep 12 22:50:41.708323 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Sep 12 22:50:41.708373 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold Sep 12 22:50:41.708431 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 22:50:41.708483 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Sep 12 22:50:41.708533 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Sep 12 22:50:41.708583 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Sep 12 22:50:41.708744 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot D3cold Sep 12 22:50:41.711711 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 22:50:41.711791 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Sep 12 22:50:41.711852 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Sep 12 22:50:41.711905 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Sep 12 
22:50:41.711956 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold Sep 12 22:50:41.712015 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 22:50:41.712068 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Sep 12 22:50:41.712123 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Sep 12 22:50:41.712179 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Sep 12 22:50:41.712240 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold Sep 12 22:50:41.712295 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 22:50:41.712351 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Sep 12 22:50:41.712405 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Sep 12 22:50:41.712457 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Sep 12 22:50:41.712507 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold Sep 12 22:50:41.712564 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 22:50:41.712620 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Sep 12 22:50:41.713702 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Sep 12 22:50:41.713767 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Sep 12 22:50:41.713822 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Sep 12 22:50:41.713885 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold Sep 12 22:50:41.713945 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 22:50:41.714000 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Sep 12 22:50:41.714066 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Sep 12 22:50:41.714124 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Sep 12 22:50:41.714174 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Sep 12 22:50:41.714235 kernel: pci 0000:00:16.2: PME# supported 
from D0 D3hot D3cold Sep 12 22:50:41.714293 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 22:50:41.714348 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Sep 12 22:50:41.714403 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Sep 12 22:50:41.714453 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Sep 12 22:50:41.714506 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold Sep 12 22:50:41.714562 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 22:50:41.714619 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Sep 12 22:50:41.714710 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Sep 12 22:50:41.714769 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Sep 12 22:50:41.714824 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold Sep 12 22:50:41.714885 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 22:50:41.714942 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Sep 12 22:50:41.714997 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Sep 12 22:50:41.715048 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Sep 12 22:50:41.715104 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold Sep 12 22:50:41.715164 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 22:50:41.715220 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Sep 12 22:50:41.715276 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Sep 12 22:50:41.715340 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Sep 12 22:50:41.715406 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold Sep 12 22:50:41.715462 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 22:50:41.715519 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Sep 12 
22:50:41.715579 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Sep 12 22:50:41.716888 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Sep 12 22:50:41.716952 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold Sep 12 22:50:41.717017 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 22:50:41.717076 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Sep 12 22:50:41.717128 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Sep 12 22:50:41.717178 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Sep 12 22:50:41.717240 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Sep 12 22:50:41.717299 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold Sep 12 22:50:41.717360 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 22:50:41.717417 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Sep 12 22:50:41.717496 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Sep 12 22:50:41.717703 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Sep 12 22:50:41.717761 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Sep 12 22:50:41.718183 kernel: pci 0000:00:17.1: PME# supported from D0 D3hot D3cold Sep 12 22:50:41.718249 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 22:50:41.718314 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Sep 12 22:50:41.718373 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Sep 12 22:50:41.718424 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Sep 12 22:50:41.718479 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Sep 12 22:50:41.718540 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold Sep 12 22:50:41.718609 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 22:50:41.719323 kernel: pci 
0000:00:17.3: PCI bridge to [bus 16] Sep 12 22:50:41.719382 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Sep 12 22:50:41.719445 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Sep 12 22:50:41.719498 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold Sep 12 22:50:41.719558 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 22:50:41.719623 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Sep 12 22:50:41.720154 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Sep 12 22:50:41.720213 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Sep 12 22:50:41.720271 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold Sep 12 22:50:41.720337 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 22:50:41.720391 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Sep 12 22:50:41.720451 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Sep 12 22:50:41.720501 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Sep 12 22:50:41.720563 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold Sep 12 22:50:41.721137 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 22:50:41.721210 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Sep 12 22:50:41.721270 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Sep 12 22:50:41.721323 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Sep 12 22:50:41.721378 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold Sep 12 22:50:41.721444 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 22:50:41.721498 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Sep 12 22:50:41.721562 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Sep 12 22:50:41.721617 kernel: pci 0000:00:17.7: bridge window [mem 
0xe5e00000-0xe5efffff 64bit pref] Sep 12 22:50:41.721684 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold Sep 12 22:50:41.721746 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 22:50:41.721802 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Sep 12 22:50:41.721868 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Sep 12 22:50:41.721920 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Sep 12 22:50:41.721975 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Sep 12 22:50:41.722027 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold Sep 12 22:50:41.722086 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 22:50:41.722138 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Sep 12 22:50:41.722196 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Sep 12 22:50:41.722260 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Sep 12 22:50:41.722322 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Sep 12 22:50:41.722391 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold Sep 12 22:50:41.722457 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 22:50:41.722521 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Sep 12 22:50:41.722589 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Sep 12 22:50:41.724678 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Sep 12 22:50:41.724746 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold Sep 12 22:50:41.724851 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 22:50:41.724910 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Sep 12 22:50:41.724965 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Sep 12 22:50:41.725027 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Sep 
12 22:50:41.726930 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold Sep 12 22:50:41.727012 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 22:50:41.727097 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Sep 12 22:50:41.727153 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Sep 12 22:50:41.727210 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Sep 12 22:50:41.727281 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold Sep 12 22:50:41.727345 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 22:50:41.727415 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Sep 12 22:50:41.727487 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Sep 12 22:50:41.727554 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Sep 12 22:50:41.727616 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold Sep 12 22:50:41.727706 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 22:50:41.727764 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Sep 12 22:50:41.727839 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Sep 12 22:50:41.727896 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Sep 12 22:50:41.727963 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold Sep 12 22:50:41.728037 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 22:50:41.728101 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Sep 12 22:50:41.728159 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Sep 12 22:50:41.728234 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Sep 12 22:50:41.728287 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold Sep 12 22:50:41.728365 kernel: pci_bus 0000:01: extended config space not accessible Sep 12 22:50:41.728429 kernel: pci 
0000:00:01.0: PCI bridge to [bus 01] Sep 12 22:50:41.728488 kernel: pci_bus 0000:02: extended config space not accessible Sep 12 22:50:41.728503 kernel: acpiphp: Slot [32] registered Sep 12 22:50:41.728510 kernel: acpiphp: Slot [33] registered Sep 12 22:50:41.728517 kernel: acpiphp: Slot [34] registered Sep 12 22:50:41.728527 kernel: acpiphp: Slot [35] registered Sep 12 22:50:41.728536 kernel: acpiphp: Slot [36] registered Sep 12 22:50:41.728547 kernel: acpiphp: Slot [37] registered Sep 12 22:50:41.728555 kernel: acpiphp: Slot [38] registered Sep 12 22:50:41.728561 kernel: acpiphp: Slot [39] registered Sep 12 22:50:41.728566 kernel: acpiphp: Slot [40] registered Sep 12 22:50:41.728574 kernel: acpiphp: Slot [41] registered Sep 12 22:50:41.728579 kernel: acpiphp: Slot [42] registered Sep 12 22:50:41.728585 kernel: acpiphp: Slot [43] registered Sep 12 22:50:41.728591 kernel: acpiphp: Slot [44] registered Sep 12 22:50:41.728597 kernel: acpiphp: Slot [45] registered Sep 12 22:50:41.728603 kernel: acpiphp: Slot [46] registered Sep 12 22:50:41.728609 kernel: acpiphp: Slot [47] registered Sep 12 22:50:41.728614 kernel: acpiphp: Slot [48] registered Sep 12 22:50:41.728620 kernel: acpiphp: Slot [49] registered Sep 12 22:50:41.728627 kernel: acpiphp: Slot [50] registered Sep 12 22:50:41.728633 kernel: acpiphp: Slot [51] registered Sep 12 22:50:41.733727 kernel: acpiphp: Slot [52] registered Sep 12 22:50:41.733739 kernel: acpiphp: Slot [53] registered Sep 12 22:50:41.733748 kernel: acpiphp: Slot [54] registered Sep 12 22:50:41.733756 kernel: acpiphp: Slot [55] registered Sep 12 22:50:41.733762 kernel: acpiphp: Slot [56] registered Sep 12 22:50:41.733771 kernel: acpiphp: Slot [57] registered Sep 12 22:50:41.733777 kernel: acpiphp: Slot [58] registered Sep 12 22:50:41.733783 kernel: acpiphp: Slot [59] registered Sep 12 22:50:41.733792 kernel: acpiphp: Slot [60] registered Sep 12 22:50:41.733798 kernel: acpiphp: Slot [61] registered Sep 12 22:50:41.733804 kernel: acpiphp: Slot 
[62] registered Sep 12 22:50:41.733810 kernel: acpiphp: Slot [63] registered Sep 12 22:50:41.733894 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Sep 12 22:50:41.733963 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode) Sep 12 22:50:41.734024 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode) Sep 12 22:50:41.734083 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode) Sep 12 22:50:41.734139 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode) Sep 12 22:50:41.734198 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode) Sep 12 22:50:41.734258 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 PCIe Endpoint Sep 12 22:50:41.734326 kernel: pci 0000:03:00.0: BAR 0 [io 0x4000-0x4007] Sep 12 22:50:41.734388 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfd5f8000-0xfd5fffff 64bit] Sep 12 22:50:41.734446 kernel: pci 0000:03:00.0: ROM [mem 0x00000000-0x0000ffff pref] Sep 12 22:50:41.734504 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Sep 12 22:50:41.734568 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Sep 12 22:50:41.734631 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Sep 12 22:50:41.734707 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Sep 12 22:50:41.734770 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Sep 12 22:50:41.734829 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Sep 12 22:50:41.734890 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Sep 12 22:50:41.734950 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Sep 12 22:50:41.735012 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Sep 12 22:50:41.735076 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Sep 12 22:50:41.735156 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 PCIe Endpoint Sep 12 22:50:41.735220 kernel: pci 0000:0b:00.0: BAR 0 [mem 0xfd4fc000-0xfd4fcfff] Sep 12 22:50:41.735284 kernel: pci 0000:0b:00.0: BAR 1 [mem 0xfd4fd000-0xfd4fdfff] Sep 12 22:50:41.735343 kernel: pci 0000:0b:00.0: BAR 2 [mem 0xfd4fe000-0xfd4fffff] Sep 12 22:50:41.735400 kernel: pci 0000:0b:00.0: BAR 3 [io 0x5000-0x500f] Sep 12 22:50:41.735457 kernel: pci 0000:0b:00.0: ROM [mem 0x00000000-0x0000ffff pref] Sep 12 22:50:41.735529 kernel: pci 0000:0b:00.0: supports D1 D2 Sep 12 22:50:41.735597 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold Sep 12 22:50:41.737973 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Sep 12 22:50:41.738044 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Sep 12 22:50:41.738105 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Sep 12 22:50:41.738169 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Sep 12 22:50:41.738244 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Sep 12 22:50:41.738304 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Sep 12 22:50:41.738357 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Sep 12 22:50:41.738423 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Sep 12 22:50:41.738488 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Sep 12 22:50:41.738547 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Sep 12 22:50:41.738612 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Sep 12 22:50:41.738689 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Sep 12 22:50:41.738755 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Sep 12 22:50:41.738826 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Sep 12 22:50:41.738888 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Sep 12 22:50:41.738943 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Sep 12 22:50:41.739003 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Sep 12 22:50:41.739069 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Sep 12 22:50:41.739132 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Sep 12 22:50:41.739190 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Sep 12 22:50:41.739257 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Sep 12 22:50:41.739310 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Sep 12 22:50:41.739362 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Sep 12 22:50:41.739421 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Sep 12 22:50:41.739484 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Sep 12 22:50:41.739493 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9 Sep 12 22:50:41.739502 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0 Sep 12 22:50:41.739510 kernel: ACPI: PCI: Interrupt link LNKB 
disabled Sep 12 22:50:41.739519 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Sep 12 22:50:41.739528 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10 Sep 12 22:50:41.739537 kernel: iommu: Default domain type: Translated Sep 12 22:50:41.739546 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Sep 12 22:50:41.739555 kernel: PCI: Using ACPI for IRQ routing Sep 12 22:50:41.739561 kernel: PCI: pci_cache_line_size set to 64 bytes Sep 12 22:50:41.739567 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff] Sep 12 22:50:41.739573 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff] Sep 12 22:50:41.739627 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device Sep 12 22:50:41.739714 kernel: pci 0000:00:0f.0: vgaarb: bridge control possible Sep 12 22:50:41.739775 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Sep 12 22:50:41.739784 kernel: vgaarb: loaded Sep 12 22:50:41.739791 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 Sep 12 22:50:41.739797 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter Sep 12 22:50:41.739803 kernel: clocksource: Switched to clocksource tsc-early Sep 12 22:50:41.739810 kernel: VFS: Disk quotas dquot_6.6.0 Sep 12 22:50:41.739818 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Sep 12 22:50:41.739827 kernel: pnp: PnP ACPI init Sep 12 22:50:41.739895 kernel: system 00:00: [io 0x1000-0x103f] has been reserved Sep 12 22:50:41.739967 kernel: system 00:00: [io 0x1040-0x104f] has been reserved Sep 12 22:50:41.740025 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved Sep 12 22:50:41.740078 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved Sep 12 22:50:41.740142 kernel: pnp 00:06: [dma 2] Sep 12 22:50:41.740205 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved Sep 12 22:50:41.740268 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved Sep 12 
22:50:41.740334 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved Sep 12 22:50:41.740348 kernel: pnp: PnP ACPI: found 8 devices Sep 12 22:50:41.740357 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Sep 12 22:50:41.740363 kernel: NET: Registered PF_INET protocol family Sep 12 22:50:41.740369 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Sep 12 22:50:41.740385 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Sep 12 22:50:41.740392 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Sep 12 22:50:41.740407 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Sep 12 22:50:41.740415 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Sep 12 22:50:41.740424 kernel: TCP: Hash tables configured (established 16384 bind 16384) Sep 12 22:50:41.740435 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Sep 12 22:50:41.740444 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Sep 12 22:50:41.740450 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Sep 12 22:50:41.740457 kernel: NET: Registered PF_XDP protocol family Sep 12 22:50:41.740529 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 Sep 12 22:50:41.740592 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Sep 12 22:50:41.742676 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Sep 12 22:50:41.742764 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Sep 12 22:50:41.742843 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Sep 12 22:50:41.742911 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Sep 12 22:50:41.742998 
kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Sep 12 22:50:41.743066 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Sep 12 22:50:41.743130 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Sep 12 22:50:41.743195 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Sep 12 22:50:41.743257 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Sep 12 22:50:41.743314 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Sep 12 22:50:41.743380 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Sep 12 22:50:41.743453 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Sep 12 22:50:41.743524 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Sep 12 22:50:41.743588 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Sep 12 22:50:41.746046 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Sep 12 22:50:41.746128 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Sep 12 22:50:41.746194 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Sep 12 22:50:41.746255 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Sep 12 22:50:41.746320 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Sep 12 22:50:41.746381 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Sep 12 22:50:41.746446 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000 Sep 12 22:50:41.746509 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref]: assigned Sep 12 22:50:41.746576 kernel: pci 
0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref]: assigned Sep 12 22:50:41.746652 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.746710 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.746774 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.746830 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.746899 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.746962 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.747024 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.747080 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.747141 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.747201 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.747265 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.747328 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.747396 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.747478 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.747544 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.747603 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.749963 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.750043 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.750111 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space 
Sep 12 22:50:41.750172 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.750232 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.750292 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.750359 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.750423 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.750483 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.750543 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.750597 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.750670 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.750734 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.750791 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.750854 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.750914 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.750974 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.751028 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.751089 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.751152 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.751220 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.751292 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.751354 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: 
can't assign; no space Sep 12 22:50:41.751412 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.751470 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.751534 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.751594 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.752064 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.752130 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.752194 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.752267 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.752324 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.752390 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.752452 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.752512 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.752583 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.752666 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.752737 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.752795 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.752852 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.752927 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.752988 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.753052 kernel: pci 0000:00:17.5: bridge 
window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.753112 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.753167 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.753221 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.753290 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.753349 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.753410 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.753485 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.753543 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.753612 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.753844 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.753919 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.753980 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.754205 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.754279 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.754338 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.754413 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.754483 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.754542 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.754601 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.754687 kernel: pci 
0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.754743 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.754800 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.754875 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.754938 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space Sep 12 22:50:41.754998 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign Sep 12 22:50:41.755321 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Sep 12 22:50:41.755394 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] Sep 12 22:50:41.755459 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Sep 12 22:50:41.755525 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Sep 12 22:50:41.755582 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Sep 12 22:50:41.755676 kernel: pci 0000:03:00.0: ROM [mem 0xfd500000-0xfd50ffff pref]: assigned Sep 12 22:50:41.755753 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Sep 12 22:50:41.755812 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Sep 12 22:50:41.755878 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Sep 12 22:50:41.755937 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref] Sep 12 22:50:41.755995 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Sep 12 22:50:41.756053 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Sep 12 22:50:41.756118 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Sep 12 22:50:41.756177 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Sep 12 22:50:41.756237 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Sep 12 22:50:41.756303 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Sep 12 22:50:41.756366 kernel: pci 0000:00:15.2: bridge window 
[mem 0xfcd00000-0xfcdfffff] Sep 12 22:50:41.756423 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Sep 12 22:50:41.756479 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Sep 12 22:50:41.756535 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Sep 12 22:50:41.756593 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Sep 12 22:50:41.758392 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Sep 12 22:50:41.758467 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Sep 12 22:50:41.758542 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Sep 12 22:50:41.758613 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Sep 12 22:50:41.758700 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Sep 12 22:50:41.758761 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Sep 12 22:50:41.758821 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Sep 12 22:50:41.758879 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Sep 12 22:50:41.758938 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Sep 12 22:50:41.758994 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Sep 12 22:50:41.759067 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Sep 12 22:50:41.759151 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Sep 12 22:50:41.759214 kernel: pci 0000:0b:00.0: ROM [mem 0xfd400000-0xfd40ffff pref]: assigned Sep 12 22:50:41.759272 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Sep 12 22:50:41.759324 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Sep 12 22:50:41.759375 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Sep 12 22:50:41.759430 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref] Sep 12 22:50:41.759484 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Sep 12 22:50:41.759543 
kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Sep 12 22:50:41.759615 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Sep 12 22:50:41.761720 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Sep 12 22:50:41.761795 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Sep 12 22:50:41.761853 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Sep 12 22:50:41.761914 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Sep 12 22:50:41.761975 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Sep 12 22:50:41.762034 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Sep 12 22:50:41.762084 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Sep 12 22:50:41.762136 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Sep 12 22:50:41.762203 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Sep 12 22:50:41.762255 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Sep 12 22:50:41.762319 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Sep 12 22:50:41.762401 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Sep 12 22:50:41.762473 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Sep 12 22:50:41.762536 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Sep 12 22:50:41.762591 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Sep 12 22:50:41.762673 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Sep 12 22:50:41.762736 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Sep 12 22:50:41.762788 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Sep 12 22:50:41.762839 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Sep 12 22:50:41.762895 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Sep 12 22:50:41.762963 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Sep 
12 22:50:41.763015 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Sep 12 22:50:41.763072 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Sep 12 22:50:41.763130 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Sep 12 22:50:41.763192 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Sep 12 22:50:41.763252 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Sep 12 22:50:41.763313 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Sep 12 22:50:41.763380 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Sep 12 22:50:41.763439 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Sep 12 22:50:41.763493 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Sep 12 22:50:41.763543 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Sep 12 22:50:41.763603 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Sep 12 22:50:41.764121 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Sep 12 22:50:41.764193 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Sep 12 22:50:41.764278 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Sep 12 22:50:41.764347 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Sep 12 22:50:41.764401 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Sep 12 22:50:41.764453 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Sep 12 22:50:41.764510 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Sep 12 22:50:41.764574 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Sep 12 22:50:41.765665 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Sep 12 22:50:41.765764 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Sep 12 22:50:41.765834 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Sep 12 22:50:41.765900 kernel: pci 0000:00:17.6: bridge window [mem 
0xe6200000-0xe62fffff 64bit pref] Sep 12 22:50:41.765962 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Sep 12 22:50:41.766027 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Sep 12 22:50:41.766085 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Sep 12 22:50:41.766150 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Sep 12 22:50:41.766232 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Sep 12 22:50:41.766300 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Sep 12 22:50:41.766380 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Sep 12 22:50:41.766449 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Sep 12 22:50:41.766515 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Sep 12 22:50:41.766582 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Sep 12 22:50:41.766663 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Sep 12 22:50:41.766735 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Sep 12 22:50:41.767227 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Sep 12 22:50:41.767297 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Sep 12 22:50:41.767369 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Sep 12 22:50:41.767455 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Sep 12 22:50:41.767525 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Sep 12 22:50:41.767602 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Sep 12 22:50:41.767730 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Sep 12 22:50:41.767799 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Sep 12 22:50:41.767864 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Sep 12 22:50:41.767940 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Sep 12 22:50:41.768011 kernel: pci 0000:00:18.5: 
bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Sep 12 22:50:41.768079 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Sep 12 22:50:41.768144 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Sep 12 22:50:41.768206 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Sep 12 22:50:41.768275 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Sep 12 22:50:41.768341 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Sep 12 22:50:41.768410 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Sep 12 22:50:41.768481 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] Sep 12 22:50:41.768540 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] Sep 12 22:50:41.768598 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] Sep 12 22:50:41.768669 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] Sep 12 22:50:41.768726 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] Sep 12 22:50:41.768800 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] Sep 12 22:50:41.768862 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] Sep 12 22:50:41.768920 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] Sep 12 22:50:41.768992 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] Sep 12 22:50:41.769056 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] Sep 12 22:50:41.769116 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] Sep 12 22:50:41.769177 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] Sep 12 22:50:41.769241 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] Sep 12 22:50:41.769308 kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff] Sep 12 22:50:41.769365 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] Sep 12 22:50:41.769433 kernel: pci_bus 0000:03: resource 2 [mem 
0xc0000000-0xc01fffff 64bit pref] Sep 12 22:50:41.769498 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] Sep 12 22:50:41.771105 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] Sep 12 22:50:41.771155 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] Sep 12 22:50:41.771231 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] Sep 12 22:50:41.771295 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] Sep 12 22:50:41.771348 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] Sep 12 22:50:41.771403 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] Sep 12 22:50:41.771474 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] Sep 12 22:50:41.771548 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] Sep 12 22:50:41.771621 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] Sep 12 22:50:41.771723 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] Sep 12 22:50:41.771771 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] Sep 12 22:50:41.771831 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] Sep 12 22:50:41.771886 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] Sep 12 22:50:41.771937 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] Sep 12 22:50:41.771983 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] Sep 12 22:50:41.772039 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] Sep 12 22:50:41.772090 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] Sep 12 22:50:41.772145 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] Sep 12 22:50:41.772217 kernel: pci_bus 0000:0c: resource 0 [io 0x9000-0x9fff] Sep 12 22:50:41.772273 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] Sep 12 22:50:41.772322 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 
64bit pref] Sep 12 22:50:41.772379 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] Sep 12 22:50:41.772433 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] Sep 12 22:50:41.772479 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] Sep 12 22:50:41.772531 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] Sep 12 22:50:41.772593 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] Sep 12 22:50:41.773500 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] Sep 12 22:50:41.773571 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] Sep 12 22:50:41.773654 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] Sep 12 22:50:41.773733 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] Sep 12 22:50:41.773817 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] Sep 12 22:50:41.773891 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] Sep 12 22:50:41.773950 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] Sep 12 22:50:41.774017 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] Sep 12 22:50:41.774087 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] Sep 12 22:50:41.774153 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] Sep 12 22:50:41.774209 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] Sep 12 22:50:41.774275 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] Sep 12 22:50:41.774343 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] Sep 12 22:50:41.774391 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] Sep 12 22:50:41.774455 kernel: pci_bus 0000:15: resource 0 [io 0xe000-0xefff] Sep 12 22:50:41.774517 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] Sep 12 22:50:41.774569 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] Sep 12 
22:50:41.774658 kernel: pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] Sep 12 22:50:41.774709 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] Sep 12 22:50:41.774769 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] Sep 12 22:50:41.774828 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] Sep 12 22:50:41.774897 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] Sep 12 22:50:41.774969 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] Sep 12 22:50:41.775042 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] Sep 12 22:50:41.775095 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] Sep 12 22:50:41.775155 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] Sep 12 22:50:41.775217 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] Sep 12 22:50:41.775278 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Sep 12 22:50:41.775345 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] Sep 12 22:50:41.775404 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] Sep 12 22:50:41.775456 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] Sep 12 22:50:41.775503 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] Sep 12 22:50:41.775575 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] Sep 12 22:50:41.776828 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] Sep 12 22:50:41.776900 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] Sep 12 22:50:41.776979 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] Sep 12 22:50:41.777034 kernel: pci_bus 0000:1e: resource 2 [mem 0xe6d00000-0xe6dfffff 64bit pref] Sep 12 22:50:41.777111 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] Sep 12 22:50:41.777172 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] 
Sep 12 22:50:41.777238 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] Sep 12 22:50:41.777300 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] Sep 12 22:50:41.777372 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] Sep 12 22:50:41.777437 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] Sep 12 22:50:41.777498 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] Sep 12 22:50:41.777563 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] Sep 12 22:50:41.777646 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Sep 12 22:50:41.777663 kernel: PCI: CLS 32 bytes, default 64 Sep 12 22:50:41.777670 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Sep 12 22:50:41.777681 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Sep 12 22:50:41.777691 kernel: clocksource: Switched to clocksource tsc Sep 12 22:50:41.777700 kernel: Initialise system trusted keyrings Sep 12 22:50:41.777710 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Sep 12 22:50:41.777720 kernel: Key type asymmetric registered Sep 12 22:50:41.777726 kernel: Asymmetric key parser 'x509' registered Sep 12 22:50:41.777734 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Sep 12 22:50:41.777740 kernel: io scheduler mq-deadline registered Sep 12 22:50:41.777747 kernel: io scheduler kyber registered Sep 12 22:50:41.777756 kernel: io scheduler bfq registered Sep 12 22:50:41.777820 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 Sep 12 22:50:41.777877 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 22:50:41.777959 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 Sep 12 22:50:41.778013 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 
AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 22:50:41.778088 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 Sep 12 22:50:41.778161 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 22:50:41.778233 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 Sep 12 22:50:41.778306 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 22:50:41.778361 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 Sep 12 22:50:41.778431 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 22:50:41.778508 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 Sep 12 22:50:41.778592 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 22:50:41.778699 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 Sep 12 22:50:41.778754 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 22:50:41.778810 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 Sep 12 22:50:41.778886 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 22:50:41.778957 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 Sep 12 22:50:41.779013 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 22:50:41.779073 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 Sep 12 22:50:41.779135 kernel: pcieport 
0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 22:50:41.779203 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 Sep 12 22:50:41.779267 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 22:50:41.779338 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 Sep 12 22:50:41.779412 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 22:50:41.779483 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 Sep 12 22:50:41.779545 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 22:50:41.779622 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 Sep 12 22:50:41.779700 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 22:50:41.779766 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 Sep 12 22:50:41.779837 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 22:50:41.779890 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 Sep 12 22:50:41.779957 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 22:50:41.780040 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 Sep 12 22:50:41.780103 kernel: pcieport 0000:00:17.0: pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 22:50:41.780169 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 Sep 12 
22:50:41.780240 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 22:50:41.780293 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42 Sep 12 22:50:41.780373 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 22:50:41.780445 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43 Sep 12 22:50:41.780523 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 22:50:41.780586 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44 Sep 12 22:50:41.780674 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 22:50:41.780747 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45 Sep 12 22:50:41.780802 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 22:50:41.780862 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46 Sep 12 22:50:41.780921 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 22:50:41.780996 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47 Sep 12 22:50:41.781069 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 22:50:41.781131 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48 Sep 12 22:50:41.781188 kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 22:50:41.781264 kernel: pcieport 0000:00:18.1: PME: 
Signaling with IRQ 49 Sep 12 22:50:41.781339 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 22:50:41.781410 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50 Sep 12 22:50:41.781475 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 22:50:41.781538 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51 Sep 12 22:50:41.781591 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 22:50:41.781708 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52 Sep 12 22:50:41.781782 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 22:50:41.781854 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53 Sep 12 22:50:41.781928 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 22:50:41.781994 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54 Sep 12 22:50:41.782047 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 22:50:41.782112 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55 Sep 12 22:50:41.782186 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 22:50:41.782198 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Sep 12 22:50:41.782206 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 12 22:50:41.782214 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Sep 12 
22:50:41.782221 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 Sep 12 22:50:41.782227 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Sep 12 22:50:41.782234 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Sep 12 22:50:41.782297 kernel: rtc_cmos 00:01: registered as rtc0 Sep 12 22:50:41.782380 kernel: rtc_cmos 00:01: setting system clock to 2025-09-12T22:50:41 UTC (1757717441) Sep 12 22:50:41.782458 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram Sep 12 22:50:41.782470 kernel: intel_pstate: CPU model not supported Sep 12 22:50:41.782476 kernel: NET: Registered PF_INET6 protocol family Sep 12 22:50:41.782483 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Sep 12 22:50:41.782489 kernel: Segment Routing with IPv6 Sep 12 22:50:41.782495 kernel: In-situ OAM (IOAM) with IPv6 Sep 12 22:50:41.782505 kernel: NET: Registered PF_PACKET protocol family Sep 12 22:50:41.782518 kernel: Key type dns_resolver registered Sep 12 22:50:41.782528 kernel: IPI shorthand broadcast: enabled Sep 12 22:50:41.782535 kernel: sched_clock: Marking stable (2721004048, 170614096)->(2905698011, -14079867) Sep 12 22:50:41.782541 kernel: registered taskstats version 1 Sep 12 22:50:41.782547 kernel: Loading compiled-in X.509 certificates Sep 12 22:50:41.782554 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.47-flatcar: c3297a5801573420030c321362a802da1fd49c4e' Sep 12 22:50:41.782560 kernel: Demotion targets for Node 0: null Sep 12 22:50:41.782566 kernel: Key type .fscrypt registered Sep 12 22:50:41.782572 kernel: Key type fscrypt-provisioning registered Sep 12 22:50:41.782581 kernel: ima: No TPM chip found, activating TPM-bypass! 
Sep 12 22:50:41.782591 kernel: ima: Allocated hash algorithm: sha1 Sep 12 22:50:41.782604 kernel: ima: No architecture policies found Sep 12 22:50:41.782615 kernel: clk: Disabling unused clocks Sep 12 22:50:41.782626 kernel: Warning: unable to open an initial console. Sep 12 22:50:41.782651 kernel: Freeing unused kernel image (initmem) memory: 54084K Sep 12 22:50:41.782659 kernel: Write protecting the kernel read-only data: 24576k Sep 12 22:50:41.782666 kernel: Freeing unused kernel image (rodata/data gap) memory: 248K Sep 12 22:50:41.782674 kernel: Run /init as init process Sep 12 22:50:41.782680 kernel: with arguments: Sep 12 22:50:41.782687 kernel: /init Sep 12 22:50:41.782693 kernel: with environment: Sep 12 22:50:41.782701 kernel: HOME=/ Sep 12 22:50:41.782707 kernel: TERM=linux Sep 12 22:50:41.782714 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 12 22:50:41.782721 systemd[1]: Successfully made /usr/ read-only. Sep 12 22:50:41.782729 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 12 22:50:41.782738 systemd[1]: Detected virtualization vmware. Sep 12 22:50:41.782744 systemd[1]: Detected architecture x86-64. Sep 12 22:50:41.782750 systemd[1]: Running in initrd. Sep 12 22:50:41.782756 systemd[1]: No hostname configured, using default hostname. Sep 12 22:50:41.782763 systemd[1]: Hostname set to . Sep 12 22:50:41.782770 systemd[1]: Initializing machine ID from random generator. Sep 12 22:50:41.782776 systemd[1]: Queued start job for default target initrd.target. Sep 12 22:50:41.782782 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. 
Sep 12 22:50:41.782790 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 22:50:41.782797 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Sep 12 22:50:41.782804 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 12 22:50:41.782810 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 12 22:50:41.782817 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 12 22:50:41.782824 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 12 22:50:41.782834 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 12 22:50:41.782845 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 22:50:41.782857 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 12 22:50:41.782868 systemd[1]: Reached target paths.target - Path Units. Sep 12 22:50:41.782879 systemd[1]: Reached target slices.target - Slice Units. Sep 12 22:50:41.782886 systemd[1]: Reached target swap.target - Swaps. Sep 12 22:50:41.782895 systemd[1]: Reached target timers.target - Timer Units. Sep 12 22:50:41.782903 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 12 22:50:41.782912 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 12 22:50:41.782920 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 12 22:50:41.782927 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Sep 12 22:50:41.782934 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. 
Sep 12 22:50:41.782943 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 12 22:50:41.782954 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 22:50:41.782965 systemd[1]: Reached target sockets.target - Socket Units. Sep 12 22:50:41.782976 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 12 22:50:41.782987 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 12 22:50:41.782997 systemd[1]: Finished network-cleanup.service - Network Cleanup. Sep 12 22:50:41.783004 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 12 22:50:41.783011 systemd[1]: Starting systemd-fsck-usr.service... Sep 12 22:50:41.783018 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 12 22:50:41.783024 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 12 22:50:41.783031 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 22:50:41.783037 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 12 22:50:41.783065 systemd-journald[244]: Collecting audit messages is disabled. Sep 12 22:50:41.783092 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 22:50:41.783103 systemd[1]: Finished systemd-fsck-usr.service. Sep 12 22:50:41.783115 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 12 22:50:41.783125 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 12 22:50:41.783132 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. 
Sep 12 22:50:41.783139 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 12 22:50:41.783146 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 22:50:41.783153 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 12 22:50:41.783160 kernel: Bridge firewalling registered Sep 12 22:50:41.783168 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 12 22:50:41.783175 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 12 22:50:41.783182 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 22:50:41.783189 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 12 22:50:41.783196 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 12 22:50:41.783204 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 12 22:50:41.783212 systemd-journald[244]: Journal started Sep 12 22:50:41.783228 systemd-journald[244]: Runtime Journal (/run/log/journal/da39c86cf2bb43dd9b271749f16ac9e3) is 4.8M, max 38.8M, 34M free. Sep 12 22:50:41.717989 systemd-modules-load[245]: Inserted module 'overlay' Sep 12 22:50:41.751680 systemd-modules-load[245]: Inserted module 'br_netfilter' Sep 12 22:50:41.784771 systemd[1]: Started systemd-journald.service - Journal Service. Sep 12 22:50:41.785731 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... 
Sep 12 22:50:41.794098 dracut-cmdline[271]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=8e60d6befc710e967d67e9a1d87ced7416895090c99a765b3a00e66a62f49e40 Sep 12 22:50:41.794868 systemd-tmpfiles[275]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 12 22:50:41.797570 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 22:50:41.798953 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 12 22:50:41.833333 systemd-resolved[300]: Positive Trust Anchors: Sep 12 22:50:41.833342 systemd-resolved[300]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 12 22:50:41.833365 systemd-resolved[300]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 12 22:50:41.835693 systemd-resolved[300]: Defaulting to hostname 'linux'. Sep 12 22:50:41.836718 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 12 22:50:41.837158 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. 
Sep 12 22:50:41.858654 kernel: SCSI subsystem initialized Sep 12 22:50:41.874649 kernel: Loading iSCSI transport class v2.0-870. Sep 12 22:50:41.882650 kernel: iscsi: registered transport (tcp) Sep 12 22:50:41.904648 kernel: iscsi: registered transport (qla4xxx) Sep 12 22:50:41.904677 kernel: QLogic iSCSI HBA Driver Sep 12 22:50:41.915115 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 12 22:50:41.921415 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 22:50:41.922547 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 12 22:50:41.944143 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 12 22:50:41.945223 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 12 22:50:41.994658 kernel: raid6: avx2x4 gen() 38058 MB/s Sep 12 22:50:42.011662 kernel: raid6: avx2x2 gen() 45023 MB/s Sep 12 22:50:42.028980 kernel: raid6: avx2x1 gen() 36782 MB/s Sep 12 22:50:42.029028 kernel: raid6: using algorithm avx2x2 gen() 45023 MB/s Sep 12 22:50:42.047000 kernel: raid6: .... xor() 26658 MB/s, rmw enabled Sep 12 22:50:42.047085 kernel: raid6: using avx2x2 recovery algorithm Sep 12 22:50:42.061654 kernel: xor: automatically using best checksumming function avx Sep 12 22:50:42.169654 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 12 22:50:42.172880 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 12 22:50:42.174008 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 22:50:42.194606 systemd-udevd[492]: Using default interface naming scheme 'v255'. Sep 12 22:50:42.198185 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 22:50:42.199235 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... 
Sep 12 22:50:42.219833 dracut-pre-trigger[499]: rd.md=0: removing MD RAID activation Sep 12 22:50:42.235605 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 12 22:50:42.236715 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 12 22:50:42.331492 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 22:50:42.332447 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 12 22:50:42.408666 kernel: VMware PVSCSI driver - version 1.0.7.0-k Sep 12 22:50:42.408708 kernel: vmw_pvscsi: using 64bit dma Sep 12 22:50:42.408718 kernel: vmw_pvscsi: max_id: 16 Sep 12 22:50:42.408725 kernel: vmw_pvscsi: setting ring_pages to 8 Sep 12 22:50:42.431396 kernel: VMware vmxnet3 virtual NIC driver - version 1.9.0.0-k-NAPI Sep 12 22:50:42.431433 kernel: libata version 3.00 loaded. Sep 12 22:50:42.431447 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2 Sep 12 22:50:42.434775 kernel: vmw_pvscsi: enabling reqCallThreshold Sep 12 22:50:42.434796 kernel: vmw_pvscsi: driver-based request coalescing enabled Sep 12 22:50:42.434804 kernel: vmw_pvscsi: using MSI-X Sep 12 22:50:42.438339 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254 Sep 12 22:50:42.438455 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0 Sep 12 22:50:42.440641 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6 Sep 12 22:50:42.444747 kernel: ata_piix 0000:00:07.1: version 2.13 Sep 12 22:50:42.446660 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps Sep 12 22:50:42.446752 kernel: scsi host1: ata_piix Sep 12 22:50:42.450573 kernel: scsi host2: ata_piix Sep 12 22:50:42.450683 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14 lpm-pol 0 Sep 12 22:50:42.450694 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15 lpm-pol 0 Sep 12 22:50:42.458664 kernel: cryptd: 
max_cpu_qlen set to 1000 Sep 12 22:50:42.463138 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0 Sep 12 22:50:42.463282 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input2 Sep 12 22:50:42.462405 (udev-worker)[547]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte. Sep 12 22:50:42.465839 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 22:50:42.469924 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB) Sep 12 22:50:42.470048 kernel: sd 0:0:0:0: [sda] Write Protect is off Sep 12 22:50:42.470131 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00 Sep 12 22:50:42.470217 kernel: sd 0:0:0:0: [sda] Cache data unavailable Sep 12 22:50:42.470310 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through Sep 12 22:50:42.465926 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 22:50:42.469915 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 22:50:42.471225 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 22:50:42.490657 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 12 22:50:42.490706 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Sep 12 22:50:42.493948 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 22:50:42.621668 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33 Sep 12 22:50:42.625655 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5 Sep 12 22:50:42.637163 kernel: AES CTR mode by8 optimization enabled Sep 12 22:50:42.660831 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray Sep 12 22:50:42.660986 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Sep 12 22:50:42.673647 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Sep 12 22:50:42.713508 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. 
Sep 12 22:50:42.774109 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT. Sep 12 22:50:42.783351 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM. Sep 12 22:50:42.820399 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A. Sep 12 22:50:42.820609 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A. Sep 12 22:50:42.821505 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 12 22:50:42.906659 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 12 22:50:43.246024 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 12 22:50:43.246424 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 12 22:50:43.246572 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 22:50:43.246802 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 12 22:50:43.247558 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 12 22:50:43.263094 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 12 22:50:43.988657 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 12 22:50:43.988787 disk-uuid[643]: The operation has completed successfully. Sep 12 22:50:44.119868 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 12 22:50:44.119940 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 12 22:50:44.120792 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 12 22:50:44.139199 sh[670]: Success Sep 12 22:50:44.161064 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Sep 12 22:50:44.161108 kernel: device-mapper: uevent: version 1.0.3 Sep 12 22:50:44.161139 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 12 22:50:44.169650 kernel: device-mapper: verity: sha256 using shash "sha256-avx2" Sep 12 22:50:44.287348 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 12 22:50:44.289692 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 12 22:50:44.304302 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 12 22:50:44.469677 kernel: BTRFS: device fsid 5d2ab445-1154-4e47-9d7e-ff4b81d84474 devid 1 transid 37 /dev/mapper/usr (254:0) scanned by mount (682) Sep 12 22:50:44.471787 kernel: BTRFS info (device dm-0): first mount of filesystem 5d2ab445-1154-4e47-9d7e-ff4b81d84474 Sep 12 22:50:44.471809 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 12 22:50:44.577272 kernel: BTRFS info (device dm-0): enabling ssd optimizations Sep 12 22:50:44.577338 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 12 22:50:44.577350 kernel: BTRFS info (device dm-0): enabling free space tree Sep 12 22:50:44.596146 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 12 22:50:44.596501 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 12 22:50:44.597437 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments... Sep 12 22:50:44.598707 systemd[1]: Starting ignition-setup.service - Ignition (setup)... 
Sep 12 22:50:44.711664 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (705) Sep 12 22:50:44.727931 kernel: BTRFS info (device sda6): first mount of filesystem fd5cdc72-255e-4ed2-8d25-c5e581a08827 Sep 12 22:50:44.727962 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 12 22:50:44.792137 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 12 22:50:44.792184 kernel: BTRFS info (device sda6): enabling free space tree Sep 12 22:50:44.795649 kernel: BTRFS info (device sda6): last unmount of filesystem fd5cdc72-255e-4ed2-8d25-c5e581a08827 Sep 12 22:50:44.796814 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 12 22:50:44.797610 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 12 22:50:44.906096 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Sep 12 22:50:44.907352 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 12 22:50:44.978356 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 12 22:50:44.979703 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 12 22:50:45.010077 systemd-networkd[858]: lo: Link UP Sep 12 22:50:45.010082 systemd-networkd[858]: lo: Gained carrier Sep 12 22:50:45.010853 systemd-networkd[858]: Enumeration completed Sep 12 22:50:45.011102 systemd-networkd[858]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network. Sep 12 22:50:45.011689 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 12 22:50:45.012069 systemd[1]: Reached target network.target - Network. 
Sep 12 22:50:45.014927 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Sep 12 22:50:45.015060 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Sep 12 22:50:45.015246 systemd-networkd[858]: ens192: Link UP Sep 12 22:50:45.015251 systemd-networkd[858]: ens192: Gained carrier Sep 12 22:50:45.117713 ignition[725]: Ignition 2.22.0 Sep 12 22:50:45.117722 ignition[725]: Stage: fetch-offline Sep 12 22:50:45.117742 ignition[725]: no configs at "/usr/lib/ignition/base.d" Sep 12 22:50:45.117747 ignition[725]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Sep 12 22:50:45.117805 ignition[725]: parsed url from cmdline: "" Sep 12 22:50:45.117808 ignition[725]: no config URL provided Sep 12 22:50:45.117812 ignition[725]: reading system config file "/usr/lib/ignition/user.ign" Sep 12 22:50:45.117820 ignition[725]: no config at "/usr/lib/ignition/user.ign" Sep 12 22:50:45.118293 ignition[725]: config successfully fetched Sep 12 22:50:45.118317 ignition[725]: parsing config with SHA512: 1536b76ae62f0f627c24dafcd19fba936420126722749ea60a12c6bb7c7a557c4b3ab8cfe473d3ce4ab46f16390cd3a9fffa9920f7f47ad26ac77c371f54c9c4 Sep 12 22:50:45.122469 unknown[725]: fetched base config from "system" Sep 12 22:50:45.122708 unknown[725]: fetched user config from "vmware" Sep 12 22:50:45.123023 ignition[725]: fetch-offline: fetch-offline passed Sep 12 22:50:45.123071 ignition[725]: Ignition finished successfully Sep 12 22:50:45.124399 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 12 22:50:45.124690 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Sep 12 22:50:45.125260 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... 
Sep 12 22:50:45.152971 ignition[869]: Ignition 2.22.0 Sep 12 22:50:45.152981 ignition[869]: Stage: kargs Sep 12 22:50:45.153067 ignition[869]: no configs at "/usr/lib/ignition/base.d" Sep 12 22:50:45.153072 ignition[869]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Sep 12 22:50:45.153672 ignition[869]: kargs: kargs passed Sep 12 22:50:45.153698 ignition[869]: Ignition finished successfully Sep 12 22:50:45.155304 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 12 22:50:45.156286 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Sep 12 22:50:45.177332 ignition[876]: Ignition 2.22.0 Sep 12 22:50:45.177603 ignition[876]: Stage: disks Sep 12 22:50:45.177715 ignition[876]: no configs at "/usr/lib/ignition/base.d" Sep 12 22:50:45.177721 ignition[876]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Sep 12 22:50:45.178377 ignition[876]: disks: disks passed Sep 12 22:50:45.178403 ignition[876]: Ignition finished successfully Sep 12 22:50:45.179274 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 12 22:50:45.179690 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 12 22:50:45.179824 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 12 22:50:45.180041 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 12 22:50:45.180242 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 22:50:45.180421 systemd[1]: Reached target basic.target - Basic System. Sep 12 22:50:45.181154 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 12 22:50:45.314239 systemd-fsck[884]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Sep 12 22:50:45.320235 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 12 22:50:45.321320 systemd[1]: Mounting sysroot.mount - /sysroot... 
Sep 12 22:50:45.589662 kernel: EXT4-fs (sda9): mounted filesystem d027afc5-396a-49bf-a5be-60ddd42cb089 r/w with ordered data mode. Quota mode: none. Sep 12 22:50:45.589874 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 12 22:50:45.590351 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 12 22:50:45.591662 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 12 22:50:45.592681 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 12 22:50:45.593218 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Sep 12 22:50:45.594742 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 12 22:50:45.595019 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 12 22:50:45.606510 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 12 22:50:45.608731 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 12 22:50:45.613620 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (892) Sep 12 22:50:45.613662 kernel: BTRFS info (device sda6): first mount of filesystem fd5cdc72-255e-4ed2-8d25-c5e581a08827 Sep 12 22:50:45.613677 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 12 22:50:45.623760 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 12 22:50:45.623810 kernel: BTRFS info (device sda6): enabling free space tree Sep 12 22:50:45.624736 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Sep 12 22:50:45.958677 initrd-setup-root[916]: cut: /sysroot/etc/passwd: No such file or directory Sep 12 22:50:45.961672 initrd-setup-root[923]: cut: /sysroot/etc/group: No such file or directory Sep 12 22:50:45.964914 initrd-setup-root[930]: cut: /sysroot/etc/shadow: No such file or directory Sep 12 22:50:45.973650 initrd-setup-root[937]: cut: /sysroot/etc/gshadow: No such file or directory Sep 12 22:50:46.342749 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 12 22:50:46.343604 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 12 22:50:46.344701 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 12 22:50:46.356607 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 12 22:50:46.358656 kernel: BTRFS info (device sda6): last unmount of filesystem fd5cdc72-255e-4ed2-8d25-c5e581a08827 Sep 12 22:50:46.374217 ignition[1005]: INFO : Ignition 2.22.0 Sep 12 22:50:46.374217 ignition[1005]: INFO : Stage: mount Sep 12 22:50:46.374581 ignition[1005]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 22:50:46.374581 ignition[1005]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Sep 12 22:50:46.374846 ignition[1005]: INFO : mount: mount passed Sep 12 22:50:46.375452 ignition[1005]: INFO : Ignition finished successfully Sep 12 22:50:46.375678 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 12 22:50:46.376576 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 12 22:50:46.391460 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 12 22:50:46.491656 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1012) Sep 12 22:50:46.496918 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Sep 12 22:50:46.502037 kernel: BTRFS info (device sda6): first mount of filesystem fd5cdc72-255e-4ed2-8d25-c5e581a08827 Sep 12 22:50:46.502074 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 12 22:50:46.541659 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 12 22:50:46.541712 kernel: BTRFS info (device sda6): enabling free space tree Sep 12 22:50:46.543266 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 12 22:50:46.565037 ignition[1032]: INFO : Ignition 2.22.0 Sep 12 22:50:46.565037 ignition[1032]: INFO : Stage: files Sep 12 22:50:46.565422 ignition[1032]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 22:50:46.565422 ignition[1032]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Sep 12 22:50:46.565750 ignition[1032]: DEBUG : files: compiled without relabeling support, skipping Sep 12 22:50:46.574687 ignition[1032]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 12 22:50:46.574687 ignition[1032]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 12 22:50:46.602073 ignition[1032]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 12 22:50:46.602305 ignition[1032]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 12 22:50:46.602472 ignition[1032]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 12 22:50:46.602326 unknown[1032]: wrote ssh authorized keys file for user: core Sep 12 22:50:46.653593 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Sep 12 22:50:46.653998 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Sep 12 22:50:46.746646 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Sep 12 22:50:46.846844 systemd-networkd[858]: ens192: Gained IPv6LL Sep 12 22:50:47.394560 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Sep 12 22:50:47.394560 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 12 22:50:47.394560 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Sep 12 22:50:47.394560 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 12 22:50:47.394560 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 12 22:50:47.394560 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 12 22:50:47.394560 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 12 22:50:47.394560 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 12 22:50:47.394560 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 12 22:50:47.411237 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 12 22:50:47.411570 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 12 22:50:47.411570 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Sep 12 22:50:47.428551 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 12 22:50:47.428551 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 12 22:50:47.429160 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-x86-64.raw: attempt #1 Sep 12 22:50:47.952029 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 12 22:50:48.297144 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 12 22:50:48.297144 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network" Sep 12 22:50:48.361252 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network" Sep 12 22:50:48.361252 ignition[1032]: INFO : files: op(c): [started] processing unit "prepare-helm.service" Sep 12 22:50:48.361748 ignition[1032]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 12 22:50:48.364832 ignition[1032]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 12 22:50:48.364832 ignition[1032]: INFO : files: op(c): [finished] processing unit "prepare-helm.service" Sep 12 22:50:48.364832 ignition[1032]: INFO : files: op(e): [started] processing unit "coreos-metadata.service" Sep 12 22:50:48.364832 ignition[1032]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Sep 12 22:50:48.364832 ignition[1032]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Sep 12 22:50:48.364832 ignition[1032]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service" Sep 12 22:50:48.364832 ignition[1032]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service" Sep 12 22:50:49.053365 ignition[1032]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service" Sep 12 22:50:49.056185 ignition[1032]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service" Sep 12 22:50:49.056369 ignition[1032]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service" Sep 12 22:50:49.056369 ignition[1032]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service" Sep 12 22:50:49.056369 ignition[1032]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service" Sep 12 22:50:49.057691 ignition[1032]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 12 22:50:49.057691 ignition[1032]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 12 22:50:49.057691 ignition[1032]: INFO : files: files passed Sep 12 22:50:49.057691 ignition[1032]: INFO : Ignition finished successfully Sep 12 22:50:49.058404 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 12 22:50:49.059623 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 12 22:50:49.060723 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 12 22:50:49.073881 initrd-setup-root-after-ignition[1062]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 12 22:50:49.073881 initrd-setup-root-after-ignition[1062]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 12 22:50:49.075174 initrd-setup-root-after-ignition[1066]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 12 22:50:49.076144 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 12 22:50:49.076596 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 12 22:50:49.077411 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 12 22:50:49.118018 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 12 22:50:49.118093 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 12 22:50:49.118379 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 12 22:50:49.118533 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 12 22:50:49.120545 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 12 22:50:49.120609 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 12 22:50:49.120976 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 12 22:50:49.121607 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 12 22:50:49.133932 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 12 22:50:49.134864 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 12 22:50:49.151186 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 12 22:50:49.151647 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. 
Sep 12 22:50:49.152075 systemd[1]: Stopped target timers.target - Timer Units. Sep 12 22:50:49.152448 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 12 22:50:49.152699 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 12 22:50:49.153236 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 12 22:50:49.153626 systemd[1]: Stopped target basic.target - Basic System. Sep 12 22:50:49.153972 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 12 22:50:49.154314 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 12 22:50:49.154677 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 12 22:50:49.154994 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 12 22:50:49.155361 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 12 22:50:49.155726 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 12 22:50:49.156071 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 12 22:50:49.156435 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 12 22:50:49.156774 systemd[1]: Stopped target swap.target - Swaps. Sep 12 22:50:49.157060 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 12 22:50:49.157266 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 12 22:50:49.157727 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 12 22:50:49.158079 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 22:50:49.158415 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 12 22:50:49.158621 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 22:50:49.158940 systemd[1]: dracut-initqueue.service: Deactivated successfully. 
Sep 12 22:50:49.159007 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 12 22:50:49.159502 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 12 22:50:49.159579 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 12 22:50:49.160137 systemd[1]: Stopped target paths.target - Path Units. Sep 12 22:50:49.160295 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 12 22:50:49.163657 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 22:50:49.163845 systemd[1]: Stopped target slices.target - Slice Units. Sep 12 22:50:49.164004 systemd[1]: Stopped target sockets.target - Socket Units. Sep 12 22:50:49.164167 systemd[1]: iscsid.socket: Deactivated successfully. Sep 12 22:50:49.164230 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 12 22:50:49.164395 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 12 22:50:49.164460 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 12 22:50:49.164666 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 12 22:50:49.164752 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 12 22:50:49.165000 systemd[1]: ignition-files.service: Deactivated successfully. Sep 12 22:50:49.165068 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 12 22:50:49.166740 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 12 22:50:49.167304 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 12 22:50:49.167436 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 12 22:50:49.167519 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 22:50:49.167711 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. 
Sep 12 22:50:49.167768 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 12 22:50:49.171992 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 12 22:50:49.179692 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 12 22:50:49.189025 ignition[1089]: INFO : Ignition 2.22.0 Sep 12 22:50:49.189025 ignition[1089]: INFO : Stage: umount Sep 12 22:50:49.189391 ignition[1089]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 22:50:49.189391 ignition[1089]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Sep 12 22:50:49.189673 ignition[1089]: INFO : umount: umount passed Sep 12 22:50:49.189673 ignition[1089]: INFO : Ignition finished successfully Sep 12 22:50:49.190492 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 12 22:50:49.190579 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 12 22:50:49.190826 systemd[1]: Stopped target network.target - Network. Sep 12 22:50:49.190940 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 12 22:50:49.190969 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 12 22:50:49.191111 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 12 22:50:49.191132 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 12 22:50:49.191290 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 12 22:50:49.191311 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 12 22:50:49.191460 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 12 22:50:49.191480 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 12 22:50:49.191749 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 12 22:50:49.192073 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 12 22:50:49.196834 systemd[1]: systemd-resolved.service: Deactivated successfully. 
Sep 12 22:50:49.196923 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 12 22:50:49.198460 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 12 22:50:49.198631 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 12 22:50:49.199284 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 22:50:49.200184 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Sep 12 22:50:49.200329 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 12 22:50:49.200384 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 12 22:50:49.201369 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Sep 12 22:50:49.201566 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 12 22:50:49.201804 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 12 22:50:49.201823 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 12 22:50:49.202461 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 12 22:50:49.202563 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 12 22:50:49.202594 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 12 22:50:49.202744 systemd[1]: afterburn-network-kargs.service: Deactivated successfully. Sep 12 22:50:49.202766 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Sep 12 22:50:49.202888 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 12 22:50:49.202909 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 12 22:50:49.203063 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 12 22:50:49.203084 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. 
Sep 12 22:50:49.203214 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 22:50:49.204343 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Sep 12 22:50:49.212392 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 12 22:50:49.212785 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 22:50:49.213065 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 12 22:50:49.213090 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 12 22:50:49.213210 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 12 22:50:49.213226 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 22:50:49.213338 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 12 22:50:49.213362 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 12 22:50:49.213535 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 12 22:50:49.213559 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 12 22:50:49.213817 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 12 22:50:49.213839 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 12 22:50:49.214715 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 12 22:50:49.214908 systemd[1]: systemd-network-generator.service: Deactivated successfully. Sep 12 22:50:49.214936 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 22:50:49.215915 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 12 22:50:49.215944 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 22:50:49.216229 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. 
Sep 12 22:50:49.216255 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 12 22:50:49.216432 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 12 22:50:49.216456 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 22:50:49.217850 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 22:50:49.217877 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 22:50:49.219083 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Sep 12 22:50:49.219127 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully. Sep 12 22:50:49.219149 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Sep 12 22:50:49.219171 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 12 22:50:49.228292 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 12 22:50:49.228556 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 12 22:50:49.228994 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 12 22:50:49.229196 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 12 22:50:49.615227 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 12 22:50:49.615303 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 12 22:50:49.615908 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 12 22:50:49.616209 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 12 22:50:49.616258 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 12 22:50:49.617115 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 12 22:50:49.636256 systemd[1]: Switching root. 
Sep 12 22:50:49.675644 systemd-journald[244]: Received SIGTERM from PID 1 (systemd). Sep 12 22:50:49.675700 systemd-journald[244]: Journal stopped Sep 12 22:50:53.589654 kernel: SELinux: policy capability network_peer_controls=1 Sep 12 22:50:53.589674 kernel: SELinux: policy capability open_perms=1 Sep 12 22:50:53.589681 kernel: SELinux: policy capability extended_socket_class=1 Sep 12 22:50:53.589687 kernel: SELinux: policy capability always_check_network=0 Sep 12 22:50:53.589692 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 12 22:50:53.589697 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 12 22:50:53.589705 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 12 22:50:53.589710 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 12 22:50:53.589716 kernel: SELinux: policy capability userspace_initial_context=0 Sep 12 22:50:53.589722 kernel: audit: type=1403 audit(1757717451.019:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 12 22:50:53.589730 systemd[1]: Successfully loaded SELinux policy in 57.828ms. Sep 12 22:50:53.589737 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.508ms. Sep 12 22:50:53.589745 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 12 22:50:53.589751 systemd[1]: Detected virtualization vmware. Sep 12 22:50:53.589758 systemd[1]: Detected architecture x86-64. Sep 12 22:50:53.589764 systemd[1]: Detected first boot. Sep 12 22:50:53.589772 systemd[1]: Initializing machine ID from random generator. Sep 12 22:50:53.589779 zram_generator::config[1133]: No configuration found. 
Sep 12 22:50:53.589860 kernel: vmw_vmci 0000:00:07.7: Using capabilities 0xc Sep 12 22:50:53.589871 kernel: Guest personality initialized and is active Sep 12 22:50:53.589877 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Sep 12 22:50:53.589883 kernel: Initialized host personality Sep 12 22:50:53.589889 kernel: NET: Registered PF_VSOCK protocol family Sep 12 22:50:53.589897 systemd[1]: Populated /etc with preset unit settings. Sep 12 22:50:53.589904 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Sep 12 22:50:53.589911 systemd[1]: COREOS_CUSTOM_PUBLIC_IPV4=$(ip addr show ens192 | grep -v "inet 10." | grep -Po "inet \K[\d.]+")" > ${OUTPUT}" Sep 12 22:50:53.589918 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Sep 12 22:50:53.589925 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 12 22:50:53.589931 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 12 22:50:53.589938 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 12 22:50:53.589947 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 12 22:50:53.589954 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 12 22:50:53.589960 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 12 22:50:53.589967 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 12 22:50:53.589974 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 12 22:50:53.589981 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 12 22:50:53.589987 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. 
Sep 12 22:50:53.589994 systemd[1]: Created slice user.slice - User and Session Slice. Sep 12 22:50:53.590002 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 22:50:53.590009 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 22:50:53.590017 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Sep 12 22:50:53.590024 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 12 22:50:53.590031 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 12 22:50:53.590038 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 12 22:50:53.590044 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Sep 12 22:50:53.590052 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 22:50:53.590059 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 12 22:50:53.590066 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 12 22:50:53.590072 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 12 22:50:53.590079 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 12 22:50:53.590086 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 12 22:50:53.590093 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 22:50:53.590103 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 12 22:50:53.590117 systemd[1]: Reached target slices.target - Slice Units. Sep 12 22:50:53.590128 systemd[1]: Reached target swap.target - Swaps. Sep 12 22:50:53.590156 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. 
Sep 12 22:50:53.590164 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 12 22:50:53.590171 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Sep 12 22:50:53.590179 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 12 22:50:53.590186 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 12 22:50:53.590193 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 22:50:53.590200 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 12 22:50:53.590207 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 12 22:50:53.590214 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 12 22:50:53.590221 systemd[1]: Mounting media.mount - External Media Directory... Sep 12 22:50:53.590228 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 22:50:53.590237 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 12 22:50:53.590244 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 12 22:50:53.590251 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 12 22:50:53.590258 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 12 22:50:53.590265 systemd[1]: Reached target machines.target - Containers. Sep 12 22:50:53.590274 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Sep 12 22:50:53.590281 systemd[1]: Starting ignition-delete-config.service - Ignition (delete config)... Sep 12 22:50:53.590288 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... 
Sep 12 22:50:53.590295 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Sep 12 22:50:53.590303 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 22:50:53.590311 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 12 22:50:53.590318 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 22:50:53.590325 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 12 22:50:53.590332 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 22:50:53.590339 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 12 22:50:53.590346 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 12 22:50:53.590353 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 12 22:50:53.590361 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 12 22:50:53.590368 systemd[1]: Stopped systemd-fsck-usr.service. Sep 12 22:50:53.590375 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 22:50:53.590382 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 12 22:50:53.590389 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 12 22:50:53.590396 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 12 22:50:53.590403 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 12 22:50:53.590410 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... 
Sep 12 22:50:53.590419 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 12 22:50:53.590427 systemd[1]: verity-setup.service: Deactivated successfully. Sep 12 22:50:53.590434 systemd[1]: Stopped verity-setup.service. Sep 12 22:50:53.590441 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 22:50:53.590448 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 12 22:50:53.590455 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 12 22:50:53.590462 systemd[1]: Mounted media.mount - External Media Directory. Sep 12 22:50:53.590469 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 12 22:50:53.590476 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 12 22:50:53.590484 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 12 22:50:53.590491 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 22:50:53.590498 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 22:50:53.590505 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 22:50:53.590512 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 22:50:53.590519 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 22:50:53.590526 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 12 22:50:53.590533 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 12 22:50:53.590541 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 12 22:50:53.590548 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... 
Sep 12 22:50:53.590556 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 12 22:50:53.590563 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 12 22:50:53.590570 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Sep 12 22:50:53.590577 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 12 22:50:53.590584 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 22:50:53.590591 kernel: fuse: init (API version 7.41) Sep 12 22:50:53.590599 kernel: loop: module loaded Sep 12 22:50:53.590618 systemd-journald[1216]: Collecting audit messages is disabled. Sep 12 22:50:53.590646 systemd-journald[1216]: Journal started Sep 12 22:50:53.590663 systemd-journald[1216]: Runtime Journal (/run/log/journal/20b8ecd0e7ae4cdca1953ec2b6c2a122) is 4.8M, max 38.8M, 34M free. Sep 12 22:50:53.297075 systemd[1]: Queued start job for default target multi-user.target. Sep 12 22:50:53.303558 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Sep 12 22:50:53.303802 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 12 22:50:53.591131 jq[1203]: true Sep 12 22:50:53.608117 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 12 22:50:53.608148 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 22:50:53.612703 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Sep 12 22:50:53.614678 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 12 22:50:53.617672 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... 
Sep 12 22:50:53.619650 systemd[1]: Started systemd-journald.service - Journal Service. Sep 12 22:50:53.621852 systemd[1]: modprobe@fuse.service: Deactivated successfully. Sep 12 22:50:53.621989 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 12 22:50:53.622800 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 22:50:53.622913 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 22:50:53.623208 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 12 22:50:53.623472 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 22:50:53.623758 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Sep 12 22:50:53.623935 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Sep 12 22:50:53.633089 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 12 22:50:53.634820 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 12 22:50:53.637730 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Sep 12 22:50:53.637881 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 22:50:53.645298 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 12 22:50:53.646349 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 12 22:50:53.653851 jq[1234]: true Sep 12 22:50:53.664136 systemd-journald[1216]: Time spent on flushing to /var/log/journal/20b8ecd0e7ae4cdca1953ec2b6c2a122 is 29.061ms for 1760 entries. Sep 12 22:50:53.664136 systemd-journald[1216]: System Journal (/var/log/journal/20b8ecd0e7ae4cdca1953ec2b6c2a122) is 8M, max 584.8M, 576.8M free. Sep 12 22:50:54.358931 systemd-journald[1216]: Received client request to flush runtime journal. 
Sep 12 22:50:54.358984 kernel: ACPI: bus type drm_connector registered Sep 12 22:50:54.359004 kernel: loop0: detected capacity change from 0 to 110984 Sep 12 22:50:54.359018 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 12 22:50:53.716233 ignition[1258]: Ignition 2.22.0 Sep 12 22:50:53.706415 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 12 22:50:53.716470 ignition[1258]: deleting config from guestinfo properties Sep 12 22:50:53.706580 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 12 22:50:53.918893 ignition[1258]: Successfully deleted config Sep 12 22:50:53.917714 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 22:50:53.920504 systemd[1]: Finished ignition-delete-config.service - Ignition (delete config). Sep 12 22:50:53.938244 systemd-tmpfiles[1233]: ACLs are not supported, ignoring. Sep 12 22:50:53.938255 systemd-tmpfiles[1233]: ACLs are not supported, ignoring. Sep 12 22:50:53.942777 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 12 22:50:53.949699 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 12 22:50:53.951667 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 12 22:50:53.952118 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 12 22:50:53.953929 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Sep 12 22:50:53.991945 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 12 22:50:53.993294 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 12 22:50:54.359665 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 12 22:50:54.460661 kernel: loop1: detected capacity change from 0 to 128016 Sep 12 22:50:54.507144 systemd[1]: Finished systemd-sysusers.service - Create System Users. 
Sep 12 22:50:54.510430 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 12 22:50:54.511303 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 12 22:50:54.512429 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Sep 12 22:50:54.538968 systemd-tmpfiles[1303]: ACLs are not supported, ignoring. Sep 12 22:50:54.538982 systemd-tmpfiles[1303]: ACLs are not supported, ignoring. Sep 12 22:50:54.541662 kernel: loop2: detected capacity change from 0 to 221472 Sep 12 22:50:54.543418 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 22:50:54.703872 kernel: loop3: detected capacity change from 0 to 2960 Sep 12 22:50:54.880654 kernel: loop4: detected capacity change from 0 to 110984 Sep 12 22:50:55.159657 kernel: loop5: detected capacity change from 0 to 128016 Sep 12 22:50:55.247930 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 12 22:50:55.249317 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 22:50:55.269299 systemd-udevd[1312]: Using default interface naming scheme 'v255'. Sep 12 22:50:55.377659 kernel: loop6: detected capacity change from 0 to 221472 Sep 12 22:50:55.595961 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 22:50:55.599791 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 12 22:50:55.618902 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 12 22:50:55.627654 kernel: loop7: detected capacity change from 0 to 2960 Sep 12 22:50:55.680929 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 12 22:50:55.690087 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. 
Sep 12 22:50:55.742659 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Sep 12 22:50:55.746659 kernel: ACPI: button: Power Button [PWRF] Sep 12 22:50:55.750669 kernel: mousedev: PS/2 mouse device common for all mice Sep 12 22:50:55.794729 systemd-networkd[1321]: lo: Link UP Sep 12 22:50:55.794735 systemd-networkd[1321]: lo: Gained carrier Sep 12 22:50:55.795562 systemd-networkd[1321]: Enumeration completed Sep 12 22:50:55.795626 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 12 22:50:55.796680 systemd-networkd[1321]: ens192: Configuring with /etc/systemd/network/00-vmware.network. Sep 12 22:50:55.797213 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Sep 12 22:50:55.802817 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Sep 12 22:50:55.802968 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Sep 12 22:50:55.800375 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 12 22:50:55.802998 systemd-networkd[1321]: ens192: Link UP Sep 12 22:50:55.803088 systemd-networkd[1321]: ens192: Gained carrier Sep 12 22:50:55.834184 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Sep 12 22:50:55.843357 (sd-merge)[1310]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-vmware'. Sep 12 22:50:55.843632 (sd-merge)[1310]: Merged extensions into '/usr'. Sep 12 22:50:55.844651 kernel: piix4_smbus 0000:00:07.3: SMBus Host Controller not enabled! Sep 12 22:50:55.857420 systemd[1]: Reload requested from client PID 1232 ('systemd-sysext') (unit systemd-sysext.service)... Sep 12 22:50:55.857430 systemd[1]: Reloading... Sep 12 22:50:55.922650 zram_generator::config[1399]: No configuration found. 
Sep 12 22:50:55.955400 (udev-worker)[1329]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte. Sep 12 22:50:56.044693 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Sep 12 22:50:56.103562 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Sep 12 22:50:56.103904 systemd[1]: Reloading finished in 246 ms. Sep 12 22:50:56.121657 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 12 22:50:56.137731 systemd[1]: Starting ensure-sysext.service... Sep 12 22:50:56.139709 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 12 22:50:56.141814 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 12 22:50:56.144661 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 22:50:56.168505 systemd[1]: Reload requested from client PID 1457 ('systemctl') (unit ensure-sysext.service)... Sep 12 22:50:56.168606 systemd[1]: Reloading... Sep 12 22:50:56.168987 systemd-tmpfiles[1459]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Sep 12 22:50:56.169016 systemd-tmpfiles[1459]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Sep 12 22:50:56.169196 systemd-tmpfiles[1459]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 12 22:50:56.169406 systemd-tmpfiles[1459]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 12 22:50:56.170531 systemd-tmpfiles[1459]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 12 22:50:56.170758 systemd-tmpfiles[1459]: ACLs are not supported, ignoring. 
Sep 12 22:50:56.170795 systemd-tmpfiles[1459]: ACLs are not supported, ignoring. Sep 12 22:50:56.203937 systemd-tmpfiles[1459]: Detected autofs mount point /boot during canonicalization of boot. Sep 12 22:50:56.203944 systemd-tmpfiles[1459]: Skipping /boot Sep 12 22:50:56.211755 systemd-tmpfiles[1459]: Detected autofs mount point /boot during canonicalization of boot. Sep 12 22:50:56.211763 systemd-tmpfiles[1459]: Skipping /boot Sep 12 22:50:56.219649 zram_generator::config[1496]: No configuration found. Sep 12 22:50:56.297754 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Sep 12 22:50:56.369846 systemd[1]: Reloading finished in 200 ms. Sep 12 22:50:56.384849 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 12 22:50:56.385167 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 22:50:56.385435 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 22:50:56.390866 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 12 22:50:56.407081 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 12 22:50:56.408276 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 12 22:50:56.412501 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 12 22:50:56.416826 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 12 22:50:56.418623 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 22:50:56.419388 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... 
Sep 12 22:50:56.420782 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 22:50:56.421423 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 22:50:56.421572 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 22:50:56.421642 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 22:50:56.421711 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 22:50:56.424571 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 22:50:56.424819 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 22:50:56.424877 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 22:50:56.424932 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 22:50:56.427892 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 22:50:56.430856 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 12 22:50:56.431050 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. 
Sep 12 22:50:56.431119 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 22:50:56.431215 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 22:50:56.439701 systemd[1]: Finished ensure-sysext.service. Sep 12 22:50:56.443927 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Sep 12 22:50:56.444568 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 12 22:50:56.445613 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 22:50:56.445749 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 22:50:56.445996 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 22:50:56.446098 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 22:50:56.448298 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 22:50:56.454484 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 22:50:56.454643 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 22:50:56.455112 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 22:50:56.461122 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 12 22:50:56.461785 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 12 22:50:56.487116 systemd-resolved[1560]: Positive Trust Anchors: Sep 12 22:50:56.487588 systemd-resolved[1560]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 12 22:50:56.487658 systemd-resolved[1560]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 12 22:50:56.495431 systemd-resolved[1560]: Defaulting to hostname 'linux'. Sep 12 22:50:56.496691 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 12 22:50:56.496888 systemd[1]: Reached target network.target - Network. Sep 12 22:50:56.496990 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 12 22:50:56.499214 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Sep 12 22:50:56.499457 systemd[1]: Reached target time-set.target - System Time Set. Sep 12 22:50:56.538876 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 12 22:50:56.539507 augenrules[1595]: No rules Sep 12 22:50:56.540160 systemd[1]: audit-rules.service: Deactivated successfully. Sep 12 22:50:56.540388 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 12 22:52:29.551235 systemd-timesyncd[1574]: Contacted time server 172.233.157.223:123 (0.flatcar.pool.ntp.org). Sep 12 22:52:29.551261 systemd-resolved[1560]: Clock change detected. Flushing caches. Sep 12 22:52:29.551275 systemd-timesyncd[1574]: Initial clock synchronization to Fri 2025-09-12 22:52:29.551161 UTC. Sep 12 22:52:30.051611 ldconfig[1229]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 12 22:52:30.054023 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 12 22:52:30.055291 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 12 22:52:30.069023 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 12 22:52:30.069506 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 12 22:52:30.072781 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 12 22:52:30.073003 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 22:52:30.073162 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 12 22:52:30.073282 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 12 22:52:30.073391 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Sep 12 22:52:30.073564 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 12 22:52:30.073699 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 12 22:52:30.073806 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 12 22:52:30.073910 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 12 22:52:30.073931 systemd[1]: Reached target paths.target - Path Units. Sep 12 22:52:30.074012 systemd[1]: Reached target timers.target - Timer Units. Sep 12 22:52:30.080839 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 12 22:52:30.081969 systemd[1]: Starting docker.socket - Docker Socket for the API... 
Sep 12 22:52:30.083673 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Sep 12 22:52:30.083919 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Sep 12 22:52:30.084089 systemd[1]: Reached target ssh-access.target - SSH Access Available. Sep 12 22:52:30.090323 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 12 22:52:30.090662 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 12 22:52:30.091259 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 12 22:52:30.091917 systemd[1]: Reached target sockets.target - Socket Units. Sep 12 22:52:30.092064 systemd[1]: Reached target basic.target - Basic System. Sep 12 22:52:30.092216 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 12 22:52:30.092240 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 12 22:52:30.093067 systemd[1]: Starting containerd.service - containerd container runtime... Sep 12 22:52:30.096152 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 12 22:52:30.097573 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 12 22:52:30.099125 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 12 22:52:30.100303 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 12 22:52:30.100681 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 12 22:52:30.106130 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Sep 12 22:52:30.106980 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... 
Sep 12 22:52:30.109072 jq[1610]: false Sep 12 22:52:30.109266 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 12 22:52:30.111185 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 12 22:52:30.117130 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 12 22:52:30.119380 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 12 22:52:30.119978 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Sep 12 22:52:30.120438 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 12 22:52:30.122743 systemd[1]: Starting update-engine.service - Update Engine... Sep 12 22:52:30.123706 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 12 22:52:30.126240 systemd[1]: Starting vgauthd.service - VGAuth Service for open-vm-tools... Sep 12 22:52:30.131514 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 12 22:52:30.131756 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 12 22:52:30.132064 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 12 22:52:30.135085 jq[1622]: true Sep 12 22:52:30.135466 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 12 22:52:30.135599 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. 
Sep 12 22:52:30.135899 google_oslogin_nss_cache[1612]: oslogin_cache_refresh[1612]: Refreshing passwd entry cache Sep 12 22:52:30.136079 oslogin_cache_refresh[1612]: Refreshing passwd entry cache Sep 12 22:52:30.142268 extend-filesystems[1611]: Found /dev/sda6 Sep 12 22:52:30.143500 google_oslogin_nss_cache[1612]: oslogin_cache_refresh[1612]: Failure getting users, quitting Sep 12 22:52:30.143500 google_oslogin_nss_cache[1612]: oslogin_cache_refresh[1612]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 12 22:52:30.143500 google_oslogin_nss_cache[1612]: oslogin_cache_refresh[1612]: Refreshing group entry cache Sep 12 22:52:30.142942 oslogin_cache_refresh[1612]: Failure getting users, quitting Sep 12 22:52:30.142959 oslogin_cache_refresh[1612]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 12 22:52:30.142981 oslogin_cache_refresh[1612]: Refreshing group entry cache Sep 12 22:52:30.147586 google_oslogin_nss_cache[1612]: oslogin_cache_refresh[1612]: Failure getting groups, quitting Sep 12 22:52:30.147586 google_oslogin_nss_cache[1612]: oslogin_cache_refresh[1612]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 12 22:52:30.147104 oslogin_cache_refresh[1612]: Failure getting groups, quitting Sep 12 22:52:30.147110 oslogin_cache_refresh[1612]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 12 22:52:30.147866 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Sep 12 22:52:30.149106 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Sep 12 22:52:30.150480 systemd[1]: motdgen.service: Deactivated successfully. Sep 12 22:52:30.150816 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 12 22:52:30.159344 systemd[1]: Started vgauthd.service - VGAuth Service for open-vm-tools. 
Sep 12 22:52:30.159421 (ntainerd)[1640]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 12 22:52:30.162117 systemd[1]: Starting vmtoolsd.service - Service for virtual machines hosted on VMware... Sep 12 22:52:30.165578 extend-filesystems[1611]: Found /dev/sda9 Sep 12 22:52:30.167402 extend-filesystems[1611]: Checking size of /dev/sda9 Sep 12 22:52:30.167817 jq[1634]: true Sep 12 22:52:30.168691 update_engine[1621]: I20250912 22:52:30.168476 1621 main.cc:92] Flatcar Update Engine starting Sep 12 22:52:30.188908 extend-filesystems[1611]: Old size kept for /dev/sda9 Sep 12 22:52:30.189685 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 12 22:52:30.191603 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 12 22:52:30.201554 systemd[1]: Started vmtoolsd.service - Service for virtual machines hosted on VMware. Sep 12 22:52:30.202360 tar[1628]: linux-amd64/helm Sep 12 22:52:30.220707 unknown[1648]: Pref_Init: Using '/etc/vmware-tools/vgauth.conf' as preferences filepath Sep 12 22:52:30.222153 unknown[1648]: Core dump limit set to -1 Sep 12 22:52:30.230127 dbus-daemon[1608]: [system] SELinux support is enabled Sep 12 22:52:30.230664 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 12 22:52:30.232672 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 12 22:52:30.232693 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 12 22:52:30.233018 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). 
Sep 12 22:52:30.233029 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 12 22:52:30.237862 systemd-logind[1620]: Watching system buttons on /dev/input/event2 (Power Button) Sep 12 22:52:30.237877 systemd-logind[1620]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Sep 12 22:52:30.239282 systemd-logind[1620]: New seat seat0. Sep 12 22:52:30.242213 systemd[1]: Started systemd-logind.service - User Login Management. Sep 12 22:52:30.246732 update_engine[1621]: I20250912 22:52:30.246618 1621 update_check_scheduler.cc:74] Next update check in 9m11s Sep 12 22:52:30.247595 systemd[1]: Started update-engine.service - Update Engine. Sep 12 22:52:30.258834 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 12 22:52:30.364565 bash[1676]: Updated "/home/core/.ssh/authorized_keys" Sep 12 22:52:30.365881 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 12 22:52:30.366591 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Sep 12 22:52:30.488485 locksmithd[1677]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 12 22:52:30.556390 tar[1628]: linux-amd64/LICENSE Sep 12 22:52:30.556390 tar[1628]: linux-amd64/README.md Sep 12 22:52:30.565929 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. 
Sep 12 22:52:30.589747 containerd[1640]: time="2025-09-12T22:52:30Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 12 22:52:30.590318 containerd[1640]: time="2025-09-12T22:52:30.590297061Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 12 22:52:30.599036 containerd[1640]: time="2025-09-12T22:52:30.599011311Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="454.111µs" Sep 12 22:52:30.599036 containerd[1640]: time="2025-09-12T22:52:30.599032033Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 12 22:52:30.599036 containerd[1640]: time="2025-09-12T22:52:30.599050584Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 12 22:52:30.600051 containerd[1640]: time="2025-09-12T22:52:30.599132813Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 12 22:52:30.600051 containerd[1640]: time="2025-09-12T22:52:30.599144064Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 12 22:52:30.600051 containerd[1640]: time="2025-09-12T22:52:30.599159392Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 12 22:52:30.600051 containerd[1640]: time="2025-09-12T22:52:30.599191554Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 12 22:52:30.600051 containerd[1640]: time="2025-09-12T22:52:30.599198354Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 12 22:52:30.600051 containerd[1640]: time="2025-09-12T22:52:30.599307237Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 12 22:52:30.600051 containerd[1640]: time="2025-09-12T22:52:30.599317190Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 12 22:52:30.600051 containerd[1640]: time="2025-09-12T22:52:30.599326968Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 12 22:52:30.600051 containerd[1640]: time="2025-09-12T22:52:30.599335185Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 12 22:52:30.600051 containerd[1640]: time="2025-09-12T22:52:30.599377505Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 12 22:52:30.600051 containerd[1640]: time="2025-09-12T22:52:30.599485523Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 12 22:52:30.600220 containerd[1640]: time="2025-09-12T22:52:30.599507506Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 12 22:52:30.600220 containerd[1640]: time="2025-09-12T22:52:30.599514039Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 12 22:52:30.600220 containerd[1640]: time="2025-09-12T22:52:30.600061952Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Sep 12 22:52:30.600220 containerd[1640]: time="2025-09-12T22:52:30.600207650Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 12 22:52:30.600272 containerd[1640]: time="2025-09-12T22:52:30.600253430Z" level=info msg="metadata content store policy set" policy=shared Sep 12 22:52:30.686191 systemd-networkd[1321]: ens192: Gained IPv6LL Sep 12 22:52:30.688095 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 12 22:52:30.688593 systemd[1]: Reached target network-online.target - Network is Online. Sep 12 22:52:30.689834 systemd[1]: Starting coreos-metadata.service - VMware metadata agent... Sep 12 22:52:30.702965 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 22:52:30.704218 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 12 22:52:30.708953 containerd[1640]: time="2025-09-12T22:52:30.708921456Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 12 22:52:30.709028 containerd[1640]: time="2025-09-12T22:52:30.708980339Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 12 22:52:30.709028 containerd[1640]: time="2025-09-12T22:52:30.708998151Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 12 22:52:30.709028 containerd[1640]: time="2025-09-12T22:52:30.709006979Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 12 22:52:30.709028 containerd[1640]: time="2025-09-12T22:52:30.709015677Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 12 22:52:30.709888 containerd[1640]: time="2025-09-12T22:52:30.709082966Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Sep 12 22:52:30.709888 containerd[1640]: time="2025-09-12T22:52:30.709115865Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 12 22:52:30.709888 containerd[1640]: time="2025-09-12T22:52:30.709135252Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 12 22:52:30.709888 containerd[1640]: time="2025-09-12T22:52:30.709148183Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 12 22:52:30.709888 containerd[1640]: time="2025-09-12T22:52:30.709160853Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 12 22:52:30.709888 containerd[1640]: time="2025-09-12T22:52:30.709170123Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 12 22:52:30.709888 containerd[1640]: time="2025-09-12T22:52:30.709181123Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 12 22:52:30.709888 containerd[1640]: time="2025-09-12T22:52:30.709266120Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 12 22:52:30.709888 containerd[1640]: time="2025-09-12T22:52:30.709283646Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 12 22:52:30.709888 containerd[1640]: time="2025-09-12T22:52:30.709302962Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 12 22:52:30.709888 containerd[1640]: time="2025-09-12T22:52:30.709315455Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 12 22:52:30.709888 containerd[1640]: time="2025-09-12T22:52:30.709326777Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 12 22:52:30.709888 containerd[1640]: time="2025-09-12T22:52:30.709338600Z"
level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 12 22:52:30.709888 containerd[1640]: time="2025-09-12T22:52:30.709346577Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 12 22:52:30.709888 containerd[1640]: time="2025-09-12T22:52:30.709352525Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 12 22:52:30.710210 containerd[1640]: time="2025-09-12T22:52:30.709361640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 12 22:52:30.710210 containerd[1640]: time="2025-09-12T22:52:30.709371329Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 12 22:52:30.710210 containerd[1640]: time="2025-09-12T22:52:30.709382421Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 12 22:52:30.710210 containerd[1640]: time="2025-09-12T22:52:30.709449165Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 12 22:52:30.710210 containerd[1640]: time="2025-09-12T22:52:30.709463616Z" level=info msg="Start snapshots syncer" Sep 12 22:52:30.710210 containerd[1640]: time="2025-09-12T22:52:30.709499936Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 12 22:52:30.710322 containerd[1640]: time="2025-09-12T22:52:30.709743152Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 12 22:52:30.710322 containerd[1640]: time="2025-09-12T22:52:30.709825796Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 12 22:52:30.710401 containerd[1640]: time="2025-09-12T22:52:30.709912311Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 12 22:52:30.710401 containerd[1640]: time="2025-09-12T22:52:30.710029951Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 12 22:52:30.710401 containerd[1640]: time="2025-09-12T22:52:30.710069715Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 12 22:52:30.710401 containerd[1640]: time="2025-09-12T22:52:30.710091059Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 12 22:52:30.710401 containerd[1640]: time="2025-09-12T22:52:30.710108432Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 12 22:52:30.710401 containerd[1640]: time="2025-09-12T22:52:30.710120343Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 12 22:52:30.710401 containerd[1640]: time="2025-09-12T22:52:30.710127849Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 12 22:52:30.710401 containerd[1640]: time="2025-09-12T22:52:30.710134177Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 12 22:52:30.710401 containerd[1640]: time="2025-09-12T22:52:30.710148144Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 12 22:52:30.710401 containerd[1640]: time="2025-09-12T22:52:30.710156189Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 12 22:52:30.710401 containerd[1640]: time="2025-09-12T22:52:30.710163219Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 12 22:52:30.710401 containerd[1640]: time="2025-09-12T22:52:30.710191206Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 12 22:52:30.710401 containerd[1640]: time="2025-09-12T22:52:30.710207885Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 12 22:52:30.710401 containerd[1640]: time="2025-09-12T22:52:30.710218086Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 12 22:52:30.710616 containerd[1640]: time="2025-09-12T22:52:30.710225050Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 12 22:52:30.710616 containerd[1640]: time="2025-09-12T22:52:30.710229552Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 12 22:52:30.710616 containerd[1640]: time="2025-09-12T22:52:30.710234934Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 12 22:52:30.710616 containerd[1640]: time="2025-09-12T22:52:30.710271032Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 12 22:52:30.710616 containerd[1640]: time="2025-09-12T22:52:30.710282081Z" level=info msg="runtime interface created" Sep 12 22:52:30.710616 containerd[1640]: time="2025-09-12T22:52:30.710285199Z" level=info msg="created NRI interface" Sep 12 22:52:30.710616 containerd[1640]: time="2025-09-12T22:52:30.710289786Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 12 22:52:30.710616 containerd[1640]: time="2025-09-12T22:52:30.710296878Z" level=info msg="Connect containerd service" Sep 12 22:52:30.710616 containerd[1640]: time="2025-09-12T22:52:30.710339097Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 12 22:52:30.713318 
sshd_keygen[1647]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 12 22:52:30.717098 containerd[1640]: time="2025-09-12T22:52:30.716977326Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 12 22:52:30.737278 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 12 22:52:30.739587 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 12 22:52:30.741427 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 12 22:52:30.754226 systemd[1]: issuegen.service: Deactivated successfully. Sep 12 22:52:30.754385 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 12 22:52:30.757182 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 12 22:52:30.770992 systemd[1]: coreos-metadata.service: Deactivated successfully. Sep 12 22:52:30.771217 systemd[1]: Finished coreos-metadata.service - VMware metadata agent. Sep 12 22:52:30.771742 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 12 22:52:30.779081 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 12 22:52:30.781210 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 12 22:52:30.783547 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Sep 12 22:52:30.783762 systemd[1]: Reached target getty.target - Login Prompts. 
Sep 12 22:52:30.871383 containerd[1640]: time="2025-09-12T22:52:30.871346657Z" level=info msg="Start subscribing containerd event" Sep 12 22:52:30.871460 containerd[1640]: time="2025-09-12T22:52:30.871390587Z" level=info msg="Start recovering state" Sep 12 22:52:30.871460 containerd[1640]: time="2025-09-12T22:52:30.871453807Z" level=info msg="Start event monitor" Sep 12 22:52:30.871518 containerd[1640]: time="2025-09-12T22:52:30.871464333Z" level=info msg="Start cni network conf syncer for default" Sep 12 22:52:30.871518 containerd[1640]: time="2025-09-12T22:52:30.871472945Z" level=info msg="Start streaming server" Sep 12 22:52:30.871518 containerd[1640]: time="2025-09-12T22:52:30.871484399Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 12 22:52:30.871518 containerd[1640]: time="2025-09-12T22:52:30.871490516Z" level=info msg="runtime interface starting up..." Sep 12 22:52:30.871518 containerd[1640]: time="2025-09-12T22:52:30.871494953Z" level=info msg="starting plugins..." Sep 12 22:52:30.871518 containerd[1640]: time="2025-09-12T22:52:30.871505134Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 12 22:52:30.871749 containerd[1640]: time="2025-09-12T22:52:30.871734372Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 12 22:52:30.871845 containerd[1640]: time="2025-09-12T22:52:30.871833412Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 12 22:52:30.871944 containerd[1640]: time="2025-09-12T22:52:30.871933824Z" level=info msg="containerd successfully booted in 0.282373s" Sep 12 22:52:30.872011 systemd[1]: Started containerd.service - containerd container runtime. Sep 12 22:52:32.258099 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 22:52:32.258546 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 12 22:52:32.259210 systemd[1]: Startup finished in 2.758s (kernel) + 9.418s (initrd) + 8.335s (userspace) = 20.513s. 
Sep 12 22:52:32.268298 (kubelet)[1805]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 22:52:32.307500 login[1792]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 12 22:52:32.308687 login[1795]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 12 22:52:32.318065 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 12 22:52:32.319348 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 12 22:52:32.321265 systemd-logind[1620]: New session 2 of user core. Sep 12 22:52:32.325351 systemd-logind[1620]: New session 1 of user core. Sep 12 22:52:32.336052 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 12 22:52:32.337803 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 12 22:52:32.352094 (systemd)[1812]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 12 22:52:32.354344 systemd-logind[1620]: New session c1 of user core. Sep 12 22:52:32.458542 systemd[1812]: Queued start job for default target default.target. Sep 12 22:52:32.462913 systemd[1812]: Created slice app.slice - User Application Slice. Sep 12 22:52:32.462937 systemd[1812]: Reached target paths.target - Paths. Sep 12 22:52:32.462963 systemd[1812]: Reached target timers.target - Timers. Sep 12 22:52:32.463750 systemd[1812]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 12 22:52:32.475498 systemd[1812]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 12 22:52:32.475676 systemd[1812]: Reached target sockets.target - Sockets. Sep 12 22:52:32.475754 systemd[1812]: Reached target basic.target - Basic System. Sep 12 22:52:32.475814 systemd[1812]: Reached target default.target - Main User Target. Sep 12 22:52:32.475864 systemd[1812]: Startup finished in 116ms. 
Sep 12 22:52:32.475981 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 12 22:52:32.477257 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 12 22:52:32.478311 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 12 22:52:32.847137 kubelet[1805]: E0912 22:52:32.847102 1805 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 22:52:32.848651 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 22:52:32.848794 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 22:52:32.849152 systemd[1]: kubelet.service: Consumed 696ms CPU time, 263.6M memory peak. Sep 12 22:52:43.099682 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 12 22:52:43.103980 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 22:52:43.476319 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 22:52:43.487416 (kubelet)[1855]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 22:52:43.534902 kubelet[1855]: E0912 22:52:43.534857 1855 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 22:52:43.538583 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 22:52:43.538754 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
Sep 12 22:52:43.539320 systemd[1]: kubelet.service: Consumed 134ms CPU time, 110.7M memory peak. Sep 12 22:52:53.704354 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 12 22:52:53.707150 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 22:52:54.147315 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 22:52:54.149949 (kubelet)[1870]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 22:52:54.202421 kubelet[1870]: E0912 22:52:54.202388 1870 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 22:52:54.204004 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 22:52:54.204109 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 22:52:54.204528 systemd[1]: kubelet.service: Consumed 94ms CPU time, 108.7M memory peak. Sep 12 22:53:00.572206 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 12 22:53:00.573131 systemd[1]: Started sshd@0-139.178.70.110:22-147.75.109.163:46170.service - OpenSSH per-connection server daemon (147.75.109.163:46170). Sep 12 22:53:00.630744 sshd[1878]: Accepted publickey for core from 147.75.109.163 port 46170 ssh2: RSA SHA256:tW2vbxPK/u/o8XyjCJxMeIgPoVf9em1FudQ+OeUFsoY Sep 12 22:53:00.631687 sshd-session[1878]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:53:00.636056 systemd-logind[1620]: New session 3 of user core. Sep 12 22:53:00.643245 systemd[1]: Started session-3.scope - Session 3 of User core. 
Sep 12 22:53:00.698276 systemd[1]: Started sshd@1-139.178.70.110:22-147.75.109.163:46182.service - OpenSSH per-connection server daemon (147.75.109.163:46182). Sep 12 22:53:00.742977 sshd[1884]: Accepted publickey for core from 147.75.109.163 port 46182 ssh2: RSA SHA256:tW2vbxPK/u/o8XyjCJxMeIgPoVf9em1FudQ+OeUFsoY Sep 12 22:53:00.743881 sshd-session[1884]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:53:00.747385 systemd-logind[1620]: New session 4 of user core. Sep 12 22:53:00.751235 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 12 22:53:00.801292 sshd[1887]: Connection closed by 147.75.109.163 port 46182 Sep 12 22:53:00.802257 sshd-session[1884]: pam_unix(sshd:session): session closed for user core Sep 12 22:53:00.811579 systemd[1]: sshd@1-139.178.70.110:22-147.75.109.163:46182.service: Deactivated successfully. Sep 12 22:53:00.812808 systemd[1]: session-4.scope: Deactivated successfully. Sep 12 22:53:00.813713 systemd-logind[1620]: Session 4 logged out. Waiting for processes to exit. Sep 12 22:53:00.815078 systemd[1]: Started sshd@2-139.178.70.110:22-147.75.109.163:46194.service - OpenSSH per-connection server daemon (147.75.109.163:46194). Sep 12 22:53:00.815685 systemd-logind[1620]: Removed session 4. Sep 12 22:53:00.855388 sshd[1893]: Accepted publickey for core from 147.75.109.163 port 46194 ssh2: RSA SHA256:tW2vbxPK/u/o8XyjCJxMeIgPoVf9em1FudQ+OeUFsoY Sep 12 22:53:00.855996 sshd-session[1893]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:53:00.858622 systemd-logind[1620]: New session 5 of user core. Sep 12 22:53:00.868169 systemd[1]: Started session-5.scope - Session 5 of User core. 
Sep 12 22:53:00.914558 sshd[1896]: Connection closed by 147.75.109.163 port 46194 Sep 12 22:53:00.914866 sshd-session[1893]: pam_unix(sshd:session): session closed for user core Sep 12 22:53:00.923600 systemd[1]: sshd@2-139.178.70.110:22-147.75.109.163:46194.service: Deactivated successfully. Sep 12 22:53:00.924678 systemd[1]: session-5.scope: Deactivated successfully. Sep 12 22:53:00.925891 systemd-logind[1620]: Session 5 logged out. Waiting for processes to exit. Sep 12 22:53:00.926477 systemd[1]: Started sshd@3-139.178.70.110:22-147.75.109.163:46210.service - OpenSSH per-connection server daemon (147.75.109.163:46210). Sep 12 22:53:00.927514 systemd-logind[1620]: Removed session 5. Sep 12 22:53:00.968929 sshd[1902]: Accepted publickey for core from 147.75.109.163 port 46210 ssh2: RSA SHA256:tW2vbxPK/u/o8XyjCJxMeIgPoVf9em1FudQ+OeUFsoY Sep 12 22:53:00.969700 sshd-session[1902]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:53:00.973324 systemd-logind[1620]: New session 6 of user core. Sep 12 22:53:00.984211 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 12 22:53:01.031778 sshd[1905]: Connection closed by 147.75.109.163 port 46210 Sep 12 22:53:01.032636 sshd-session[1902]: pam_unix(sshd:session): session closed for user core Sep 12 22:53:01.036196 systemd[1]: sshd@3-139.178.70.110:22-147.75.109.163:46210.service: Deactivated successfully. Sep 12 22:53:01.037001 systemd[1]: session-6.scope: Deactivated successfully. Sep 12 22:53:01.037743 systemd-logind[1620]: Session 6 logged out. Waiting for processes to exit. Sep 12 22:53:01.038626 systemd[1]: Started sshd@4-139.178.70.110:22-147.75.109.163:46214.service - OpenSSH per-connection server daemon (147.75.109.163:46214). Sep 12 22:53:01.040210 systemd-logind[1620]: Removed session 6. 
Sep 12 22:53:01.078551 sshd[1911]: Accepted publickey for core from 147.75.109.163 port 46214 ssh2: RSA SHA256:tW2vbxPK/u/o8XyjCJxMeIgPoVf9em1FudQ+OeUFsoY Sep 12 22:53:01.079316 sshd-session[1911]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:53:01.082202 systemd-logind[1620]: New session 7 of user core. Sep 12 22:53:01.089142 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 12 22:53:01.230576 sudo[1915]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 12 22:53:01.230783 sudo[1915]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 22:53:01.253565 sudo[1915]: pam_unix(sudo:session): session closed for user root Sep 12 22:53:01.255520 sshd[1914]: Connection closed by 147.75.109.163 port 46214 Sep 12 22:53:01.254876 sshd-session[1911]: pam_unix(sshd:session): session closed for user core Sep 12 22:53:01.265747 systemd[1]: sshd@4-139.178.70.110:22-147.75.109.163:46214.service: Deactivated successfully. Sep 12 22:53:01.267762 systemd[1]: session-7.scope: Deactivated successfully. Sep 12 22:53:01.268474 systemd-logind[1620]: Session 7 logged out. Waiting for processes to exit. Sep 12 22:53:01.270842 systemd[1]: Started sshd@5-139.178.70.110:22-147.75.109.163:46220.service - OpenSSH per-connection server daemon (147.75.109.163:46220). Sep 12 22:53:01.271838 systemd-logind[1620]: Removed session 7. Sep 12 22:53:01.311732 sshd[1921]: Accepted publickey for core from 147.75.109.163 port 46220 ssh2: RSA SHA256:tW2vbxPK/u/o8XyjCJxMeIgPoVf9em1FudQ+OeUFsoY Sep 12 22:53:01.312494 sshd-session[1921]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:53:01.315097 systemd-logind[1620]: New session 8 of user core. Sep 12 22:53:01.330149 systemd[1]: Started session-8.scope - Session 8 of User core. 
Sep 12 22:53:01.379154 sudo[1926]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 12 22:53:01.379666 sudo[1926]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 22:53:01.382965 sudo[1926]: pam_unix(sudo:session): session closed for user root Sep 12 22:53:01.387240 sudo[1925]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 12 22:53:01.387481 sudo[1925]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 22:53:01.395554 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 12 22:53:01.420632 augenrules[1948]: No rules Sep 12 22:53:01.421512 systemd[1]: audit-rules.service: Deactivated successfully. Sep 12 22:53:01.421811 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 12 22:53:01.422834 sudo[1925]: pam_unix(sudo:session): session closed for user root Sep 12 22:53:01.423910 sshd[1924]: Connection closed by 147.75.109.163 port 46220 Sep 12 22:53:01.424255 sshd-session[1921]: pam_unix(sshd:session): session closed for user core Sep 12 22:53:01.430572 systemd[1]: sshd@5-139.178.70.110:22-147.75.109.163:46220.service: Deactivated successfully. Sep 12 22:53:01.432385 systemd[1]: session-8.scope: Deactivated successfully. Sep 12 22:53:01.433673 systemd-logind[1620]: Session 8 logged out. Waiting for processes to exit. Sep 12 22:53:01.434630 systemd-logind[1620]: Removed session 8. Sep 12 22:53:01.435814 systemd[1]: Started sshd@6-139.178.70.110:22-147.75.109.163:46236.service - OpenSSH per-connection server daemon (147.75.109.163:46236). 
Sep 12 22:53:01.475352 sshd[1957]: Accepted publickey for core from 147.75.109.163 port 46236 ssh2: RSA SHA256:tW2vbxPK/u/o8XyjCJxMeIgPoVf9em1FudQ+OeUFsoY Sep 12 22:53:01.476149 sshd-session[1957]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:53:01.479196 systemd-logind[1620]: New session 9 of user core. Sep 12 22:53:01.489201 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 12 22:53:01.540463 sudo[1961]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 12 22:53:01.540664 sudo[1961]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 22:53:02.195983 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 12 22:53:02.201288 (dockerd)[1978]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 12 22:53:02.681998 dockerd[1978]: time="2025-09-12T22:53:02.681954021Z" level=info msg="Starting up" Sep 12 22:53:02.682505 dockerd[1978]: time="2025-09-12T22:53:02.682484358Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 12 22:53:02.690975 dockerd[1978]: time="2025-09-12T22:53:02.690944634Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Sep 12 22:53:02.767137 dockerd[1978]: time="2025-09-12T22:53:02.767105745Z" level=info msg="Loading containers: start." Sep 12 22:53:02.797052 kernel: Initializing XFRM netlink socket Sep 12 22:53:03.135771 systemd-networkd[1321]: docker0: Link UP Sep 12 22:53:03.137332 dockerd[1978]: time="2025-09-12T22:53:03.137309532Z" level=info msg="Loading containers: done." Sep 12 22:53:03.146005 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3556013507-merged.mount: Deactivated successfully. 
Sep 12 22:53:03.146925 dockerd[1978]: time="2025-09-12T22:53:03.146884967Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 12 22:53:03.146968 dockerd[1978]: time="2025-09-12T22:53:03.146942063Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Sep 12 22:53:03.147059 dockerd[1978]: time="2025-09-12T22:53:03.146988952Z" level=info msg="Initializing buildkit" Sep 12 22:53:03.157935 dockerd[1978]: time="2025-09-12T22:53:03.157904790Z" level=info msg="Completed buildkit initialization" Sep 12 22:53:03.162060 dockerd[1978]: time="2025-09-12T22:53:03.162015383Z" level=info msg="Daemon has completed initialization" Sep 12 22:53:03.162144 dockerd[1978]: time="2025-09-12T22:53:03.162058962Z" level=info msg="API listen on /run/docker.sock" Sep 12 22:53:03.162603 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 12 22:53:04.047919 containerd[1640]: time="2025-09-12T22:53:04.047884429Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.13\"" Sep 12 22:53:04.454522 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Sep 12 22:53:04.456584 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 22:53:04.779802 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 12 22:53:04.785238 (kubelet)[2194]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 12 22:53:04.807265 kubelet[2194]: E0912 22:53:04.807229 2194 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 12 22:53:04.808629 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 22:53:04.808774 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 12 22:53:04.809160 systemd[1]: kubelet.service: Consumed 99ms CPU time, 110.1M memory peak.
Sep 12 22:53:05.052240 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4172487027.mount: Deactivated successfully.
Sep 12 22:53:05.887989 containerd[1640]: time="2025-09-12T22:53:05.887912818Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:53:05.888349 containerd[1640]: time="2025-09-12T22:53:05.888334183Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.13: active requests=0, bytes read=28117124"
Sep 12 22:53:05.888793 containerd[1640]: time="2025-09-12T22:53:05.888780146Z" level=info msg="ImageCreate event name:\"sha256:368da3301bb03f4bef9f7dc2084f5fc5954b0ac1bf1e49ca502e3a7604011e54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:53:05.890796 containerd[1640]: time="2025-09-12T22:53:05.890195701Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:9abeb8a2d3e53e356d1f2e5d5dc2081cf28f23242651b0552c9e38f4a7ae960e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:53:05.890796 containerd[1640]: time="2025-09-12T22:53:05.890721259Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.13\" with image id \"sha256:368da3301bb03f4bef9f7dc2084f5fc5954b0ac1bf1e49ca502e3a7604011e54\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.13\", repo digest \"registry.k8s.io/kube-apiserver@sha256:9abeb8a2d3e53e356d1f2e5d5dc2081cf28f23242651b0552c9e38f4a7ae960e\", size \"28113723\" in 1.842538108s"
Sep 12 22:53:05.890796 containerd[1640]: time="2025-09-12T22:53:05.890738127Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.13\" returns image reference \"sha256:368da3301bb03f4bef9f7dc2084f5fc5954b0ac1bf1e49ca502e3a7604011e54\""
Sep 12 22:53:05.891246 containerd[1640]: time="2025-09-12T22:53:05.891233065Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.13\""
Sep 12 22:53:07.132068 containerd[1640]: time="2025-09-12T22:53:07.131889116Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:53:07.138259 containerd[1640]: time="2025-09-12T22:53:07.138230344Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.13: active requests=0, bytes read=24716632"
Sep 12 22:53:07.145944 containerd[1640]: time="2025-09-12T22:53:07.145912855Z" level=info msg="ImageCreate event name:\"sha256:cbd19105c6bcbedf394f51c8bb963def5195c300fc7d04bc39d48d14d23c0ff0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:53:07.159054 containerd[1640]: time="2025-09-12T22:53:07.158342866Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:facc91288697a288a691520949fe4eec40059ef065c89da8e10481d14e131b09\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:53:07.159054 containerd[1640]: time="2025-09-12T22:53:07.159009879Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.13\" with image id \"sha256:cbd19105c6bcbedf394f51c8bb963def5195c300fc7d04bc39d48d14d23c0ff0\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.13\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:facc91288697a288a691520949fe4eec40059ef065c89da8e10481d14e131b09\", size \"26351311\" in 1.267707843s"
Sep 12 22:53:07.159054 containerd[1640]: time="2025-09-12T22:53:07.159027594Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.13\" returns image reference \"sha256:cbd19105c6bcbedf394f51c8bb963def5195c300fc7d04bc39d48d14d23c0ff0\""
Sep 12 22:53:07.159653 containerd[1640]: time="2025-09-12T22:53:07.159639291Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.13\""
Sep 12 22:53:08.291357 containerd[1640]: time="2025-09-12T22:53:08.290752499Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:53:08.291357 containerd[1640]: time="2025-09-12T22:53:08.291172661Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.13: active requests=0, bytes read=18787698"
Sep 12 22:53:08.291357 containerd[1640]: time="2025-09-12T22:53:08.291329120Z" level=info msg="ImageCreate event name:\"sha256:d019d989e2b1f0b08ea7eebd4dd7673bdd6ba2218a3c5a6bd53f6848d5fc1af6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:53:08.292761 containerd[1640]: time="2025-09-12T22:53:08.292745964Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:c5ce150dcce2419fdef9f9875fef43014355ccebf937846ed3a2971953f9b241\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:53:08.293331 containerd[1640]: time="2025-09-12T22:53:08.293318526Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.13\" with image id \"sha256:d019d989e2b1f0b08ea7eebd4dd7673bdd6ba2218a3c5a6bd53f6848d5fc1af6\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.13\", repo digest \"registry.k8s.io/kube-scheduler@sha256:c5ce150dcce2419fdef9f9875fef43014355ccebf937846ed3a2971953f9b241\", size \"20422395\" in 1.133601684s"
Sep 12 22:53:08.293374 containerd[1640]: time="2025-09-12T22:53:08.293367405Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.13\" returns image reference \"sha256:d019d989e2b1f0b08ea7eebd4dd7673bdd6ba2218a3c5a6bd53f6848d5fc1af6\""
Sep 12 22:53:08.293664 containerd[1640]: time="2025-09-12T22:53:08.293645159Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.13\""
Sep 12 22:53:09.232715 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1169379624.mount: Deactivated successfully.
Sep 12 22:53:09.674781 containerd[1640]: time="2025-09-12T22:53:09.674271804Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:53:09.679077 containerd[1640]: time="2025-09-12T22:53:09.679057420Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.13: active requests=0, bytes read=30410252"
Sep 12 22:53:09.686657 containerd[1640]: time="2025-09-12T22:53:09.686634275Z" level=info msg="ImageCreate event name:\"sha256:21d97a49eeb0b08ecaba421a84a79ca44cf2bc57773c085bbfda537488790ad7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:53:09.691249 containerd[1640]: time="2025-09-12T22:53:09.691227363Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:a39637326e88d128d38da6ff2b2ceb4e856475887bfcb5f7a55734d4f63d9fae\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:53:09.691800 containerd[1640]: time="2025-09-12T22:53:09.691769188Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.13\" with image id \"sha256:21d97a49eeb0b08ecaba421a84a79ca44cf2bc57773c085bbfda537488790ad7\", repo tag \"registry.k8s.io/kube-proxy:v1.31.13\", repo digest \"registry.k8s.io/kube-proxy@sha256:a39637326e88d128d38da6ff2b2ceb4e856475887bfcb5f7a55734d4f63d9fae\", size \"30409271\" in 1.398107538s"
Sep 12 22:53:09.691938 containerd[1640]: time="2025-09-12T22:53:09.691867513Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.13\" returns image reference \"sha256:21d97a49eeb0b08ecaba421a84a79ca44cf2bc57773c085bbfda537488790ad7\""
Sep 12 22:53:09.692265 containerd[1640]: time="2025-09-12T22:53:09.692234670Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Sep 12 22:53:10.117495 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4197978459.mount: Deactivated successfully.
Sep 12 22:53:10.826070 containerd[1640]: time="2025-09-12T22:53:10.825689199Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:53:10.828745 containerd[1640]: time="2025-09-12T22:53:10.828720617Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241"
Sep 12 22:53:10.834274 containerd[1640]: time="2025-09-12T22:53:10.834247124Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:53:10.842865 containerd[1640]: time="2025-09-12T22:53:10.842837683Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:53:10.843931 containerd[1640]: time="2025-09-12T22:53:10.843793130Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.151474724s"
Sep 12 22:53:10.843931 containerd[1640]: time="2025-09-12T22:53:10.843816671Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\""
Sep 12 22:53:10.844308 containerd[1640]: time="2025-09-12T22:53:10.844296230Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Sep 12 22:53:11.370631 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1251147495.mount: Deactivated successfully.
Sep 12 22:53:11.373049 containerd[1640]: time="2025-09-12T22:53:11.372611372Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 12 22:53:11.373049 containerd[1640]: time="2025-09-12T22:53:11.372973816Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138"
Sep 12 22:53:11.373049 containerd[1640]: time="2025-09-12T22:53:11.373024183Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 12 22:53:11.374029 containerd[1640]: time="2025-09-12T22:53:11.374017088Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 12 22:53:11.374440 containerd[1640]: time="2025-09-12T22:53:11.374425374Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 530.066861ms"
Sep 12 22:53:11.374468 containerd[1640]: time="2025-09-12T22:53:11.374442540Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
Sep 12 22:53:11.374803 containerd[1640]: time="2025-09-12T22:53:11.374756717Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\""
Sep 12 22:53:11.927365 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2841313371.mount: Deactivated successfully.
Sep 12 22:53:13.400935 containerd[1640]: time="2025-09-12T22:53:13.400903817Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:53:13.401713 containerd[1640]: time="2025-09-12T22:53:13.401697555Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56910709"
Sep 12 22:53:13.402011 containerd[1640]: time="2025-09-12T22:53:13.401996104Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:53:13.406367 containerd[1640]: time="2025-09-12T22:53:13.406341627Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:53:13.407208 containerd[1640]: time="2025-09-12T22:53:13.407127697Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 2.032320743s"
Sep 12 22:53:13.407208 containerd[1640]: time="2025-09-12T22:53:13.407146606Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\""
Sep 12 22:53:14.954348 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Sep 12 22:53:14.957145 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 22:53:15.206110 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 22:53:15.207576 (kubelet)[2411]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 12 22:53:15.258563 kubelet[2411]: E0912 22:53:15.258536 2411 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 12 22:53:15.260216 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 22:53:15.260299 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 12 22:53:15.260492 systemd[1]: kubelet.service: Consumed 91ms CPU time, 108.3M memory peak.
Sep 12 22:53:15.345396 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 22:53:15.345504 systemd[1]: kubelet.service: Consumed 91ms CPU time, 108.3M memory peak.
Sep 12 22:53:15.347098 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 22:53:15.362803 systemd[1]: Reload requested from client PID 2426 ('systemctl') (unit session-9.scope)...
Sep 12 22:53:15.362875 systemd[1]: Reloading...
Sep 12 22:53:15.420062 zram_generator::config[2469]: No configuration found.
Sep 12 22:53:15.495995 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Sep 12 22:53:15.563295 systemd[1]: Reloading finished in 200 ms.
Sep 12 22:53:15.578182 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Sep 12 22:53:15.578232 systemd[1]: kubelet.service: Failed with result 'signal'.
Sep 12 22:53:15.578424 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 22:53:15.579550 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 22:53:15.759336 update_engine[1621]: I20250912 22:53:15.759032 1621 update_attempter.cc:509] Updating boot flags...
Sep 12 22:53:16.100580 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 22:53:16.104499 (kubelet)[2553]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 12 22:53:16.147590 kubelet[2553]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 12 22:53:16.148885 kubelet[2553]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Sep 12 22:53:16.148885 kubelet[2553]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 12 22:53:16.152605 kubelet[2553]: I0912 22:53:16.151205 2553 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 12 22:53:16.508467 kubelet[2553]: I0912 22:53:16.508445 2553 server.go:491] "Kubelet version" kubeletVersion="v1.31.8"
Sep 12 22:53:16.508467 kubelet[2553]: I0912 22:53:16.508464 2553 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 12 22:53:16.508614 kubelet[2553]: I0912 22:53:16.508603 2553 server.go:934] "Client rotation is on, will bootstrap in background"
Sep 12 22:53:16.534063 kubelet[2553]: I0912 22:53:16.533749 2553 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 12 22:53:16.534491 kubelet[2553]: E0912 22:53:16.534470 2553 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://139.178.70.110:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.70.110:6443: connect: connection refused" logger="UnhandledError"
Sep 12 22:53:16.542689 kubelet[2553]: I0912 22:53:16.542587 2553 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 12 22:53:16.547025 kubelet[2553]: I0912 22:53:16.547008 2553 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 12 22:53:16.548400 kubelet[2553]: I0912 22:53:16.548385 2553 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 12 22:53:16.548485 kubelet[2553]: I0912 22:53:16.548463 2553 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 12 22:53:16.548593 kubelet[2553]: I0912 22:53:16.548484 2553 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 12 22:53:16.548661 kubelet[2553]: I0912 22:53:16.548600 2553 topology_manager.go:138] "Creating topology manager with none policy"
Sep 12 22:53:16.548661 kubelet[2553]: I0912 22:53:16.548608 2553 container_manager_linux.go:300] "Creating device plugin manager"
Sep 12 22:53:16.548695 kubelet[2553]: I0912 22:53:16.548667 2553 state_mem.go:36] "Initialized new in-memory state store"
Sep 12 22:53:16.550886 kubelet[2553]: I0912 22:53:16.550871 2553 kubelet.go:408] "Attempting to sync node with API server"
Sep 12 22:53:16.550886 kubelet[2553]: I0912 22:53:16.550885 2553 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 12 22:53:16.550935 kubelet[2553]: I0912 22:53:16.550915 2553 kubelet.go:314] "Adding apiserver pod source"
Sep 12 22:53:16.550960 kubelet[2553]: I0912 22:53:16.550933 2553 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 12 22:53:16.556713 kubelet[2553]: W0912 22:53:16.556686 2553 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.110:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.110:6443: connect: connection refused
Sep 12 22:53:16.556747 kubelet[2553]: E0912 22:53:16.556721 2553 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.110:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.110:6443: connect: connection refused" logger="UnhandledError"
Sep 12 22:53:16.556773 kubelet[2553]: I0912 22:53:16.556762 2553 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 12 22:53:16.556898 kubelet[2553]: W0912 22:53:16.556865 2553 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.110:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.110:6443: connect: connection refused
Sep 12 22:53:16.556945 kubelet[2553]: E0912 22:53:16.556936 2553 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.110:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.110:6443: connect: connection refused" logger="UnhandledError"
Sep 12 22:53:16.559096 kubelet[2553]: I0912 22:53:16.559086 2553 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 12 22:53:16.559124 kubelet[2553]: W0912 22:53:16.559120 2553 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 12 22:53:16.559427 kubelet[2553]: I0912 22:53:16.559415 2553 server.go:1274] "Started kubelet"
Sep 12 22:53:16.559929 kubelet[2553]: I0912 22:53:16.559916 2553 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Sep 12 22:53:16.560975 kubelet[2553]: I0912 22:53:16.560525 2553 server.go:449] "Adding debug handlers to kubelet server"
Sep 12 22:53:16.562272 kubelet[2553]: I0912 22:53:16.561785 2553 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 12 22:53:16.562272 kubelet[2553]: I0912 22:53:16.561908 2553 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 12 22:53:16.563057 kubelet[2553]: I0912 22:53:16.563050 2553 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 12 22:53:16.564023 kubelet[2553]: E0912 22:53:16.562008 2553 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.110:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.110:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.1864aad75e606bd0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-09-12 22:53:16.559403984 +0000 UTC m=+0.452501458,LastTimestamp:2025-09-12 22:53:16.559403984 +0000 UTC m=+0.452501458,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
Sep 12 22:53:16.564297 kubelet[2553]: I0912 22:53:16.564282 2553 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 12 22:53:16.568453 kubelet[2553]: E0912 22:53:16.568431 2553 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 12 22:53:16.568453 kubelet[2553]: I0912 22:53:16.568453 2553 volume_manager.go:289] "Starting Kubelet Volume Manager"
Sep 12 22:53:16.571852 kubelet[2553]: I0912 22:53:16.571841 2553 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Sep 12 22:53:16.571887 kubelet[2553]: I0912 22:53:16.571873 2553 reconciler.go:26] "Reconciler: start to sync state"
Sep 12 22:53:16.572069 kubelet[2553]: W0912 22:53:16.572046 2553 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.110:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.110:6443: connect: connection refused
Sep 12 22:53:16.572101 kubelet[2553]: E0912 22:53:16.572074 2553 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.110:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.110:6443: connect: connection refused" logger="UnhandledError"
Sep 12 22:53:16.572128 kubelet[2553]: E0912 22:53:16.572105 2553 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.110:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.110:6443: connect: connection refused" interval="200ms"
Sep 12 22:53:16.574142 kubelet[2553]: E0912 22:53:16.574133 2553 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 12 22:53:16.574985 kubelet[2553]: I0912 22:53:16.574975 2553 factory.go:221] Registration of the containerd container factory successfully
Sep 12 22:53:16.574985 kubelet[2553]: I0912 22:53:16.574983 2553 factory.go:221] Registration of the systemd container factory successfully
Sep 12 22:53:16.575079 kubelet[2553]: I0912 22:53:16.575060 2553 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 12 22:53:16.585051 kubelet[2553]: I0912 22:53:16.584675 2553 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 12 22:53:16.588208 kubelet[2553]: I0912 22:53:16.588197 2553 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 12 22:53:16.588281 kubelet[2553]: I0912 22:53:16.588275 2553 status_manager.go:217] "Starting to sync pod status with apiserver"
Sep 12 22:53:16.588327 kubelet[2553]: I0912 22:53:16.588315 2553 kubelet.go:2321] "Starting kubelet main sync loop"
Sep 12 22:53:16.588375 kubelet[2553]: E0912 22:53:16.588367 2553 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 12 22:53:16.589326 kubelet[2553]: W0912 22:53:16.589309 2553 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.110:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.110:6443: connect: connection refused
Sep 12 22:53:16.589360 kubelet[2553]: E0912 22:53:16.589331 2553 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.110:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.110:6443: connect: connection refused" logger="UnhandledError"
Sep 12 22:53:16.589388 kubelet[2553]: I0912 22:53:16.589378 2553 cpu_manager.go:214] "Starting CPU manager" policy="none"
Sep 12 22:53:16.589407 kubelet[2553]: I0912 22:53:16.589386 2553 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Sep 12 22:53:16.589407 kubelet[2553]: I0912 22:53:16.589396 2553 state_mem.go:36] "Initialized new in-memory state store"
Sep 12 22:53:16.590287 kubelet[2553]: I0912 22:53:16.590275 2553 policy_none.go:49] "None policy: Start"
Sep 12 22:53:16.590690 kubelet[2553]: I0912 22:53:16.590682 2553 memory_manager.go:170] "Starting memorymanager" policy="None"
Sep 12 22:53:16.590732 kubelet[2553]: I0912 22:53:16.590728 2553 state_mem.go:35] "Initializing new in-memory state store"
Sep 12 22:53:16.595627 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Sep 12 22:53:16.607628 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Sep 12 22:53:16.610255 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Sep 12 22:53:16.623828 kubelet[2553]: I0912 22:53:16.623766 2553 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Sep 12 22:53:16.623999 kubelet[2553]: I0912 22:53:16.623991 2553 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 12 22:53:16.624132 kubelet[2553]: I0912 22:53:16.624058 2553 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 12 22:53:16.624304 kubelet[2553]: I0912 22:53:16.624291 2553 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 12 22:53:16.626035 kubelet[2553]: E0912 22:53:16.626016 2553 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found"
Sep 12 22:53:16.696727 systemd[1]: Created slice kubepods-burstable-pod71d8bf7bd9b7c7432927bee9d50592b5.slice - libcontainer container kubepods-burstable-pod71d8bf7bd9b7c7432927bee9d50592b5.slice.
Sep 12 22:53:16.716184 systemd[1]: Created slice kubepods-burstable-podfe5e332fba00ba0b5b33a25fe2e8fd7b.slice - libcontainer container kubepods-burstable-podfe5e332fba00ba0b5b33a25fe2e8fd7b.slice.
Sep 12 22:53:16.719910 systemd[1]: Created slice kubepods-burstable-pode497fe38fdb9966078fcdebf768a8326.slice - libcontainer container kubepods-burstable-pode497fe38fdb9966078fcdebf768a8326.slice.
Sep 12 22:53:16.725264 kubelet[2553]: I0912 22:53:16.725215 2553 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Sep 12 22:53:16.725581 kubelet[2553]: E0912 22:53:16.725558 2553 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.110:6443/api/v1/nodes\": dial tcp 139.178.70.110:6443: connect: connection refused" node="localhost"
Sep 12 22:53:16.773270 kubelet[2553]: I0912 22:53:16.773052 2553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/71d8bf7bd9b7c7432927bee9d50592b5-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"71d8bf7bd9b7c7432927bee9d50592b5\") " pod="kube-system/kube-controller-manager-localhost"
Sep 12 22:53:16.773270 kubelet[2553]: I0912 22:53:16.773078 2553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/71d8bf7bd9b7c7432927bee9d50592b5-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"71d8bf7bd9b7c7432927bee9d50592b5\") " pod="kube-system/kube-controller-manager-localhost"
Sep 12 22:53:16.773270 kubelet[2553]: I0912 22:53:16.773091 2553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/71d8bf7bd9b7c7432927bee9d50592b5-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"71d8bf7bd9b7c7432927bee9d50592b5\") " pod="kube-system/kube-controller-manager-localhost"
Sep 12 22:53:16.773270 kubelet[2553]: I0912 22:53:16.773102 2553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/71d8bf7bd9b7c7432927bee9d50592b5-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"71d8bf7bd9b7c7432927bee9d50592b5\") " pod="kube-system/kube-controller-manager-localhost"
Sep 12 22:53:16.773270 kubelet[2553]: I0912 22:53:16.773117 2553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fe5e332fba00ba0b5b33a25fe2e8fd7b-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"fe5e332fba00ba0b5b33a25fe2e8fd7b\") " pod="kube-system/kube-scheduler-localhost"
Sep 12 22:53:16.773416 kubelet[2553]: I0912 22:53:16.773129 2553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e497fe38fdb9966078fcdebf768a8326-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"e497fe38fdb9966078fcdebf768a8326\") " pod="kube-system/kube-apiserver-localhost"
Sep 12 22:53:16.773416 kubelet[2553]: I0912 22:53:16.773140 2553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e497fe38fdb9966078fcdebf768a8326-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"e497fe38fdb9966078fcdebf768a8326\") " pod="kube-system/kube-apiserver-localhost"
Sep 12 22:53:16.773416 kubelet[2553]: I0912 22:53:16.773151 2553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e497fe38fdb9966078fcdebf768a8326-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"e497fe38fdb9966078fcdebf768a8326\") " pod="kube-system/kube-apiserver-localhost"
Sep 12 22:53:16.773416 kubelet[2553]: I0912 22:53:16.773162 2553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/71d8bf7bd9b7c7432927bee9d50592b5-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"71d8bf7bd9b7c7432927bee9d50592b5\") " pod="kube-system/kube-controller-manager-localhost"
Sep 12 22:53:16.774593 kubelet[2553]: E0912 22:53:16.774533 2553 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.110:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.110:6443: connect: connection refused" interval="400ms"
Sep 12 22:53:16.927261 kubelet[2553]: I0912 22:53:16.927236 2553 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Sep 12 22:53:16.927550 kubelet[2553]: E0912 22:53:16.927524 2553 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.110:6443/api/v1/nodes\": dial tcp 139.178.70.110:6443: connect: connection refused" node="localhost"
Sep 12 22:53:17.014070 containerd[1640]: time="2025-09-12T22:53:17.013766656Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:71d8bf7bd9b7c7432927bee9d50592b5,Namespace:kube-system,Attempt:0,}"
Sep 12 22:53:17.025088 containerd[1640]: time="2025-09-12T22:53:17.024764387Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:fe5e332fba00ba0b5b33a25fe2e8fd7b,Namespace:kube-system,Attempt:0,}"
Sep 12 22:53:17.025484 containerd[1640]: time="2025-09-12T22:53:17.025417630Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:e497fe38fdb9966078fcdebf768a8326,Namespace:kube-system,Attempt:0,}"
Sep 12 22:53:17.139395 containerd[1640]: time="2025-09-12T22:53:17.139330479Z" level=info msg="connecting to shim 11a493578a13d64e7c30b45aec5ead872b3bbc23cdb0864bd059596bc474be03" address="unix:///run/containerd/s/1cad4c1e5f8714a8875716a10c149ba8db6edfa1bbf94be2073cfbf4971ee5d0" namespace=k8s.io protocol=ttrpc version=3
Sep 12 22:53:17.140047 containerd[1640]: time="2025-09-12T22:53:17.139726288Z" level=info msg="connecting to shim bc808d9f309c0af50b916736783988db475a60d0a2dc3be0a8f00828ffbf5386" address="unix:///run/containerd/s/d69dac20fa0ac84c1ba8fa9a48c3bcb3a1b62425788c86fbacef6c7a5cb14f1d" namespace=k8s.io protocol=ttrpc version=3
Sep 12 22:53:17.150319 containerd[1640]: time="2025-09-12T22:53:17.150289098Z" level=info msg="connecting to shim 7542c7b406d8aa936ba0f2de9c68bf904a57263bc57fa8d47ff5fc0513b1a28b" address="unix:///run/containerd/s/b37c9fc8a8ab5b71e3bc31e1fd513d93842d018a5f7c8bc1e012d12cb1ed8bb6" namespace=k8s.io protocol=ttrpc version=3
Sep 12 22:53:17.175419 kubelet[2553]: E0912 22:53:17.175389 2553 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.110:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.110:6443: connect: connection refused" interval="800ms"
Sep 12 22:53:17.306220 systemd[1]: Started cri-containerd-7542c7b406d8aa936ba0f2de9c68bf904a57263bc57fa8d47ff5fc0513b1a28b.scope - libcontainer container 7542c7b406d8aa936ba0f2de9c68bf904a57263bc57fa8d47ff5fc0513b1a28b.
Sep 12 22:53:17.309823 systemd[1]: Started cri-containerd-11a493578a13d64e7c30b45aec5ead872b3bbc23cdb0864bd059596bc474be03.scope - libcontainer container 11a493578a13d64e7c30b45aec5ead872b3bbc23cdb0864bd059596bc474be03.
Sep 12 22:53:17.311795 systemd[1]: Started cri-containerd-bc808d9f309c0af50b916736783988db475a60d0a2dc3be0a8f00828ffbf5386.scope - libcontainer container bc808d9f309c0af50b916736783988db475a60d0a2dc3be0a8f00828ffbf5386.
Sep 12 22:53:17.329045 kubelet[2553]: I0912 22:53:17.329027 2553 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Sep 12 22:53:17.329251 kubelet[2553]: E0912 22:53:17.329231 2553 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.110:6443/api/v1/nodes\": dial tcp 139.178.70.110:6443: connect: connection refused" node="localhost" Sep 12 22:53:17.381895 containerd[1640]: time="2025-09-12T22:53:17.381811064Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:e497fe38fdb9966078fcdebf768a8326,Namespace:kube-system,Attempt:0,} returns sandbox id \"7542c7b406d8aa936ba0f2de9c68bf904a57263bc57fa8d47ff5fc0513b1a28b\"" Sep 12 22:53:17.383520 containerd[1640]: time="2025-09-12T22:53:17.383460317Z" level=info msg="CreateContainer within sandbox \"7542c7b406d8aa936ba0f2de9c68bf904a57263bc57fa8d47ff5fc0513b1a28b\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 12 22:53:17.385363 containerd[1640]: time="2025-09-12T22:53:17.385345610Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:71d8bf7bd9b7c7432927bee9d50592b5,Namespace:kube-system,Attempt:0,} returns sandbox id \"11a493578a13d64e7c30b45aec5ead872b3bbc23cdb0864bd059596bc474be03\"" Sep 12 22:53:17.386339 containerd[1640]: time="2025-09-12T22:53:17.386327302Z" level=info msg="CreateContainer within sandbox \"11a493578a13d64e7c30b45aec5ead872b3bbc23cdb0864bd059596bc474be03\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 12 22:53:17.404173 containerd[1640]: time="2025-09-12T22:53:17.404142837Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:fe5e332fba00ba0b5b33a25fe2e8fd7b,Namespace:kube-system,Attempt:0,} returns sandbox id \"bc808d9f309c0af50b916736783988db475a60d0a2dc3be0a8f00828ffbf5386\"" Sep 12 22:53:17.405198 containerd[1640]: time="2025-09-12T22:53:17.405182902Z" 
level=info msg="CreateContainer within sandbox \"bc808d9f309c0af50b916736783988db475a60d0a2dc3be0a8f00828ffbf5386\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 12 22:53:17.487676 containerd[1640]: time="2025-09-12T22:53:17.487552926Z" level=info msg="Container 8cec38a8da6258dcd66d8541f7b27b836bf0fa7eed9705de2eb4564fcf63542f: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:53:17.489735 containerd[1640]: time="2025-09-12T22:53:17.489416195Z" level=info msg="Container 5a6f88996062f24264cc8f0bfccb3d5a9f172fa999f66cc5822afac5eb913163: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:53:17.490261 containerd[1640]: time="2025-09-12T22:53:17.490246722Z" level=info msg="Container 35741667f9250c20e2dd320e1052f88469909e68ba3bd24fff658902dc4e31b0: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:53:17.498901 containerd[1640]: time="2025-09-12T22:53:17.498878392Z" level=info msg="CreateContainer within sandbox \"7542c7b406d8aa936ba0f2de9c68bf904a57263bc57fa8d47ff5fc0513b1a28b\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"8cec38a8da6258dcd66d8541f7b27b836bf0fa7eed9705de2eb4564fcf63542f\"" Sep 12 22:53:17.499529 containerd[1640]: time="2025-09-12T22:53:17.499513374Z" level=info msg="CreateContainer within sandbox \"11a493578a13d64e7c30b45aec5ead872b3bbc23cdb0864bd059596bc474be03\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"35741667f9250c20e2dd320e1052f88469909e68ba3bd24fff658902dc4e31b0\"" Sep 12 22:53:17.499641 containerd[1640]: time="2025-09-12T22:53:17.499565703Z" level=info msg="CreateContainer within sandbox \"bc808d9f309c0af50b916736783988db475a60d0a2dc3be0a8f00828ffbf5386\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"5a6f88996062f24264cc8f0bfccb3d5a9f172fa999f66cc5822afac5eb913163\"" Sep 12 22:53:17.499941 containerd[1640]: time="2025-09-12T22:53:17.499927877Z" level=info msg="StartContainer for 
\"8cec38a8da6258dcd66d8541f7b27b836bf0fa7eed9705de2eb4564fcf63542f\"" Sep 12 22:53:17.500065 containerd[1640]: time="2025-09-12T22:53:17.500030032Z" level=info msg="StartContainer for \"5a6f88996062f24264cc8f0bfccb3d5a9f172fa999f66cc5822afac5eb913163\"" Sep 12 22:53:17.500191 containerd[1640]: time="2025-09-12T22:53:17.499929222Z" level=info msg="StartContainer for \"35741667f9250c20e2dd320e1052f88469909e68ba3bd24fff658902dc4e31b0\"" Sep 12 22:53:17.500788 containerd[1640]: time="2025-09-12T22:53:17.500766103Z" level=info msg="connecting to shim 5a6f88996062f24264cc8f0bfccb3d5a9f172fa999f66cc5822afac5eb913163" address="unix:///run/containerd/s/d69dac20fa0ac84c1ba8fa9a48c3bcb3a1b62425788c86fbacef6c7a5cb14f1d" protocol=ttrpc version=3 Sep 12 22:53:17.501029 containerd[1640]: time="2025-09-12T22:53:17.500995261Z" level=info msg="connecting to shim 35741667f9250c20e2dd320e1052f88469909e68ba3bd24fff658902dc4e31b0" address="unix:///run/containerd/s/1cad4c1e5f8714a8875716a10c149ba8db6edfa1bbf94be2073cfbf4971ee5d0" protocol=ttrpc version=3 Sep 12 22:53:17.501826 containerd[1640]: time="2025-09-12T22:53:17.501805577Z" level=info msg="connecting to shim 8cec38a8da6258dcd66d8541f7b27b836bf0fa7eed9705de2eb4564fcf63542f" address="unix:///run/containerd/s/b37c9fc8a8ab5b71e3bc31e1fd513d93842d018a5f7c8bc1e012d12cb1ed8bb6" protocol=ttrpc version=3 Sep 12 22:53:17.522333 systemd[1]: Started cri-containerd-8cec38a8da6258dcd66d8541f7b27b836bf0fa7eed9705de2eb4564fcf63542f.scope - libcontainer container 8cec38a8da6258dcd66d8541f7b27b836bf0fa7eed9705de2eb4564fcf63542f. Sep 12 22:53:17.527167 systemd[1]: Started cri-containerd-35741667f9250c20e2dd320e1052f88469909e68ba3bd24fff658902dc4e31b0.scope - libcontainer container 35741667f9250c20e2dd320e1052f88469909e68ba3bd24fff658902dc4e31b0. 
Sep 12 22:53:17.529600 systemd[1]: Started cri-containerd-5a6f88996062f24264cc8f0bfccb3d5a9f172fa999f66cc5822afac5eb913163.scope - libcontainer container 5a6f88996062f24264cc8f0bfccb3d5a9f172fa999f66cc5822afac5eb913163. Sep 12 22:53:17.570683 kubelet[2553]: W0912 22:53:17.570611 2553 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.110:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.110:6443: connect: connection refused Sep 12 22:53:17.570683 kubelet[2553]: E0912 22:53:17.570651 2553 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.110:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.110:6443: connect: connection refused" logger="UnhandledError" Sep 12 22:53:17.577893 containerd[1640]: time="2025-09-12T22:53:17.577870317Z" level=info msg="StartContainer for \"8cec38a8da6258dcd66d8541f7b27b836bf0fa7eed9705de2eb4564fcf63542f\" returns successfully" Sep 12 22:53:17.583176 containerd[1640]: time="2025-09-12T22:53:17.583155515Z" level=info msg="StartContainer for \"35741667f9250c20e2dd320e1052f88469909e68ba3bd24fff658902dc4e31b0\" returns successfully" Sep 12 22:53:17.606305 containerd[1640]: time="2025-09-12T22:53:17.606281787Z" level=info msg="StartContainer for \"5a6f88996062f24264cc8f0bfccb3d5a9f172fa999f66cc5822afac5eb913163\" returns successfully" Sep 12 22:53:17.687686 kubelet[2553]: W0912 22:53:17.687648 2553 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.110:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.110:6443: connect: connection refused Sep 12 22:53:17.687774 kubelet[2553]: E0912 22:53:17.687691 2553 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.110:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.110:6443: connect: connection refused" logger="UnhandledError" Sep 12 22:53:17.748334 kubelet[2553]: W0912 22:53:17.748295 2553 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.110:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.110:6443: connect: connection refused Sep 12 22:53:17.748424 kubelet[2553]: E0912 22:53:17.748341 2553 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.110:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.110:6443: connect: connection refused" logger="UnhandledError" Sep 12 22:53:17.751727 kubelet[2553]: W0912 22:53:17.751705 2553 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.110:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.110:6443: connect: connection refused Sep 12 22:53:17.751767 kubelet[2553]: E0912 22:53:17.751732 2553 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.110:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.110:6443: connect: connection refused" logger="UnhandledError" Sep 12 22:53:17.975688 kubelet[2553]: E0912 22:53:17.975661 2553 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.110:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 
139.178.70.110:6443: connect: connection refused" interval="1.6s" Sep 12 22:53:18.130273 kubelet[2553]: I0912 22:53:18.130098 2553 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Sep 12 22:53:18.130273 kubelet[2553]: E0912 22:53:18.130251 2553 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.110:6443/api/v1/nodes\": dial tcp 139.178.70.110:6443: connect: connection refused" node="localhost" Sep 12 22:53:19.286632 kubelet[2553]: E0912 22:53:19.286606 2553 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "localhost" not found Sep 12 22:53:19.557881 kubelet[2553]: I0912 22:53:19.557792 2553 apiserver.go:52] "Watching apiserver" Sep 12 22:53:19.572519 kubelet[2553]: I0912 22:53:19.572483 2553 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 12 22:53:19.578688 kubelet[2553]: E0912 22:53:19.578663 2553 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Sep 12 22:53:19.648335 kubelet[2553]: E0912 22:53:19.648311 2553 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "localhost" not found Sep 12 22:53:19.731917 kubelet[2553]: I0912 22:53:19.731878 2553 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Sep 12 22:53:19.737562 kubelet[2553]: I0912 22:53:19.737525 2553 kubelet_node_status.go:75] "Successfully registered node" node="localhost" Sep 12 22:53:20.594250 systemd[1]: Reload requested from client PID 2827 ('systemctl') (unit session-9.scope)... Sep 12 22:53:20.594262 systemd[1]: Reloading... Sep 12 22:53:20.653074 zram_generator::config[2870]: No configuration found. 
Sep 12 22:53:20.739236 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Sep 12 22:53:20.821591 systemd[1]: Reloading finished in 226 ms. Sep 12 22:53:20.839662 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 22:53:20.852566 systemd[1]: kubelet.service: Deactivated successfully. Sep 12 22:53:20.852955 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 22:53:20.853065 systemd[1]: kubelet.service: Consumed 588ms CPU time, 124.3M memory peak. Sep 12 22:53:20.854933 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 22:53:21.210393 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 22:53:21.220370 (kubelet)[2938]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 22:53:21.324052 kubelet[2938]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 22:53:21.324052 kubelet[2938]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 12 22:53:21.324052 kubelet[2938]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 12 22:53:21.324052 kubelet[2938]: I0912 22:53:21.323915 2938 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 12 22:53:21.337689 kubelet[2938]: I0912 22:53:21.337654 2938 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 12 22:53:21.337689 kubelet[2938]: I0912 22:53:21.337673 2938 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 12 22:53:21.337882 kubelet[2938]: I0912 22:53:21.337867 2938 server.go:934] "Client rotation is on, will bootstrap in background" Sep 12 22:53:21.338838 kubelet[2938]: I0912 22:53:21.338817 2938 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 12 22:53:21.571906 kubelet[2938]: I0912 22:53:21.571824 2938 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 22:53:21.576084 kubelet[2938]: I0912 22:53:21.576073 2938 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 12 22:53:21.578516 kubelet[2938]: I0912 22:53:21.578462 2938 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 12 22:53:21.578607 kubelet[2938]: I0912 22:53:21.578599 2938 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 12 22:53:21.579055 kubelet[2938]: I0912 22:53:21.578751 2938 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 12 22:53:21.579055 kubelet[2938]: I0912 22:53:21.578774 2938 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyO
ptions":null,"CgroupVersion":2} Sep 12 22:53:21.579055 kubelet[2938]: I0912 22:53:21.578900 2938 topology_manager.go:138] "Creating topology manager with none policy" Sep 12 22:53:21.579055 kubelet[2938]: I0912 22:53:21.578907 2938 container_manager_linux.go:300] "Creating device plugin manager" Sep 12 22:53:21.579206 kubelet[2938]: I0912 22:53:21.578927 2938 state_mem.go:36] "Initialized new in-memory state store" Sep 12 22:53:21.579206 kubelet[2938]: I0912 22:53:21.578997 2938 kubelet.go:408] "Attempting to sync node with API server" Sep 12 22:53:21.579206 kubelet[2938]: I0912 22:53:21.579007 2938 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 12 22:53:21.579206 kubelet[2938]: I0912 22:53:21.579031 2938 kubelet.go:314] "Adding apiserver pod source" Sep 12 22:53:21.579308 kubelet[2938]: I0912 22:53:21.579300 2938 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 12 22:53:21.590493 kubelet[2938]: I0912 22:53:21.590473 2938 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 12 22:53:21.591406 kubelet[2938]: I0912 22:53:21.590909 2938 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 12 22:53:21.591406 kubelet[2938]: I0912 22:53:21.591212 2938 server.go:1274] "Started kubelet" Sep 12 22:53:21.612103 kubelet[2938]: I0912 22:53:21.612077 2938 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 12 22:53:21.613333 kubelet[2938]: I0912 22:53:21.613312 2938 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 12 22:53:21.614187 kubelet[2938]: I0912 22:53:21.613057 2938 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 12 22:53:21.615924 kubelet[2938]: I0912 22:53:21.615912 2938 server.go:449] "Adding debug handlers to kubelet server" Sep 12 22:53:21.616625 
kubelet[2938]: I0912 22:53:21.616605 2938 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 12 22:53:21.616795 kubelet[2938]: I0912 22:53:21.616785 2938 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 12 22:53:21.616995 kubelet[2938]: I0912 22:53:21.616978 2938 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 12 22:53:21.618330 kubelet[2938]: I0912 22:53:21.618306 2938 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 12 22:53:21.618408 kubelet[2938]: I0912 22:53:21.618396 2938 reconciler.go:26] "Reconciler: start to sync state" Sep 12 22:53:21.620123 kubelet[2938]: I0912 22:53:21.620111 2938 factory.go:221] Registration of the containerd container factory successfully Sep 12 22:53:21.620290 kubelet[2938]: I0912 22:53:21.620178 2938 factory.go:221] Registration of the systemd container factory successfully Sep 12 22:53:21.620452 kubelet[2938]: I0912 22:53:21.620384 2938 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 12 22:53:21.620452 kubelet[2938]: E0912 22:53:21.620401 2938 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 12 22:53:21.643129 kubelet[2938]: I0912 22:53:21.643074 2938 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 12 22:53:21.645315 kubelet[2938]: I0912 22:53:21.645289 2938 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 12 22:53:21.645315 kubelet[2938]: I0912 22:53:21.645307 2938 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 12 22:53:21.645315 kubelet[2938]: I0912 22:53:21.645318 2938 kubelet.go:2321] "Starting kubelet main sync loop" Sep 12 22:53:21.645496 kubelet[2938]: E0912 22:53:21.645340 2938 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 22:53:21.668968 kubelet[2938]: I0912 22:53:21.668949 2938 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 12 22:53:21.668968 kubelet[2938]: I0912 22:53:21.668960 2938 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 12 22:53:21.668968 kubelet[2938]: I0912 22:53:21.668972 2938 state_mem.go:36] "Initialized new in-memory state store" Sep 12 22:53:21.669096 kubelet[2938]: I0912 22:53:21.669088 2938 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 12 22:53:21.669240 kubelet[2938]: I0912 22:53:21.669094 2938 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 12 22:53:21.669240 kubelet[2938]: I0912 22:53:21.669106 2938 policy_none.go:49] "None policy: Start" Sep 12 22:53:21.669485 kubelet[2938]: I0912 22:53:21.669454 2938 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 12 22:53:21.669676 kubelet[2938]: I0912 22:53:21.669559 2938 state_mem.go:35] "Initializing new in-memory state store" Sep 12 22:53:21.669676 kubelet[2938]: I0912 22:53:21.669641 2938 state_mem.go:75] "Updated machine memory state" Sep 12 22:53:21.672319 kubelet[2938]: I0912 22:53:21.672285 2938 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 12 22:53:21.672451 kubelet[2938]: I0912 22:53:21.672437 2938 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 22:53:21.672483 kubelet[2938]: I0912 22:53:21.672447 2938 container_log_manager.go:189] 
"Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 22:53:21.673022 kubelet[2938]: I0912 22:53:21.672621 2938 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 22:53:21.790660 kubelet[2938]: I0912 22:53:21.790636 2938 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Sep 12 22:53:21.821846 kubelet[2938]: I0912 22:53:21.821787 2938 kubelet_node_status.go:111] "Node was previously registered" node="localhost" Sep 12 22:53:21.822469 kubelet[2938]: I0912 22:53:21.822076 2938 kubelet_node_status.go:75] "Successfully registered node" node="localhost" Sep 12 22:53:21.920788 kubelet[2938]: I0912 22:53:21.920757 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/71d8bf7bd9b7c7432927bee9d50592b5-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"71d8bf7bd9b7c7432927bee9d50592b5\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 22:53:21.920788 kubelet[2938]: I0912 22:53:21.920782 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/71d8bf7bd9b7c7432927bee9d50592b5-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"71d8bf7bd9b7c7432927bee9d50592b5\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 22:53:21.920902 kubelet[2938]: I0912 22:53:21.920799 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fe5e332fba00ba0b5b33a25fe2e8fd7b-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"fe5e332fba00ba0b5b33a25fe2e8fd7b\") " pod="kube-system/kube-scheduler-localhost" Sep 12 22:53:21.920902 kubelet[2938]: I0912 22:53:21.920808 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: 
\"kubernetes.io/host-path/71d8bf7bd9b7c7432927bee9d50592b5-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"71d8bf7bd9b7c7432927bee9d50592b5\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 22:53:21.920902 kubelet[2938]: I0912 22:53:21.920819 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/71d8bf7bd9b7c7432927bee9d50592b5-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"71d8bf7bd9b7c7432927bee9d50592b5\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 22:53:21.920902 kubelet[2938]: I0912 22:53:21.920829 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/71d8bf7bd9b7c7432927bee9d50592b5-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"71d8bf7bd9b7c7432927bee9d50592b5\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 22:53:21.920902 kubelet[2938]: I0912 22:53:21.920838 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e497fe38fdb9966078fcdebf768a8326-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"e497fe38fdb9966078fcdebf768a8326\") " pod="kube-system/kube-apiserver-localhost" Sep 12 22:53:21.920985 kubelet[2938]: I0912 22:53:21.920845 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e497fe38fdb9966078fcdebf768a8326-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"e497fe38fdb9966078fcdebf768a8326\") " pod="kube-system/kube-apiserver-localhost" Sep 12 22:53:21.920985 kubelet[2938]: I0912 22:53:21.920855 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" 
(UniqueName: \"kubernetes.io/host-path/e497fe38fdb9966078fcdebf768a8326-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"e497fe38fdb9966078fcdebf768a8326\") " pod="kube-system/kube-apiserver-localhost" Sep 12 22:53:22.582000 kubelet[2938]: I0912 22:53:22.581855 2938 apiserver.go:52] "Watching apiserver" Sep 12 22:53:22.618849 kubelet[2938]: I0912 22:53:22.618795 2938 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 12 22:53:22.677383 kubelet[2938]: I0912 22:53:22.677343 2938 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.6773310750000001 podStartE2EDuration="1.677331075s" podCreationTimestamp="2025-09-12 22:53:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:53:22.672707084 +0000 UTC m=+1.405211660" watchObservedRunningTime="2025-09-12 22:53:22.677331075 +0000 UTC m=+1.409835645" Sep 12 22:53:22.683033 kubelet[2938]: I0912 22:53:22.682690 2938 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.68267842 podStartE2EDuration="1.68267842s" podCreationTimestamp="2025-09-12 22:53:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:53:22.678287584 +0000 UTC m=+1.410792166" watchObservedRunningTime="2025-09-12 22:53:22.68267842 +0000 UTC m=+1.415182993" Sep 12 22:53:22.683033 kubelet[2938]: I0912 22:53:22.682751 2938 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.6827475600000001 podStartE2EDuration="1.68274756s" podCreationTimestamp="2025-09-12 22:53:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:53:22.682720487 +0000 UTC m=+1.415225060" watchObservedRunningTime="2025-09-12 22:53:22.68274756 +0000 UTC m=+1.415252141" Sep 12 22:53:26.931623 kubelet[2938]: I0912 22:53:26.931555 2938 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 12 22:53:26.932067 kubelet[2938]: I0912 22:53:26.931996 2938 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 12 22:53:26.932103 containerd[1640]: time="2025-09-12T22:53:26.931871950Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 12 22:53:27.991066 systemd[1]: Created slice kubepods-besteffort-pod79903bf2_68c0_47a3_9a6f_7f17b867f7b6.slice - libcontainer container kubepods-besteffort-pod79903bf2_68c0_47a3_9a6f_7f17b867f7b6.slice. Sep 12 22:53:28.053578 systemd[1]: Created slice kubepods-besteffort-pod6b46909c_c7f7_4e95_ac20_0c323aee95f3.slice - libcontainer container kubepods-besteffort-pod6b46909c_c7f7_4e95_ac20_0c323aee95f3.slice. 
Sep 12 22:53:28.061327 kubelet[2938]: I0912 22:53:28.061303 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/79903bf2-68c0-47a3-9a6f-7f17b867f7b6-kube-proxy\") pod \"kube-proxy-557v8\" (UID: \"79903bf2-68c0-47a3-9a6f-7f17b867f7b6\") " pod="kube-system/kube-proxy-557v8" Sep 12 22:53:28.061955 kubelet[2938]: I0912 22:53:28.061750 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/79903bf2-68c0-47a3-9a6f-7f17b867f7b6-xtables-lock\") pod \"kube-proxy-557v8\" (UID: \"79903bf2-68c0-47a3-9a6f-7f17b867f7b6\") " pod="kube-system/kube-proxy-557v8" Sep 12 22:53:28.062112 kubelet[2938]: I0912 22:53:28.062021 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/79903bf2-68c0-47a3-9a6f-7f17b867f7b6-lib-modules\") pod \"kube-proxy-557v8\" (UID: \"79903bf2-68c0-47a3-9a6f-7f17b867f7b6\") " pod="kube-system/kube-proxy-557v8" Sep 12 22:53:28.062112 kubelet[2938]: I0912 22:53:28.062062 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/6b46909c-c7f7-4e95-ac20-0c323aee95f3-var-lib-calico\") pod \"tigera-operator-58fc44c59b-476xv\" (UID: \"6b46909c-c7f7-4e95-ac20-0c323aee95f3\") " pod="tigera-operator/tigera-operator-58fc44c59b-476xv" Sep 12 22:53:28.062112 kubelet[2938]: I0912 22:53:28.062076 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv5q5\" (UniqueName: \"kubernetes.io/projected/6b46909c-c7f7-4e95-ac20-0c323aee95f3-kube-api-access-pv5q5\") pod \"tigera-operator-58fc44c59b-476xv\" (UID: \"6b46909c-c7f7-4e95-ac20-0c323aee95f3\") " pod="tigera-operator/tigera-operator-58fc44c59b-476xv" Sep 12 
22:53:28.062112 kubelet[2938]: I0912 22:53:28.062090 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qxzq\" (UniqueName: \"kubernetes.io/projected/79903bf2-68c0-47a3-9a6f-7f17b867f7b6-kube-api-access-7qxzq\") pod \"kube-proxy-557v8\" (UID: \"79903bf2-68c0-47a3-9a6f-7f17b867f7b6\") " pod="kube-system/kube-proxy-557v8" Sep 12 22:53:28.302014 containerd[1640]: time="2025-09-12T22:53:28.301451019Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-557v8,Uid:79903bf2-68c0-47a3-9a6f-7f17b867f7b6,Namespace:kube-system,Attempt:0,}" Sep 12 22:53:28.319594 containerd[1640]: time="2025-09-12T22:53:28.319535037Z" level=info msg="connecting to shim 08389cb333f66ee98cc08e1153736172126123ec988d19fe4c7f231c0f0b5523" address="unix:///run/containerd/s/6af9e480c6402d414892c7bb07b4502cd565bff52beb4ba95cc68d5feda22e22" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:53:28.337146 systemd[1]: Started cri-containerd-08389cb333f66ee98cc08e1153736172126123ec988d19fe4c7f231c0f0b5523.scope - libcontainer container 08389cb333f66ee98cc08e1153736172126123ec988d19fe4c7f231c0f0b5523. 
Sep 12 22:53:28.353291 containerd[1640]: time="2025-09-12T22:53:28.353230896Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-557v8,Uid:79903bf2-68c0-47a3-9a6f-7f17b867f7b6,Namespace:kube-system,Attempt:0,} returns sandbox id \"08389cb333f66ee98cc08e1153736172126123ec988d19fe4c7f231c0f0b5523\"" Sep 12 22:53:28.355295 containerd[1640]: time="2025-09-12T22:53:28.355271937Z" level=info msg="CreateContainer within sandbox \"08389cb333f66ee98cc08e1153736172126123ec988d19fe4c7f231c0f0b5523\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 12 22:53:28.358647 containerd[1640]: time="2025-09-12T22:53:28.358540138Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-476xv,Uid:6b46909c-c7f7-4e95-ac20-0c323aee95f3,Namespace:tigera-operator,Attempt:0,}" Sep 12 22:53:28.366156 containerd[1640]: time="2025-09-12T22:53:28.366130044Z" level=info msg="Container bd9c6d789b19c82a289bd0900d0f2ade096a88075d8eead6d53231ca47f9613a: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:53:28.370713 containerd[1640]: time="2025-09-12T22:53:28.370688504Z" level=info msg="connecting to shim a3f1fe0a608864ba4a056bdc0fe306301bcfeaa6123722a175c546d830be2b4a" address="unix:///run/containerd/s/04e50e489cd50cecadb06fca0760fffab23279341ea21da8886b4bc8795aa08c" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:53:28.373020 containerd[1640]: time="2025-09-12T22:53:28.372996917Z" level=info msg="CreateContainer within sandbox \"08389cb333f66ee98cc08e1153736172126123ec988d19fe4c7f231c0f0b5523\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"bd9c6d789b19c82a289bd0900d0f2ade096a88075d8eead6d53231ca47f9613a\"" Sep 12 22:53:28.373978 containerd[1640]: time="2025-09-12T22:53:28.373889214Z" level=info msg="StartContainer for \"bd9c6d789b19c82a289bd0900d0f2ade096a88075d8eead6d53231ca47f9613a\"" Sep 12 22:53:28.376479 containerd[1640]: time="2025-09-12T22:53:28.376456535Z" level=info msg="connecting to shim 
bd9c6d789b19c82a289bd0900d0f2ade096a88075d8eead6d53231ca47f9613a" address="unix:///run/containerd/s/6af9e480c6402d414892c7bb07b4502cd565bff52beb4ba95cc68d5feda22e22" protocol=ttrpc version=3 Sep 12 22:53:28.394145 systemd[1]: Started cri-containerd-a3f1fe0a608864ba4a056bdc0fe306301bcfeaa6123722a175c546d830be2b4a.scope - libcontainer container a3f1fe0a608864ba4a056bdc0fe306301bcfeaa6123722a175c546d830be2b4a. Sep 12 22:53:28.397722 systemd[1]: Started cri-containerd-bd9c6d789b19c82a289bd0900d0f2ade096a88075d8eead6d53231ca47f9613a.scope - libcontainer container bd9c6d789b19c82a289bd0900d0f2ade096a88075d8eead6d53231ca47f9613a. Sep 12 22:53:28.430452 containerd[1640]: time="2025-09-12T22:53:28.430235003Z" level=info msg="StartContainer for \"bd9c6d789b19c82a289bd0900d0f2ade096a88075d8eead6d53231ca47f9613a\" returns successfully" Sep 12 22:53:28.439431 containerd[1640]: time="2025-09-12T22:53:28.439402480Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-476xv,Uid:6b46909c-c7f7-4e95-ac20-0c323aee95f3,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"a3f1fe0a608864ba4a056bdc0fe306301bcfeaa6123722a175c546d830be2b4a\"" Sep 12 22:53:28.440271 containerd[1640]: time="2025-09-12T22:53:28.440240402Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 12 22:53:28.681099 kubelet[2938]: I0912 22:53:28.680861 2938 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-557v8" podStartSLOduration=1.680831892 podStartE2EDuration="1.680831892s" podCreationTimestamp="2025-09-12 22:53:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:53:28.68079653 +0000 UTC m=+7.413301112" watchObservedRunningTime="2025-09-12 22:53:28.680831892 +0000 UTC m=+7.413336473" Sep 12 22:53:29.170906 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount813706547.mount: Deactivated successfully. 
Sep 12 22:53:30.244839 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3879993171.mount: Deactivated successfully. Sep 12 22:53:31.597404 containerd[1640]: time="2025-09-12T22:53:31.596938369Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:53:31.597404 containerd[1640]: time="2025-09-12T22:53:31.597356297Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609" Sep 12 22:53:31.597404 containerd[1640]: time="2025-09-12T22:53:31.597378862Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:53:31.598480 containerd[1640]: time="2025-09-12T22:53:31.598466724Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:53:31.598880 containerd[1640]: time="2025-09-12T22:53:31.598865263Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 3.15860813s" Sep 12 22:53:31.598910 containerd[1640]: time="2025-09-12T22:53:31.598883118Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\"" Sep 12 22:53:31.600135 containerd[1640]: time="2025-09-12T22:53:31.600122270Z" level=info msg="CreateContainer within sandbox \"a3f1fe0a608864ba4a056bdc0fe306301bcfeaa6123722a175c546d830be2b4a\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 12 
22:53:31.603518 containerd[1640]: time="2025-09-12T22:53:31.603496971Z" level=info msg="Container 3773932ee78bfa9e87959ef7e4d7cb014931f5359237f6311e1e0990b0f2d86f: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:53:31.613974 containerd[1640]: time="2025-09-12T22:53:31.613954372Z" level=info msg="CreateContainer within sandbox \"a3f1fe0a608864ba4a056bdc0fe306301bcfeaa6123722a175c546d830be2b4a\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"3773932ee78bfa9e87959ef7e4d7cb014931f5359237f6311e1e0990b0f2d86f\"" Sep 12 22:53:31.614584 containerd[1640]: time="2025-09-12T22:53:31.614557133Z" level=info msg="StartContainer for \"3773932ee78bfa9e87959ef7e4d7cb014931f5359237f6311e1e0990b0f2d86f\"" Sep 12 22:53:31.615138 containerd[1640]: time="2025-09-12T22:53:31.615111400Z" level=info msg="connecting to shim 3773932ee78bfa9e87959ef7e4d7cb014931f5359237f6311e1e0990b0f2d86f" address="unix:///run/containerd/s/04e50e489cd50cecadb06fca0760fffab23279341ea21da8886b4bc8795aa08c" protocol=ttrpc version=3 Sep 12 22:53:31.634148 systemd[1]: Started cri-containerd-3773932ee78bfa9e87959ef7e4d7cb014931f5359237f6311e1e0990b0f2d86f.scope - libcontainer container 3773932ee78bfa9e87959ef7e4d7cb014931f5359237f6311e1e0990b0f2d86f. 
Sep 12 22:53:31.652033 containerd[1640]: time="2025-09-12T22:53:31.651975538Z" level=info msg="StartContainer for \"3773932ee78bfa9e87959ef7e4d7cb014931f5359237f6311e1e0990b0f2d86f\" returns successfully" Sep 12 22:53:31.682999 kubelet[2938]: I0912 22:53:31.682961 2938 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-58fc44c59b-476xv" podStartSLOduration=0.523437699 podStartE2EDuration="3.682950387s" podCreationTimestamp="2025-09-12 22:53:28 +0000 UTC" firstStartedPulling="2025-09-12 22:53:28.439973857 +0000 UTC m=+7.172478428" lastFinishedPulling="2025-09-12 22:53:31.599486543 +0000 UTC m=+10.331991116" observedRunningTime="2025-09-12 22:53:31.682178681 +0000 UTC m=+10.414683263" watchObservedRunningTime="2025-09-12 22:53:31.682950387 +0000 UTC m=+10.415454965" Sep 12 22:53:34.098249 systemd[1]: cri-containerd-3773932ee78bfa9e87959ef7e4d7cb014931f5359237f6311e1e0990b0f2d86f.scope: Deactivated successfully. Sep 12 22:53:34.135057 containerd[1640]: time="2025-09-12T22:53:34.133532147Z" level=info msg="received exit event container_id:\"3773932ee78bfa9e87959ef7e4d7cb014931f5359237f6311e1e0990b0f2d86f\" id:\"3773932ee78bfa9e87959ef7e4d7cb014931f5359237f6311e1e0990b0f2d86f\" pid:3256 exit_status:1 exited_at:{seconds:1757717614 nanos:100036689}" Sep 12 22:53:34.149075 containerd[1640]: time="2025-09-12T22:53:34.147817112Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3773932ee78bfa9e87959ef7e4d7cb014931f5359237f6311e1e0990b0f2d86f\" id:\"3773932ee78bfa9e87959ef7e4d7cb014931f5359237f6311e1e0990b0f2d86f\" pid:3256 exit_status:1 exited_at:{seconds:1757717614 nanos:100036689}" Sep 12 22:53:34.169586 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3773932ee78bfa9e87959ef7e4d7cb014931f5359237f6311e1e0990b0f2d86f-rootfs.mount: Deactivated successfully. 
Sep 12 22:53:34.702059 kubelet[2938]: I0912 22:53:34.701492 2938 scope.go:117] "RemoveContainer" containerID="3773932ee78bfa9e87959ef7e4d7cb014931f5359237f6311e1e0990b0f2d86f" Sep 12 22:53:34.703979 containerd[1640]: time="2025-09-12T22:53:34.703937753Z" level=info msg="CreateContainer within sandbox \"a3f1fe0a608864ba4a056bdc0fe306301bcfeaa6123722a175c546d830be2b4a\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Sep 12 22:53:34.716649 containerd[1640]: time="2025-09-12T22:53:34.716136142Z" level=info msg="Container 90492b34693e9f8cf60b96e7d98a1e93096485715b3226b5c7e0860aae53eaa0: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:53:34.717575 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3861430175.mount: Deactivated successfully. Sep 12 22:53:34.721557 containerd[1640]: time="2025-09-12T22:53:34.721530675Z" level=info msg="CreateContainer within sandbox \"a3f1fe0a608864ba4a056bdc0fe306301bcfeaa6123722a175c546d830be2b4a\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"90492b34693e9f8cf60b96e7d98a1e93096485715b3226b5c7e0860aae53eaa0\"" Sep 12 22:53:34.722596 containerd[1640]: time="2025-09-12T22:53:34.722419066Z" level=info msg="StartContainer for \"90492b34693e9f8cf60b96e7d98a1e93096485715b3226b5c7e0860aae53eaa0\"" Sep 12 22:53:34.723922 containerd[1640]: time="2025-09-12T22:53:34.723898389Z" level=info msg="connecting to shim 90492b34693e9f8cf60b96e7d98a1e93096485715b3226b5c7e0860aae53eaa0" address="unix:///run/containerd/s/04e50e489cd50cecadb06fca0760fffab23279341ea21da8886b4bc8795aa08c" protocol=ttrpc version=3 Sep 12 22:53:34.744181 systemd[1]: Started cri-containerd-90492b34693e9f8cf60b96e7d98a1e93096485715b3226b5c7e0860aae53eaa0.scope - libcontainer container 90492b34693e9f8cf60b96e7d98a1e93096485715b3226b5c7e0860aae53eaa0. 
Sep 12 22:53:34.768094 containerd[1640]: time="2025-09-12T22:53:34.768033462Z" level=info msg="StartContainer for \"90492b34693e9f8cf60b96e7d98a1e93096485715b3226b5c7e0860aae53eaa0\" returns successfully" Sep 12 22:53:36.830501 sudo[1961]: pam_unix(sudo:session): session closed for user root Sep 12 22:53:36.831411 sshd[1960]: Connection closed by 147.75.109.163 port 46236 Sep 12 22:53:36.832659 sshd-session[1957]: pam_unix(sshd:session): session closed for user core Sep 12 22:53:36.835270 systemd-logind[1620]: Session 9 logged out. Waiting for processes to exit. Sep 12 22:53:36.835806 systemd[1]: sshd@6-139.178.70.110:22-147.75.109.163:46236.service: Deactivated successfully. Sep 12 22:53:36.837348 systemd[1]: session-9.scope: Deactivated successfully. Sep 12 22:53:36.837586 systemd[1]: session-9.scope: Consumed 3.089s CPU time, 151.1M memory peak. Sep 12 22:53:36.839449 systemd-logind[1620]: Removed session 9. Sep 12 22:53:39.982920 systemd[1]: Created slice kubepods-besteffort-pod87516dca_6dc0_42fb_a64c_0458c8ebb5ce.slice - libcontainer container kubepods-besteffort-pod87516dca_6dc0_42fb_a64c_0458c8ebb5ce.slice. 
Sep 12 22:53:40.039371 kubelet[2938]: I0912 22:53:40.039208 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87516dca-6dc0-42fb-a64c-0458c8ebb5ce-tigera-ca-bundle\") pod \"calico-typha-568c494585-clkmz\" (UID: \"87516dca-6dc0-42fb-a64c-0458c8ebb5ce\") " pod="calico-system/calico-typha-568c494585-clkmz" Sep 12 22:53:40.039371 kubelet[2938]: I0912 22:53:40.039239 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/87516dca-6dc0-42fb-a64c-0458c8ebb5ce-typha-certs\") pod \"calico-typha-568c494585-clkmz\" (UID: \"87516dca-6dc0-42fb-a64c-0458c8ebb5ce\") " pod="calico-system/calico-typha-568c494585-clkmz" Sep 12 22:53:40.039371 kubelet[2938]: I0912 22:53:40.039251 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mp5t\" (UniqueName: \"kubernetes.io/projected/87516dca-6dc0-42fb-a64c-0458c8ebb5ce-kube-api-access-5mp5t\") pod \"calico-typha-568c494585-clkmz\" (UID: \"87516dca-6dc0-42fb-a64c-0458c8ebb5ce\") " pod="calico-system/calico-typha-568c494585-clkmz" Sep 12 22:53:40.133567 kubelet[2938]: W0912 22:53:40.133480 2938 reflector.go:561] object-"calico-system"/"cni-config": failed to list *v1.ConfigMap: configmaps "cni-config" is forbidden: User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'localhost' and this object Sep 12 22:53:40.133567 kubelet[2938]: E0912 22:53:40.133505 2938 reflector.go:158] "Unhandled Error" err="object-\"calico-system\"/\"cni-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"cni-config\" is forbidden: User \"system:node:localhost\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found 
between node 'localhost' and this object" logger="UnhandledError" Sep 12 22:53:40.133567 kubelet[2938]: W0912 22:53:40.133510 2938 reflector.go:561] object-"calico-system"/"node-certs": failed to list *v1.Secret: secrets "node-certs" is forbidden: User "system:node:localhost" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'localhost' and this object Sep 12 22:53:40.133567 kubelet[2938]: E0912 22:53:40.133525 2938 reflector.go:158] "Unhandled Error" err="object-\"calico-system\"/\"node-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"node-certs\" is forbidden: User \"system:node:localhost\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'localhost' and this object" logger="UnhandledError" Sep 12 22:53:40.137250 systemd[1]: Created slice kubepods-besteffort-poda1ab6f66_9d67_48b4_a1d3_5d0a253aa8b1.slice - libcontainer container kubepods-besteffort-poda1ab6f66_9d67_48b4_a1d3_5d0a253aa8b1.slice. 
Sep 12 22:53:40.240519 kubelet[2938]: I0912 22:53:40.240362 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/a1ab6f66-9d67-48b4-a1d3-5d0a253aa8b1-cni-bin-dir\") pod \"calico-node-ps95x\" (UID: \"a1ab6f66-9d67-48b4-a1d3-5d0a253aa8b1\") " pod="calico-system/calico-node-ps95x" Sep 12 22:53:40.240519 kubelet[2938]: I0912 22:53:40.240399 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/a1ab6f66-9d67-48b4-a1d3-5d0a253aa8b1-cni-net-dir\") pod \"calico-node-ps95x\" (UID: \"a1ab6f66-9d67-48b4-a1d3-5d0a253aa8b1\") " pod="calico-system/calico-node-ps95x" Sep 12 22:53:40.240519 kubelet[2938]: I0912 22:53:40.240411 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/a1ab6f66-9d67-48b4-a1d3-5d0a253aa8b1-cni-log-dir\") pod \"calico-node-ps95x\" (UID: \"a1ab6f66-9d67-48b4-a1d3-5d0a253aa8b1\") " pod="calico-system/calico-node-ps95x" Sep 12 22:53:40.240519 kubelet[2938]: I0912 22:53:40.240420 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1ab6f66-9d67-48b4-a1d3-5d0a253aa8b1-tigera-ca-bundle\") pod \"calico-node-ps95x\" (UID: \"a1ab6f66-9d67-48b4-a1d3-5d0a253aa8b1\") " pod="calico-system/calico-node-ps95x" Sep 12 22:53:40.240519 kubelet[2938]: I0912 22:53:40.240431 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/a1ab6f66-9d67-48b4-a1d3-5d0a253aa8b1-var-run-calico\") pod \"calico-node-ps95x\" (UID: \"a1ab6f66-9d67-48b4-a1d3-5d0a253aa8b1\") " pod="calico-system/calico-node-ps95x" Sep 12 22:53:40.240673 kubelet[2938]: I0912 22:53:40.240440 2938 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/a1ab6f66-9d67-48b4-a1d3-5d0a253aa8b1-policysync\") pod \"calico-node-ps95x\" (UID: \"a1ab6f66-9d67-48b4-a1d3-5d0a253aa8b1\") " pod="calico-system/calico-node-ps95x" Sep 12 22:53:40.240673 kubelet[2938]: I0912 22:53:40.240449 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a1ab6f66-9d67-48b4-a1d3-5d0a253aa8b1-lib-modules\") pod \"calico-node-ps95x\" (UID: \"a1ab6f66-9d67-48b4-a1d3-5d0a253aa8b1\") " pod="calico-system/calico-node-ps95x" Sep 12 22:53:40.241187 kubelet[2938]: I0912 22:53:40.241102 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/a1ab6f66-9d67-48b4-a1d3-5d0a253aa8b1-flexvol-driver-host\") pod \"calico-node-ps95x\" (UID: \"a1ab6f66-9d67-48b4-a1d3-5d0a253aa8b1\") " pod="calico-system/calico-node-ps95x" Sep 12 22:53:40.241187 kubelet[2938]: I0912 22:53:40.241118 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/a1ab6f66-9d67-48b4-a1d3-5d0a253aa8b1-xtables-lock\") pod \"calico-node-ps95x\" (UID: \"a1ab6f66-9d67-48b4-a1d3-5d0a253aa8b1\") " pod="calico-system/calico-node-ps95x" Sep 12 22:53:40.241187 kubelet[2938]: I0912 22:53:40.241127 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlvp7\" (UniqueName: \"kubernetes.io/projected/a1ab6f66-9d67-48b4-a1d3-5d0a253aa8b1-kube-api-access-xlvp7\") pod \"calico-node-ps95x\" (UID: \"a1ab6f66-9d67-48b4-a1d3-5d0a253aa8b1\") " pod="calico-system/calico-node-ps95x" Sep 12 22:53:40.241187 kubelet[2938]: I0912 22:53:40.241153 2938 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/a1ab6f66-9d67-48b4-a1d3-5d0a253aa8b1-node-certs\") pod \"calico-node-ps95x\" (UID: \"a1ab6f66-9d67-48b4-a1d3-5d0a253aa8b1\") " pod="calico-system/calico-node-ps95x" Sep 12 22:53:40.241187 kubelet[2938]: I0912 22:53:40.241164 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a1ab6f66-9d67-48b4-a1d3-5d0a253aa8b1-var-lib-calico\") pod \"calico-node-ps95x\" (UID: \"a1ab6f66-9d67-48b4-a1d3-5d0a253aa8b1\") " pod="calico-system/calico-node-ps95x" Sep 12 22:53:40.289840 containerd[1640]: time="2025-09-12T22:53:40.289813406Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-568c494585-clkmz,Uid:87516dca-6dc0-42fb-a64c-0458c8ebb5ce,Namespace:calico-system,Attempt:0,}" Sep 12 22:53:40.315148 containerd[1640]: time="2025-09-12T22:53:40.315025857Z" level=info msg="connecting to shim e6c873c15828f4309d667874350a47664d8639661ca53782cf8debc15d465341" address="unix:///run/containerd/s/153cc24d7cb9c13a5fdaf3899c099df5a6edc0f09a92f693b25b28b1e89e44a4" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:53:40.340856 systemd[1]: Started cri-containerd-e6c873c15828f4309d667874350a47664d8639661ca53782cf8debc15d465341.scope - libcontainer container e6c873c15828f4309d667874350a47664d8639661ca53782cf8debc15d465341. 
Sep 12 22:53:40.364143 kubelet[2938]: E0912 22:53:40.364016 2938 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7nf98" podUID="25ac81e8-ddd8-420b-9cd2-f9ab5f554e5f" Sep 12 22:53:40.376813 kubelet[2938]: E0912 22:53:40.376792 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:40.376813 kubelet[2938]: W0912 22:53:40.376808 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:40.376905 kubelet[2938]: E0912 22:53:40.376832 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:53:40.431701 containerd[1640]: time="2025-09-12T22:53:40.431679435Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-568c494585-clkmz,Uid:87516dca-6dc0-42fb-a64c-0458c8ebb5ce,Namespace:calico-system,Attempt:0,} returns sandbox id \"e6c873c15828f4309d667874350a47664d8639661ca53782cf8debc15d465341\"" Sep 12 22:53:40.432609 containerd[1640]: time="2025-09-12T22:53:40.432596300Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 12 22:53:40.433579 kubelet[2938]: E0912 22:53:40.433558 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:40.433625 kubelet[2938]: W0912 22:53:40.433576 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:40.433625 kubelet[2938]: E0912 22:53:40.433614 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:53:40.433844 kubelet[2938]: E0912 22:53:40.433744 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:40.433844 kubelet[2938]: W0912 22:53:40.433751 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:40.433844 kubelet[2938]: E0912 22:53:40.433759 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:53:40.433922 kubelet[2938]: E0912 22:53:40.433867 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:40.433922 kubelet[2938]: W0912 22:53:40.433882 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:40.433922 kubelet[2938]: E0912 22:53:40.433889 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:53:40.434257 kubelet[2938]: E0912 22:53:40.433983 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:40.434257 kubelet[2938]: W0912 22:53:40.433990 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:40.434257 kubelet[2938]: E0912 22:53:40.433996 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:53:40.434257 kubelet[2938]: E0912 22:53:40.434150 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:40.434257 kubelet[2938]: W0912 22:53:40.434154 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:40.434257 kubelet[2938]: E0912 22:53:40.434168 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:53:40.434548 kubelet[2938]: E0912 22:53:40.434536 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:40.434548 kubelet[2938]: W0912 22:53:40.434544 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:40.434610 kubelet[2938]: E0912 22:53:40.434551 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:53:40.434821 kubelet[2938]: E0912 22:53:40.434802 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:53:40.434821 kubelet[2938]: W0912 22:53:40.434810 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:53:40.434821 kubelet[2938]: E0912 22:53:40.434816 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:53:40.443512 kubelet[2938]: I0912 22:53:40.443468 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/25ac81e8-ddd8-420b-9cd2-f9ab5f554e5f-socket-dir\") pod \"csi-node-driver-7nf98\" (UID: \"25ac81e8-ddd8-420b-9cd2-f9ab5f554e5f\") " pod="calico-system/csi-node-driver-7nf98"
Sep 12 22:53:40.443692 kubelet[2938]: I0912 22:53:40.443647 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgtmc\" (UniqueName: \"kubernetes.io/projected/25ac81e8-ddd8-420b-9cd2-f9ab5f554e5f-kube-api-access-lgtmc\") pod \"csi-node-driver-7nf98\" (UID: \"25ac81e8-ddd8-420b-9cd2-f9ab5f554e5f\") " pod="calico-system/csi-node-driver-7nf98"
Sep 12 22:53:40.443889 kubelet[2938]: I0912 22:53:40.443845 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/25ac81e8-ddd8-420b-9cd2-f9ab5f554e5f-registration-dir\") pod \"csi-node-driver-7nf98\" (UID: \"25ac81e8-ddd8-420b-9cd2-f9ab5f554e5f\") " pod="calico-system/csi-node-driver-7nf98"
Sep 12 22:53:40.444145 kubelet[2938]: I0912 22:53:40.444129 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/25ac81e8-ddd8-420b-9cd2-f9ab5f554e5f-varrun\") pod \"csi-node-driver-7nf98\" (UID: \"25ac81e8-ddd8-420b-9cd2-f9ab5f554e5f\") " pod="calico-system/csi-node-driver-7nf98"
Sep 12 22:53:40.444804 kubelet[2938]: I0912 22:53:40.444739 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/25ac81e8-ddd8-420b-9cd2-f9ab5f554e5f-kubelet-dir\") pod \"csi-node-driver-7nf98\" (UID: \"25ac81e8-ddd8-420b-9cd2-f9ab5f554e5f\") " pod="calico-system/csi-node-driver-7nf98"
Sep 12 22:53:40.444949 kubelet[2938]: E0912 22:53:40.444842 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:53:40.549759 kubelet[2938]: E0912 22:53:40.549693 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:53:40.549759 kubelet[2938]: W0912 22:53:40.549700 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:53:40.549759 kubelet[2938]: E0912 22:53:40.549706 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:53:40.549961 kubelet[2938]: E0912 22:53:40.549801 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:40.549961 kubelet[2938]: W0912 22:53:40.549806 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:40.549961 kubelet[2938]: E0912 22:53:40.549811 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:53:40.550123 kubelet[2938]: E0912 22:53:40.550096 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:40.550123 kubelet[2938]: W0912 22:53:40.550102 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:40.550123 kubelet[2938]: E0912 22:53:40.550108 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:53:40.554322 kubelet[2938]: E0912 22:53:40.553619 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:40.554322 kubelet[2938]: W0912 22:53:40.553631 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:40.554322 kubelet[2938]: E0912 22:53:40.553656 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:53:40.556145 kubelet[2938]: E0912 22:53:40.556136 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:40.556208 kubelet[2938]: W0912 22:53:40.556200 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:40.558424 kubelet[2938]: E0912 22:53:40.556277 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:53:40.647935 kubelet[2938]: E0912 22:53:40.647881 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:40.647935 kubelet[2938]: W0912 22:53:40.647897 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:40.647935 kubelet[2938]: E0912 22:53:40.647910 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:53:40.749220 kubelet[2938]: E0912 22:53:40.748345 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:40.749220 kubelet[2938]: W0912 22:53:40.748361 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:40.749220 kubelet[2938]: E0912 22:53:40.748377 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:53:40.848873 kubelet[2938]: E0912 22:53:40.848791 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:40.848873 kubelet[2938]: W0912 22:53:40.848807 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:40.848873 kubelet[2938]: E0912 22:53:40.848826 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:53:40.949768 kubelet[2938]: E0912 22:53:40.949743 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:40.949768 kubelet[2938]: W0912 22:53:40.949762 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:40.949913 kubelet[2938]: E0912 22:53:40.949781 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:53:41.051154 kubelet[2938]: E0912 22:53:41.051092 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:41.051154 kubelet[2938]: W0912 22:53:41.051111 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:41.051154 kubelet[2938]: E0912 22:53:41.051128 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:53:41.152165 kubelet[2938]: E0912 22:53:41.152023 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:41.152165 kubelet[2938]: W0912 22:53:41.152062 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:41.152165 kubelet[2938]: E0912 22:53:41.152080 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:53:41.252755 kubelet[2938]: E0912 22:53:41.252696 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:41.252755 kubelet[2938]: W0912 22:53:41.252712 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:41.252755 kubelet[2938]: E0912 22:53:41.252727 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:53:41.317890 kubelet[2938]: E0912 22:53:41.317861 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:41.317890 kubelet[2938]: W0912 22:53:41.317874 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:41.317890 kubelet[2938]: E0912 22:53:41.317889 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:53:41.340900 containerd[1640]: time="2025-09-12T22:53:41.340854840Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-ps95x,Uid:a1ab6f66-9d67-48b4-a1d3-5d0a253aa8b1,Namespace:calico-system,Attempt:0,}" Sep 12 22:53:41.353710 containerd[1640]: time="2025-09-12T22:53:41.353619081Z" level=info msg="connecting to shim 47a2f1c559ba16647a87cf941a233aaee637761090312ea839659d28b8a3124a" address="unix:///run/containerd/s/ff522ef6110611581dc1b0b56e2ba8edad2f1c7b4158e8b6d414d88a19d46d39" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:53:41.381200 systemd[1]: Started cri-containerd-47a2f1c559ba16647a87cf941a233aaee637761090312ea839659d28b8a3124a.scope - libcontainer container 47a2f1c559ba16647a87cf941a233aaee637761090312ea839659d28b8a3124a. Sep 12 22:53:41.419831 containerd[1640]: time="2025-09-12T22:53:41.419754109Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-ps95x,Uid:a1ab6f66-9d67-48b4-a1d3-5d0a253aa8b1,Namespace:calico-system,Attempt:0,} returns sandbox id \"47a2f1c559ba16647a87cf941a233aaee637761090312ea839659d28b8a3124a\"" Sep 12 22:53:41.647390 kubelet[2938]: E0912 22:53:41.647160 2938 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7nf98" podUID="25ac81e8-ddd8-420b-9cd2-f9ab5f554e5f" Sep 12 22:53:42.170107 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount984998372.mount: Deactivated successfully. 
Sep 12 22:53:42.871979 containerd[1640]: time="2025-09-12T22:53:42.871950914Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:53:42.875514 containerd[1640]: time="2025-09-12T22:53:42.875478059Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389" Sep 12 22:53:42.876683 containerd[1640]: time="2025-09-12T22:53:42.875551598Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:53:42.877571 containerd[1640]: time="2025-09-12T22:53:42.877554666Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 2.44442889s" Sep 12 22:53:42.877619 containerd[1640]: time="2025-09-12T22:53:42.877611739Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\"" Sep 12 22:53:42.877795 containerd[1640]: time="2025-09-12T22:53:42.877778242Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:53:42.889420 containerd[1640]: time="2025-09-12T22:53:42.889192375Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 12 22:53:42.903208 containerd[1640]: time="2025-09-12T22:53:42.903186593Z" level=info msg="CreateContainer within sandbox \"e6c873c15828f4309d667874350a47664d8639661ca53782cf8debc15d465341\" for 
container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 12 22:53:42.936550 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2968659156.mount: Deactivated successfully. Sep 12 22:53:42.937647 containerd[1640]: time="2025-09-12T22:53:42.937354496Z" level=info msg="Container 1dc9e49777679412b8735f6e779def783448dedc2c564abf5eee61652d1cff63: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:53:42.977165 containerd[1640]: time="2025-09-12T22:53:42.977100624Z" level=info msg="CreateContainer within sandbox \"e6c873c15828f4309d667874350a47664d8639661ca53782cf8debc15d465341\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"1dc9e49777679412b8735f6e779def783448dedc2c564abf5eee61652d1cff63\"" Sep 12 22:53:42.977580 containerd[1640]: time="2025-09-12T22:53:42.977566066Z" level=info msg="StartContainer for \"1dc9e49777679412b8735f6e779def783448dedc2c564abf5eee61652d1cff63\"" Sep 12 22:53:42.978361 containerd[1640]: time="2025-09-12T22:53:42.978296338Z" level=info msg="connecting to shim 1dc9e49777679412b8735f6e779def783448dedc2c564abf5eee61652d1cff63" address="unix:///run/containerd/s/153cc24d7cb9c13a5fdaf3899c099df5a6edc0f09a92f693b25b28b1e89e44a4" protocol=ttrpc version=3 Sep 12 22:53:42.996190 systemd[1]: Started cri-containerd-1dc9e49777679412b8735f6e779def783448dedc2c564abf5eee61652d1cff63.scope - libcontainer container 1dc9e49777679412b8735f6e779def783448dedc2c564abf5eee61652d1cff63. 
Sep 12 22:53:43.031987 containerd[1640]: time="2025-09-12T22:53:43.031962758Z" level=info msg="StartContainer for \"1dc9e49777679412b8735f6e779def783448dedc2c564abf5eee61652d1cff63\" returns successfully" Sep 12 22:53:43.654012 kubelet[2938]: E0912 22:53:43.653977 2938 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7nf98" podUID="25ac81e8-ddd8-420b-9cd2-f9ab5f554e5f" Sep 12 22:53:43.786611 kubelet[2938]: I0912 22:53:43.786567 2938 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-568c494585-clkmz" podStartSLOduration=2.3298969019999998 podStartE2EDuration="4.786554285s" podCreationTimestamp="2025-09-12 22:53:39 +0000 UTC" firstStartedPulling="2025-09-12 22:53:40.432463461 +0000 UTC m=+19.164968033" lastFinishedPulling="2025-09-12 22:53:42.889120843 +0000 UTC m=+21.621625416" observedRunningTime="2025-09-12 22:53:43.785704593 +0000 UTC m=+22.518209174" watchObservedRunningTime="2025-09-12 22:53:43.786554285 +0000 UTC m=+22.519058862" Sep 12 22:53:43.795094 kubelet[2938]: E0912 22:53:43.795023 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:43.795094 kubelet[2938]: W0912 22:53:43.795061 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:43.795094 kubelet[2938]: E0912 22:53:43.795076 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:53:43.795226 kubelet[2938]: E0912 22:53:43.795213 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:43.795226 kubelet[2938]: W0912 22:53:43.795222 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:43.795268 kubelet[2938]: E0912 22:53:43.795227 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:53:43.795318 kubelet[2938]: E0912 22:53:43.795307 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:43.795318 kubelet[2938]: W0912 22:53:43.795314 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:43.795359 kubelet[2938]: E0912 22:53:43.795320 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:53:43.795409 kubelet[2938]: E0912 22:53:43.795398 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:43.795409 kubelet[2938]: W0912 22:53:43.795405 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:43.795451 kubelet[2938]: E0912 22:53:43.795410 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:53:43.795510 kubelet[2938]: E0912 22:53:43.795499 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:43.795535 kubelet[2938]: W0912 22:53:43.795510 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:43.795535 kubelet[2938]: E0912 22:53:43.795516 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:53:43.795674 kubelet[2938]: E0912 22:53:43.795662 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:43.795674 kubelet[2938]: W0912 22:53:43.795670 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:43.795720 kubelet[2938]: E0912 22:53:43.795675 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:53:43.795874 kubelet[2938]: E0912 22:53:43.795862 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:43.795874 kubelet[2938]: W0912 22:53:43.795870 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:43.795915 kubelet[2938]: E0912 22:53:43.795876 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:53:43.795969 kubelet[2938]: E0912 22:53:43.795957 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:43.796049 kubelet[2938]: W0912 22:53:43.796027 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:43.796074 kubelet[2938]: E0912 22:53:43.796053 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:53:43.796151 kubelet[2938]: E0912 22:53:43.796140 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:43.796176 kubelet[2938]: W0912 22:53:43.796149 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:43.796176 kubelet[2938]: E0912 22:53:43.796172 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:53:43.796262 kubelet[2938]: E0912 22:53:43.796251 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:43.796286 kubelet[2938]: W0912 22:53:43.796263 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:43.796286 kubelet[2938]: E0912 22:53:43.796268 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:53:43.796442 kubelet[2938]: E0912 22:53:43.796431 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:43.796442 kubelet[2938]: W0912 22:53:43.796442 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:43.796483 kubelet[2938]: E0912 22:53:43.796447 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:53:43.796587 kubelet[2938]: E0912 22:53:43.796577 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:43.796587 kubelet[2938]: W0912 22:53:43.796584 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:43.796628 kubelet[2938]: E0912 22:53:43.796589 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:53:43.796722 kubelet[2938]: E0912 22:53:43.796711 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:43.796722 kubelet[2938]: W0912 22:53:43.796719 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:43.796762 kubelet[2938]: E0912 22:53:43.796724 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:53:43.796871 kubelet[2938]: E0912 22:53:43.796860 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:43.796871 kubelet[2938]: W0912 22:53:43.796869 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:43.796914 kubelet[2938]: E0912 22:53:43.796874 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:53:43.797027 kubelet[2938]: E0912 22:53:43.797016 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:43.797027 kubelet[2938]: W0912 22:53:43.797024 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:43.797098 kubelet[2938]: E0912 22:53:43.797029 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:53:43.895547 kubelet[2938]: E0912 22:53:43.895393 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:43.895547 kubelet[2938]: W0912 22:53:43.895434 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:43.895547 kubelet[2938]: E0912 22:53:43.895452 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:53:43.895948 kubelet[2938]: E0912 22:53:43.895881 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:43.895948 kubelet[2938]: W0912 22:53:43.895889 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:43.895948 kubelet[2938]: E0912 22:53:43.895904 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:53:43.901661 kubelet[2938]: E0912 22:53:43.896024 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:43.901661 kubelet[2938]: W0912 22:53:43.896033 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:43.901661 kubelet[2938]: E0912 22:53:43.896057 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:53:43.901661 kubelet[2938]: E0912 22:53:43.896140 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:53:43.901661 kubelet[2938]: W0912 22:53:43.896145 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:53:43.901661 kubelet[2938]: E0912 22:53:43.896150 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 12 22:53:43.901661 kubelet[2938]: E0912 22:53:43.896216 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:53:43.901661 kubelet[2938]: W0912 22:53:43.896220 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:53:43.901661 kubelet[2938]: E0912 22:53:43.896225 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:53:43.901661 kubelet[2938]: E0912 22:53:43.896346 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:53:43.901971 kubelet[2938]: W0912 22:53:43.896351 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:53:43.901971 kubelet[2938]: E0912 22:53:43.896359 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:53:43.901971 kubelet[2938]: E0912 22:53:43.896476 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:53:43.901971 kubelet[2938]: W0912 22:53:43.896482 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:53:43.901971 kubelet[2938]: E0912 22:53:43.896490 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:53:43.901971 kubelet[2938]: E0912 22:53:43.896605 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:53:43.901971 kubelet[2938]: W0912 22:53:43.896611 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:53:43.901971 kubelet[2938]: E0912 22:53:43.896617 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:53:43.901971 kubelet[2938]: E0912 22:53:43.896727 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:53:43.901971 kubelet[2938]: W0912 22:53:43.896732 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:53:43.902253 kubelet[2938]: E0912 22:53:43.896745 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:53:43.902253 kubelet[2938]: E0912 22:53:43.896892 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:53:43.902253 kubelet[2938]: W0912 22:53:43.896898 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:53:43.902253 kubelet[2938]: E0912 22:53:43.896909 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:53:43.902253 kubelet[2938]: E0912 22:53:43.897019 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:53:43.902253 kubelet[2938]: W0912 22:53:43.897023 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:53:43.902253 kubelet[2938]: E0912 22:53:43.897033 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:53:43.902253 kubelet[2938]: E0912 22:53:43.897144 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:53:43.902253 kubelet[2938]: W0912 22:53:43.897149 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:53:43.902253 kubelet[2938]: E0912 22:53:43.897156 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:53:43.902481 kubelet[2938]: E0912 22:53:43.897260 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:53:43.902481 kubelet[2938]: W0912 22:53:43.897265 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:53:43.902481 kubelet[2938]: E0912 22:53:43.897421 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:53:43.902481 kubelet[2938]: E0912 22:53:43.897468 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:53:43.902481 kubelet[2938]: W0912 22:53:43.897473 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:53:43.902481 kubelet[2938]: E0912 22:53:43.897487 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:53:43.902481 kubelet[2938]: E0912 22:53:43.897595 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:53:43.902481 kubelet[2938]: W0912 22:53:43.897599 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:53:43.902481 kubelet[2938]: E0912 22:53:43.897604 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:53:43.902481 kubelet[2938]: E0912 22:53:43.897716 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:53:43.911365 kubelet[2938]: W0912 22:53:43.897722 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:53:43.911365 kubelet[2938]: E0912 22:53:43.897727 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:53:43.911365 kubelet[2938]: E0912 22:53:43.897845 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:53:43.911365 kubelet[2938]: W0912 22:53:43.897869 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:53:43.911365 kubelet[2938]: E0912 22:53:43.897879 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:53:43.911365 kubelet[2938]: E0912 22:53:43.898141 2938 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:53:43.911365 kubelet[2938]: W0912 22:53:43.898148 2938 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:53:43.911365 kubelet[2938]: E0912 22:53:43.898154 2938 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:53:44.498921 containerd[1640]: time="2025-09-12T22:53:44.498519734Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:53:44.499487 containerd[1640]: time="2025-09-12T22:53:44.499463782Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660"
Sep 12 22:53:44.499791 containerd[1640]: time="2025-09-12T22:53:44.499779426Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:53:44.501350 containerd[1640]: time="2025-09-12T22:53:44.501333507Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:53:44.501978 containerd[1640]: time="2025-09-12T22:53:44.501637445Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.612421209s"
Sep 12 22:53:44.501978 containerd[1640]: time="2025-09-12T22:53:44.501829368Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\""
Sep 12 22:53:44.503423 containerd[1640]: time="2025-09-12T22:53:44.503408074Z" level=info msg="CreateContainer within sandbox \"47a2f1c559ba16647a87cf941a233aaee637761090312ea839659d28b8a3124a\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Sep 12 22:53:44.544549 containerd[1640]: time="2025-09-12T22:53:44.543876628Z" level=info msg="Container 6d6537810139d1fbbd11acdb9a1f208ef8eb559ef275af71b64d8d100f3e6889: CDI devices from CRI Config.CDIDevices: []"
Sep 12 22:53:44.546031 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1180442666.mount: Deactivated successfully.
Sep 12 22:53:44.588675 containerd[1640]: time="2025-09-12T22:53:44.588529678Z" level=info msg="CreateContainer within sandbox \"47a2f1c559ba16647a87cf941a233aaee637761090312ea839659d28b8a3124a\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"6d6537810139d1fbbd11acdb9a1f208ef8eb559ef275af71b64d8d100f3e6889\""
Sep 12 22:53:44.590066 containerd[1640]: time="2025-09-12T22:53:44.589221688Z" level=info msg="StartContainer for \"6d6537810139d1fbbd11acdb9a1f208ef8eb559ef275af71b64d8d100f3e6889\""
Sep 12 22:53:44.590186 containerd[1640]: time="2025-09-12T22:53:44.590171717Z" level=info msg="connecting to shim 6d6537810139d1fbbd11acdb9a1f208ef8eb559ef275af71b64d8d100f3e6889" address="unix:///run/containerd/s/ff522ef6110611581dc1b0b56e2ba8edad2f1c7b4158e8b6d414d88a19d46d39" protocol=ttrpc version=3
Sep 12 22:53:44.609178 systemd[1]: Started cri-containerd-6d6537810139d1fbbd11acdb9a1f208ef8eb559ef275af71b64d8d100f3e6889.scope - libcontainer container 6d6537810139d1fbbd11acdb9a1f208ef8eb559ef275af71b64d8d100f3e6889.
Sep 12 22:53:44.648063 containerd[1640]: time="2025-09-12T22:53:44.647972407Z" level=info msg="StartContainer for \"6d6537810139d1fbbd11acdb9a1f208ef8eb559ef275af71b64d8d100f3e6889\" returns successfully"
Sep 12 22:53:44.654054 systemd[1]: cri-containerd-6d6537810139d1fbbd11acdb9a1f208ef8eb559ef275af71b64d8d100f3e6889.scope: Deactivated successfully.
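The repeated FlexVolume failures above are one chain of causes: the driver binary under `/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds` does not exist, so the `init` call produces no stdout, and kubelet's attempt to decode that empty output as JSON fails with "unexpected end of JSON input". A minimal Python sketch of that decode step (kubelet itself is Go; the driver path is copied from the log, and the fallback status object is illustrative, not kubelet's exact structure):

```python
import json
import subprocess

def call_flexvolume_driver(executable, args):
    """Mimic kubelet's FlexVolume driver call: run the driver binary
    and decode its stdout as a JSON status object."""
    try:
        out = subprocess.run([executable, *args],
                             capture_output=True, text=True).stdout
    except (FileNotFoundError, NotADirectoryError):
        # Analogue of "executable file not found in $PATH": the call
        # yields empty output, exactly as in the log above.
        out = ""
    try:
        # A healthy driver prints e.g. {"status": "Success", ...}
        return json.loads(out)
    except json.JSONDecodeError as exc:
        # json.loads("") raising here is Python's analogue of
        # Go's "unexpected end of JSON input" on empty output.
        return {"status": "Failure",
                "message": f"Failed to unmarshal output: {exc}"}

result = call_flexvolume_driver(
    "/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds",
    ["init"])
```

On a node where the driver is absent, `result` carries the Failure status; kubelet logs the same condition once per plugin-probe cycle, which is why the triplet of messages repeats.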
Sep 12 22:53:44.662716 containerd[1640]: time="2025-09-12T22:53:44.662677484Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6d6537810139d1fbbd11acdb9a1f208ef8eb559ef275af71b64d8d100f3e6889\" id:\"6d6537810139d1fbbd11acdb9a1f208ef8eb559ef275af71b64d8d100f3e6889\" pid:3655 exited_at:{seconds:1757717624 nanos:656388457}"
Sep 12 22:53:44.663718 containerd[1640]: time="2025-09-12T22:53:44.663687151Z" level=info msg="received exit event container_id:\"6d6537810139d1fbbd11acdb9a1f208ef8eb559ef275af71b64d8d100f3e6889\" id:\"6d6537810139d1fbbd11acdb9a1f208ef8eb559ef275af71b64d8d100f3e6889\" pid:3655 exited_at:{seconds:1757717624 nanos:656388457}"
Sep 12 22:53:44.681121 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-6d6537810139d1fbbd11acdb9a1f208ef8eb559ef275af71b64d8d100f3e6889-rootfs.mount: Deactivated successfully.
Sep 12 22:53:44.874293 kubelet[2938]: I0912 22:53:44.873977 2938 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 12 22:53:45.646729 kubelet[2938]: E0912 22:53:45.646491 2938 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7nf98" podUID="25ac81e8-ddd8-420b-9cd2-f9ab5f554e5f"
Sep 12 22:53:45.783429 containerd[1640]: time="2025-09-12T22:53:45.783405192Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\""
Sep 12 22:53:47.664524 kubelet[2938]: E0912 22:53:47.664475 2938 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7nf98" podUID="25ac81e8-ddd8-420b-9cd2-f9ab5f554e5f"
Sep 12 22:53:49.646280 kubelet[2938]: E0912 22:53:49.646251 2938 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7nf98" podUID="25ac81e8-ddd8-420b-9cd2-f9ab5f554e5f"
Sep 12 22:53:49.989573 containerd[1640]: time="2025-09-12T22:53:49.989514928Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:53:49.990270 containerd[1640]: time="2025-09-12T22:53:49.990124745Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613"
Sep 12 22:53:49.990582 containerd[1640]: time="2025-09-12T22:53:49.990546584Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:53:49.992733 containerd[1640]: time="2025-09-12T22:53:49.992091861Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:53:49.992733 containerd[1640]: time="2025-09-12T22:53:49.992655311Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 4.209227581s"
Sep 12 22:53:49.992733 containerd[1640]: time="2025-09-12T22:53:49.992676415Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\""
Sep 12 22:53:49.994595 containerd[1640]: time="2025-09-12T22:53:49.994572092Z" level=info msg="CreateContainer within sandbox \"47a2f1c559ba16647a87cf941a233aaee637761090312ea839659d28b8a3124a\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Sep 12 22:53:50.002284 containerd[1640]: time="2025-09-12T22:53:50.002236488Z" level=info msg="Container 672d76dac7b0cf8a8664a01404dfbe233b649a9d29f62af508dfd2399e2a4e2b: CDI devices from CRI Config.CDIDevices: []"
Sep 12 22:53:50.010255 containerd[1640]: time="2025-09-12T22:53:50.010222732Z" level=info msg="CreateContainer within sandbox \"47a2f1c559ba16647a87cf941a233aaee637761090312ea839659d28b8a3124a\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"672d76dac7b0cf8a8664a01404dfbe233b649a9d29f62af508dfd2399e2a4e2b\""
Sep 12 22:53:50.015141 containerd[1640]: time="2025-09-12T22:53:50.015067584Z" level=info msg="StartContainer for \"672d76dac7b0cf8a8664a01404dfbe233b649a9d29f62af508dfd2399e2a4e2b\""
Sep 12 22:53:50.016611 containerd[1640]: time="2025-09-12T22:53:50.016568935Z" level=info msg="connecting to shim 672d76dac7b0cf8a8664a01404dfbe233b649a9d29f62af508dfd2399e2a4e2b" address="unix:///run/containerd/s/ff522ef6110611581dc1b0b56e2ba8edad2f1c7b4158e8b6d414d88a19d46d39" protocol=ttrpc version=3
Sep 12 22:53:50.036180 systemd[1]: Started cri-containerd-672d76dac7b0cf8a8664a01404dfbe233b649a9d29f62af508dfd2399e2a4e2b.scope - libcontainer container 672d76dac7b0cf8a8664a01404dfbe233b649a9d29f62af508dfd2399e2a4e2b.
Sep 12 22:53:50.112328 containerd[1640]: time="2025-09-12T22:53:50.112293919Z" level=info msg="StartContainer for \"672d76dac7b0cf8a8664a01404dfbe233b649a9d29f62af508dfd2399e2a4e2b\" returns successfully"
Sep 12 22:53:51.750347 kubelet[2938]: E0912 22:53:51.750272 2938 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7nf98" podUID="25ac81e8-ddd8-420b-9cd2-f9ab5f554e5f"
Sep 12 22:53:52.111542 systemd[1]: cri-containerd-672d76dac7b0cf8a8664a01404dfbe233b649a9d29f62af508dfd2399e2a4e2b.scope: Deactivated successfully.
Sep 12 22:53:52.111787 systemd[1]: cri-containerd-672d76dac7b0cf8a8664a01404dfbe233b649a9d29f62af508dfd2399e2a4e2b.scope: Consumed 312ms CPU time, 156.7M memory peak, 12K read from disk, 171.3M written to disk.
Sep 12 22:53:52.115721 containerd[1640]: time="2025-09-12T22:53:52.114922802Z" level=info msg="received exit event container_id:\"672d76dac7b0cf8a8664a01404dfbe233b649a9d29f62af508dfd2399e2a4e2b\" id:\"672d76dac7b0cf8a8664a01404dfbe233b649a9d29f62af508dfd2399e2a4e2b\" pid:3714 exited_at:{seconds:1757717632 nanos:114736600}"
Sep 12 22:53:52.117599 containerd[1640]: time="2025-09-12T22:53:52.117211504Z" level=info msg="TaskExit event in podsandbox handler container_id:\"672d76dac7b0cf8a8664a01404dfbe233b649a9d29f62af508dfd2399e2a4e2b\" id:\"672d76dac7b0cf8a8664a01404dfbe233b649a9d29f62af508dfd2399e2a4e2b\" pid:3714 exited_at:{seconds:1757717632 nanos:114736600}"
Sep 12 22:53:52.143984 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-672d76dac7b0cf8a8664a01404dfbe233b649a9d29f62af508dfd2399e2a4e2b-rootfs.mount: Deactivated successfully.
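The containerd `TaskExit` events above carry the exit time as an `exited_at:{seconds:... nanos:...}` pair on the UNIX epoch. Converting that pair back to a wall-clock timestamp lets you line exit events up against the journal timestamps; a small sketch:

```python
from datetime import datetime, timezone

def exited_at_to_iso(seconds, nanos):
    """Convert a containerd exited_at {seconds, nanos} pair
    (UNIX epoch, UTC) to an ISO-8601 timestamp with ns precision."""
    dt = datetime.fromtimestamp(seconds, tz=timezone.utc)
    return dt.strftime("%Y-%m-%dT%H:%M:%S") + f".{nanos:09d}Z"

# exited_at from the install-cni TaskExit event above
print(exited_at_to_iso(1757717632, 114736600))
# 2025-09-12T22:53:52.114736600Z
```

The result matches the `Sep 12 22:53:52` journal lines around it, confirming the node's journal is on UTC.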
Sep 12 22:53:52.239294 kubelet[2938]: I0912 22:53:52.239142 2938 kubelet_node_status.go:488] "Fast updating node status as it just became ready"
Sep 12 22:53:52.457611 systemd[1]: Created slice kubepods-besteffort-pod96c365fa_2018_4f8d_9c8f_e2c34613731b.slice - libcontainer container kubepods-besteffort-pod96c365fa_2018_4f8d_9c8f_e2c34613731b.slice.
Sep 12 22:53:52.461303 systemd[1]: Created slice kubepods-besteffort-pod035f49fb_08d5_4057_94ce_b42b7f9e85b8.slice - libcontainer container kubepods-besteffort-pod035f49fb_08d5_4057_94ce_b42b7f9e85b8.slice.
Sep 12 22:53:52.466541 systemd[1]: Created slice kubepods-burstable-pod2f3a9207_5186_4087_a436_20d5942dd273.slice - libcontainer container kubepods-burstable-pod2f3a9207_5186_4087_a436_20d5942dd273.slice.
Sep 12 22:53:52.506980 systemd[1]: Created slice kubepods-besteffort-pod8e984712_7649_4b52_afb2_10ca87cddfeb.slice - libcontainer container kubepods-besteffort-pod8e984712_7649_4b52_afb2_10ca87cddfeb.slice.
Sep 12 22:53:52.511152 systemd[1]: Created slice kubepods-besteffort-podc5490633_0f26_4c96_a32c_8a889e99267c.slice - libcontainer container kubepods-besteffort-podc5490633_0f26_4c96_a32c_8a889e99267c.slice.
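The `Created slice` lines show kubelet's systemd cgroup driver creating one slice per pod: the unit name encodes the pod's QoS class (`besteffort`, `burstable`) and its UID, with the UID's dashes replaced by underscores so they are not mistaken for systemd's path separator. A sketch of that naming rule (covering the two QoS classes visible in this log; Guaranteed pods sit directly under kubepods.slice and follow a slightly different pattern):

```python
def pod_slice_name(qos_class, pod_uid):
    """Derive the systemd slice unit kubelet's systemd cgroup driver
    creates for a besteffort/burstable pod: dashes in the pod UID
    become underscores inside the unit name."""
    return f"kubepods-{qos_class}-pod{pod_uid.replace('-', '_')}.slice"

print(pod_slice_name("besteffort", "96c365fa-2018-4f8d-9c8f-e2c34613731b"))
# kubepods-besteffort-pod96c365fa_2018_4f8d_9c8f_e2c34613731b.slice
```

This is why the pod UIDs in the kubelet volume messages below can be matched mechanically to the slice units created here.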
Sep 12 22:53:52.548715 kubelet[2938]: I0912 22:53:52.522552 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqh22\" (UniqueName: \"kubernetes.io/projected/8e984712-7649-4b52-afb2-10ca87cddfeb-kube-api-access-rqh22\") pod \"calico-apiserver-57c58cd6b4-bbb9r\" (UID: \"8e984712-7649-4b52-afb2-10ca87cddfeb\") " pod="calico-apiserver/calico-apiserver-57c58cd6b4-bbb9r"
Sep 12 22:53:52.548715 kubelet[2938]: I0912 22:53:52.522582 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f3a9207-5186-4087-a436-20d5942dd273-config-volume\") pod \"coredns-7c65d6cfc9-8nsqz\" (UID: \"2f3a9207-5186-4087-a436-20d5942dd273\") " pod="kube-system/coredns-7c65d6cfc9-8nsqz"
Sep 12 22:53:52.548715 kubelet[2938]: I0912 22:53:52.522599 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bbbx\" (UniqueName: \"kubernetes.io/projected/1a47fb8c-98ea-48aa-b27d-2fd1d5e34b1c-kube-api-access-2bbbx\") pod \"goldmane-7988f88666-vzktr\" (UID: \"1a47fb8c-98ea-48aa-b27d-2fd1d5e34b1c\") " pod="calico-system/goldmane-7988f88666-vzktr"
Sep 12 22:53:52.548715 kubelet[2938]: I0912 22:53:52.522609 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad3f9ce4-1c91-4cc6-92be-f6e9a6ca4e2d-config-volume\") pod \"coredns-7c65d6cfc9-2hz5l\" (UID: \"ad3f9ce4-1c91-4cc6-92be-f6e9a6ca4e2d\") " pod="kube-system/coredns-7c65d6cfc9-2hz5l"
Sep 12 22:53:52.548715 kubelet[2938]: I0912 22:53:52.522619 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/c5490633-0f26-4c96-a32c-8a889e99267c-calico-apiserver-certs\") pod \"calico-apiserver-57c58cd6b4-k9wjk\" (UID: \"c5490633-0f26-4c96-a32c-8a889e99267c\") " pod="calico-apiserver/calico-apiserver-57c58cd6b4-k9wjk"
Sep 12 22:53:52.514050 systemd[1]: Created slice kubepods-besteffort-pod1a47fb8c_98ea_48aa_b27d_2fd1d5e34b1c.slice - libcontainer container kubepods-besteffort-pod1a47fb8c_98ea_48aa_b27d_2fd1d5e34b1c.slice.
Sep 12 22:53:52.582131 kubelet[2938]: I0912 22:53:52.522644 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/035f49fb-08d5-4057-94ce-b42b7f9e85b8-whisker-ca-bundle\") pod \"whisker-7b79568488-kdw5r\" (UID: \"035f49fb-08d5-4057-94ce-b42b7f9e85b8\") " pod="calico-system/whisker-7b79568488-kdw5r"
Sep 12 22:53:52.582131 kubelet[2938]: I0912 22:53:52.522675 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnmf7\" (UniqueName: \"kubernetes.io/projected/2f3a9207-5186-4087-a436-20d5942dd273-kube-api-access-qnmf7\") pod \"coredns-7c65d6cfc9-8nsqz\" (UID: \"2f3a9207-5186-4087-a436-20d5942dd273\") " pod="kube-system/coredns-7c65d6cfc9-8nsqz"
Sep 12 22:53:52.582131 kubelet[2938]: I0912 22:53:52.522693 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/1a47fb8c-98ea-48aa-b27d-2fd1d5e34b1c-goldmane-key-pair\") pod \"goldmane-7988f88666-vzktr\" (UID: \"1a47fb8c-98ea-48aa-b27d-2fd1d5e34b1c\") " pod="calico-system/goldmane-7988f88666-vzktr"
Sep 12 22:53:52.582131 kubelet[2938]: I0912 22:53:52.522707 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96c365fa-2018-4f8d-9c8f-e2c34613731b-tigera-ca-bundle\") pod \"calico-kube-controllers-c4b5b494-fw8d5\" (UID: \"96c365fa-2018-4f8d-9c8f-e2c34613731b\") " pod="calico-system/calico-kube-controllers-c4b5b494-fw8d5"
Sep 12 22:53:52.582131 kubelet[2938]: I0912 22:53:52.522716 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvmsj\" (UniqueName: \"kubernetes.io/projected/96c365fa-2018-4f8d-9c8f-e2c34613731b-kube-api-access-mvmsj\") pod \"calico-kube-controllers-c4b5b494-fw8d5\" (UID: \"96c365fa-2018-4f8d-9c8f-e2c34613731b\") " pod="calico-system/calico-kube-controllers-c4b5b494-fw8d5"
Sep 12 22:53:52.517499 systemd[1]: Created slice kubepods-burstable-podad3f9ce4_1c91_4cc6_92be_f6e9a6ca4e2d.slice - libcontainer container kubepods-burstable-podad3f9ce4_1c91_4cc6_92be_f6e9a6ca4e2d.slice.
Sep 12 22:53:52.594937 kubelet[2938]: I0912 22:53:52.522736 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqknn\" (UniqueName: \"kubernetes.io/projected/ad3f9ce4-1c91-4cc6-92be-f6e9a6ca4e2d-kube-api-access-pqknn\") pod \"coredns-7c65d6cfc9-2hz5l\" (UID: \"ad3f9ce4-1c91-4cc6-92be-f6e9a6ca4e2d\") " pod="kube-system/coredns-7c65d6cfc9-2hz5l"
Sep 12 22:53:52.594937 kubelet[2938]: I0912 22:53:52.522750 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhgtt\" (UniqueName: \"kubernetes.io/projected/c5490633-0f26-4c96-a32c-8a889e99267c-kube-api-access-nhgtt\") pod \"calico-apiserver-57c58cd6b4-k9wjk\" (UID: \"c5490633-0f26-4c96-a32c-8a889e99267c\") " pod="calico-apiserver/calico-apiserver-57c58cd6b4-k9wjk"
Sep 12 22:53:52.594937 kubelet[2938]: I0912 22:53:52.522760 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a47fb8c-98ea-48aa-b27d-2fd1d5e34b1c-config\") pod \"goldmane-7988f88666-vzktr\" (UID: \"1a47fb8c-98ea-48aa-b27d-2fd1d5e34b1c\") " pod="calico-system/goldmane-7988f88666-vzktr"
Sep 12 22:53:52.594937 kubelet[2938]: I0912 22:53:52.522769 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a47fb8c-98ea-48aa-b27d-2fd1d5e34b1c-goldmane-ca-bundle\") pod \"goldmane-7988f88666-vzktr\" (UID: \"1a47fb8c-98ea-48aa-b27d-2fd1d5e34b1c\") " pod="calico-system/goldmane-7988f88666-vzktr"
Sep 12 22:53:52.594937 kubelet[2938]: I0912 22:53:52.522779 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/8e984712-7649-4b52-afb2-10ca87cddfeb-calico-apiserver-certs\") pod \"calico-apiserver-57c58cd6b4-bbb9r\" (UID: \"8e984712-7649-4b52-afb2-10ca87cddfeb\") " pod="calico-apiserver/calico-apiserver-57c58cd6b4-bbb9r"
Sep 12 22:53:52.595060 kubelet[2938]: I0912 22:53:52.522788 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx7b9\" (UniqueName: \"kubernetes.io/projected/035f49fb-08d5-4057-94ce-b42b7f9e85b8-kube-api-access-qx7b9\") pod \"whisker-7b79568488-kdw5r\" (UID: \"035f49fb-08d5-4057-94ce-b42b7f9e85b8\") " pod="calico-system/whisker-7b79568488-kdw5r"
Sep 12 22:53:52.595060 kubelet[2938]: I0912 22:53:52.522798 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/035f49fb-08d5-4057-94ce-b42b7f9e85b8-whisker-backend-key-pair\") pod \"whisker-7b79568488-kdw5r\" (UID: \"035f49fb-08d5-4057-94ce-b42b7f9e85b8\") " pod="calico-system/whisker-7b79568488-kdw5r"
Sep 12 22:53:52.765915 containerd[1640]: time="2025-09-12T22:53:52.765839316Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-c4b5b494-fw8d5,Uid:96c365fa-2018-4f8d-9c8f-e2c34613731b,Namespace:calico-system,Attempt:0,}"
Sep 12 22:53:52.766412 containerd[1640]: time="2025-09-12T22:53:52.766388580Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7b79568488-kdw5r,Uid:035f49fb-08d5-4057-94ce-b42b7f9e85b8,Namespace:calico-system,Attempt:0,}"
Sep 12 22:53:52.805536 containerd[1640]: time="2025-09-12T22:53:52.805501401Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-8nsqz,Uid:2f3a9207-5186-4087-a436-20d5942dd273,Namespace:kube-system,Attempt:0,}"
Sep 12 22:53:52.812382 containerd[1640]: time="2025-09-12T22:53:52.809331096Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57c58cd6b4-bbb9r,Uid:8e984712-7649-4b52-afb2-10ca87cddfeb,Namespace:calico-apiserver,Attempt:0,}"
Sep 12 22:53:52.818618 containerd[1640]: time="2025-09-12T22:53:52.818498844Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-vzktr,Uid:1a47fb8c-98ea-48aa-b27d-2fd1d5e34b1c,Namespace:calico-system,Attempt:0,}"
Sep 12 22:53:52.825540 containerd[1640]: time="2025-09-12T22:53:52.825483080Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-2hz5l,Uid:ad3f9ce4-1c91-4cc6-92be-f6e9a6ca4e2d,Namespace:kube-system,Attempt:0,}"
Sep 12 22:53:52.826480 containerd[1640]: time="2025-09-12T22:53:52.825958903Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57c58cd6b4-k9wjk,Uid:c5490633-0f26-4c96-a32c-8a889e99267c,Namespace:calico-apiserver,Attempt:0,}"
Sep 12 22:53:53.021813 containerd[1640]: time="2025-09-12T22:53:53.021688045Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\""
Sep 12 22:53:53.270610 containerd[1640]: time="2025-09-12T22:53:53.270485141Z" level=error msg="Failed to destroy network for sandbox \"6c510d40995dc493912eb79a0e7f0a0b9c77d459256cbee941bd2608998841a7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 22:53:53.272964 systemd[1]: run-netns-cni\x2d4e31b8c8\x2da741\x2db8e3\x2d360b\x2da5826e9674d3.mount: Deactivated successfully.
Sep 12 22:53:53.282068 containerd[1640]: time="2025-09-12T22:53:53.281894027Z" level=error msg="Failed to destroy network for sandbox \"189ead02455e89ec7eeccce253a6de310dc0eaf9c23150845b0e222f221702b2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 22:53:53.283994 systemd[1]: run-netns-cni\x2dd5292c0b\x2de8b5\x2d9e21\x2dbf75\x2d196871f38635.mount: Deactivated successfully.
Sep 12 22:53:53.296445 containerd[1640]: time="2025-09-12T22:53:53.296219614Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57c58cd6b4-k9wjk,Uid:c5490633-0f26-4c96-a32c-8a889e99267c,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c510d40995dc493912eb79a0e7f0a0b9c77d459256cbee941bd2608998841a7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 22:53:53.297383 containerd[1640]: time="2025-09-12T22:53:53.297363307Z" level=error msg="Failed to destroy network for sandbox \"3d82983fb3f0691d4daf32c33d5ec8420f423ae5ad048f47aa3b21bd5a33afc4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 22:53:53.299518 systemd[1]: run-netns-cni\x2dfb6341d1\x2d753f\x2d0785\x2dc6be\x2ddb68339a7571.mount: Deactivated successfully.
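The `run-netns-cni\x2d...mount` unit names above are systemd-escaped paths: `-` separates path components and a literal dash in a component is written `\x2d`. Decoding the unit name recovers the network-namespace mount point under `/run/netns`. A small sketch covering the escapes seen in this log (the full systemd escaping scheme has further rules, e.g. for leading dots, that this does not handle):

```python
def unescape_unit_path(unit):
    """Reverse systemd path escaping for a .mount unit name:
    unescaped '-' separates path components, '\\x2d' is a literal dash."""
    name = unit.rsplit(".mount", 1)[0]
    parts = name.split("-")  # escaped dashes contain no bare '-'
    # decode \xNN escapes (e.g. \x2d -> '-') in each component
    decoded = [bytes(p, "ascii").decode("unicode_escape") for p in parts]
    return "/" + "/".join(decoded)

unit = r"run-netns-cni\x2d4e31b8c8\x2da741\x2db8e3\x2d360b\x2da5826e9674d3.mount"
print(unescape_unit_path(unit))
# /run/netns/cni-4e31b8c8-a741-b8e3-360b-a5826e9674d3
```

These mounts being torn down right after each "Failed to destroy network" error is the sandbox cleanup path: the netns is removed even though the Calico delete call itself failed.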
Sep 12 22:53:53.301179 containerd[1640]: time="2025-09-12T22:53:53.300453592Z" level=error msg="Failed to destroy network for sandbox \"c70a52c92cabbf08c380bda41eb12b64d67391898664dede897fb7f2fd3081e4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 22:53:53.302407 systemd[1]: run-netns-cni\x2dd38a7c0d\x2d8d30\x2d3bd6\x2db82e\x2d7c58dcdb1be0.mount: Deactivated successfully.
Sep 12 22:53:53.303994 containerd[1640]: time="2025-09-12T22:53:53.303972921Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7b79568488-kdw5r,Uid:035f49fb-08d5-4057-94ce-b42b7f9e85b8,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"189ead02455e89ec7eeccce253a6de310dc0eaf9c23150845b0e222f221702b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 22:53:53.310892 kubelet[2938]: E0912 22:53:53.304469 2938 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c510d40995dc493912eb79a0e7f0a0b9c77d459256cbee941bd2608998841a7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 22:53:53.310892 kubelet[2938]: E0912 22:53:53.304535 2938 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c510d40995dc493912eb79a0e7f0a0b9c77d459256cbee941bd2608998841a7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57c58cd6b4-k9wjk"
Sep 12 22:53:53.310892 kubelet[2938]: E0912 22:53:53.304549 2938 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c510d40995dc493912eb79a0e7f0a0b9c77d459256cbee941bd2608998841a7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57c58cd6b4-k9wjk"
Sep 12 22:53:53.311271 kubelet[2938]: E0912 22:53:53.304589 2938 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-57c58cd6b4-k9wjk_calico-apiserver(c5490633-0f26-4c96-a32c-8a889e99267c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-57c58cd6b4-k9wjk_calico-apiserver(c5490633-0f26-4c96-a32c-8a889e99267c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6c510d40995dc493912eb79a0e7f0a0b9c77d459256cbee941bd2608998841a7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-57c58cd6b4-k9wjk" podUID="c5490633-0f26-4c96-a32c-8a889e99267c"
Sep 12 22:53:53.332230 containerd[1640]: time="2025-09-12T22:53:53.332190218Z" level=error msg="Failed to destroy network for sandbox \"f6839e0b926582474b44e3df9e7d6f7b4b529bf5ba36b2e9597bd2250349a877\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 22:53:53.332518 containerd[1640]: time="2025-09-12T22:53:53.332432699Z" level=error msg="Failed to destroy network for sandbox \"af2f1716383b06ec8eea6fba98d0dbbbe9742e4fd5353fea50d0ea30545706f7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 22:53:53.332583 containerd[1640]: time="2025-09-12T22:53:53.332561961Z" level=error msg="Failed to destroy network for sandbox \"13ede584a9ef3c08718d8219cdfdee1c8877993382ea6e50882d3bbb726aa9ff\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 22:53:53.336895 containerd[1640]: time="2025-09-12T22:53:53.334529180Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57c58cd6b4-bbb9r,Uid:8e984712-7649-4b52-afb2-10ca87cddfeb,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3d82983fb3f0691d4daf32c33d5ec8420f423ae5ad048f47aa3b21bd5a33afc4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 22:53:53.336961 kubelet[2938]: E0912 22:53:53.332756 2938 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"189ead02455e89ec7eeccce253a6de310dc0eaf9c23150845b0e222f221702b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 22:53:53.336961 kubelet[2938]: E0912 22:53:53.332797 2938 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"189ead02455e89ec7eeccce253a6de310dc0eaf9c23150845b0e222f221702b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the
calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7b79568488-kdw5r" Sep 12 22:53:53.336961 kubelet[2938]: E0912 22:53:53.332812 2938 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"189ead02455e89ec7eeccce253a6de310dc0eaf9c23150845b0e222f221702b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7b79568488-kdw5r" Sep 12 22:53:53.337071 kubelet[2938]: E0912 22:53:53.332849 2938 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-7b79568488-kdw5r_calico-system(035f49fb-08d5-4057-94ce-b42b7f9e85b8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-7b79568488-kdw5r_calico-system(035f49fb-08d5-4057-94ce-b42b7f9e85b8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"189ead02455e89ec7eeccce253a6de310dc0eaf9c23150845b0e222f221702b2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-7b79568488-kdw5r" podUID="035f49fb-08d5-4057-94ce-b42b7f9e85b8" Sep 12 22:53:53.337071 kubelet[2938]: E0912 22:53:53.334679 2938 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3d82983fb3f0691d4daf32c33d5ec8420f423ae5ad048f47aa3b21bd5a33afc4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:53:53.337071 kubelet[2938]: E0912 22:53:53.334704 2938 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = 
Unknown desc = failed to setup network for sandbox \"3d82983fb3f0691d4daf32c33d5ec8420f423ae5ad048f47aa3b21bd5a33afc4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57c58cd6b4-bbb9r" Sep 12 22:53:53.343244 containerd[1640]: time="2025-09-12T22:53:53.337052738Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-c4b5b494-fw8d5,Uid:96c365fa-2018-4f8d-9c8f-e2c34613731b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c70a52c92cabbf08c380bda41eb12b64d67391898664dede897fb7f2fd3081e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:53:53.343244 containerd[1640]: time="2025-09-12T22:53:53.341961095Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-2hz5l,Uid:ad3f9ce4-1c91-4cc6-92be-f6e9a6ca4e2d,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f6839e0b926582474b44e3df9e7d6f7b4b529bf5ba36b2e9597bd2250349a877\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:53:53.343312 kubelet[2938]: E0912 22:53:53.334717 2938 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3d82983fb3f0691d4daf32c33d5ec8420f423ae5ad048f47aa3b21bd5a33afc4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-57c58cd6b4-bbb9r" Sep 12 22:53:53.343312 kubelet[2938]: E0912 22:53:53.334743 2938 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-57c58cd6b4-bbb9r_calico-apiserver(8e984712-7649-4b52-afb2-10ca87cddfeb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-57c58cd6b4-bbb9r_calico-apiserver(8e984712-7649-4b52-afb2-10ca87cddfeb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3d82983fb3f0691d4daf32c33d5ec8420f423ae5ad048f47aa3b21bd5a33afc4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-57c58cd6b4-bbb9r" podUID="8e984712-7649-4b52-afb2-10ca87cddfeb" Sep 12 22:53:53.343312 kubelet[2938]: E0912 22:53:53.337228 2938 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c70a52c92cabbf08c380bda41eb12b64d67391898664dede897fb7f2fd3081e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:53:53.343393 kubelet[2938]: E0912 22:53:53.337247 2938 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c70a52c92cabbf08c380bda41eb12b64d67391898664dede897fb7f2fd3081e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-c4b5b494-fw8d5" Sep 12 22:53:53.343393 kubelet[2938]: E0912 22:53:53.337258 2938 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = 
Unknown desc = failed to setup network for sandbox \"c70a52c92cabbf08c380bda41eb12b64d67391898664dede897fb7f2fd3081e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-c4b5b494-fw8d5" Sep 12 22:53:53.343393 kubelet[2938]: E0912 22:53:53.337281 2938 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-c4b5b494-fw8d5_calico-system(96c365fa-2018-4f8d-9c8f-e2c34613731b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-c4b5b494-fw8d5_calico-system(96c365fa-2018-4f8d-9c8f-e2c34613731b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c70a52c92cabbf08c380bda41eb12b64d67391898664dede897fb7f2fd3081e4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-c4b5b494-fw8d5" podUID="96c365fa-2018-4f8d-9c8f-e2c34613731b" Sep 12 22:53:53.343461 kubelet[2938]: E0912 22:53:53.342089 2938 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f6839e0b926582474b44e3df9e7d6f7b4b529bf5ba36b2e9597bd2250349a877\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:53:53.343461 kubelet[2938]: E0912 22:53:53.342131 2938 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f6839e0b926582474b44e3df9e7d6f7b4b529bf5ba36b2e9597bd2250349a877\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-2hz5l" Sep 12 22:53:53.343461 kubelet[2938]: E0912 22:53:53.342141 2938 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f6839e0b926582474b44e3df9e7d6f7b4b529bf5ba36b2e9597bd2250349a877\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-2hz5l" Sep 12 22:53:53.343514 kubelet[2938]: E0912 22:53:53.342162 2938 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-2hz5l_kube-system(ad3f9ce4-1c91-4cc6-92be-f6e9a6ca4e2d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-2hz5l_kube-system(ad3f9ce4-1c91-4cc6-92be-f6e9a6ca4e2d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f6839e0b926582474b44e3df9e7d6f7b4b529bf5ba36b2e9597bd2250349a877\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-2hz5l" podUID="ad3f9ce4-1c91-4cc6-92be-f6e9a6ca4e2d" Sep 12 22:53:53.344905 containerd[1640]: time="2025-09-12T22:53:53.344867900Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-8nsqz,Uid:2f3a9207-5186-4087-a436-20d5942dd273,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"af2f1716383b06ec8eea6fba98d0dbbbe9742e4fd5353fea50d0ea30545706f7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Sep 12 22:53:53.345046 kubelet[2938]: E0912 22:53:53.344989 2938 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"af2f1716383b06ec8eea6fba98d0dbbbe9742e4fd5353fea50d0ea30545706f7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:53:53.345046 kubelet[2938]: E0912 22:53:53.345007 2938 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"af2f1716383b06ec8eea6fba98d0dbbbe9742e4fd5353fea50d0ea30545706f7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-8nsqz" Sep 12 22:53:53.345046 kubelet[2938]: E0912 22:53:53.345016 2938 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"af2f1716383b06ec8eea6fba98d0dbbbe9742e4fd5353fea50d0ea30545706f7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-8nsqz" Sep 12 22:53:53.345154 kubelet[2938]: E0912 22:53:53.345133 2938 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-8nsqz_kube-system(2f3a9207-5186-4087-a436-20d5942dd273)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-8nsqz_kube-system(2f3a9207-5186-4087-a436-20d5942dd273)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"af2f1716383b06ec8eea6fba98d0dbbbe9742e4fd5353fea50d0ea30545706f7\\\": plugin type=\\\"calico\\\" 
failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-8nsqz" podUID="2f3a9207-5186-4087-a436-20d5942dd273" Sep 12 22:53:53.345780 containerd[1640]: time="2025-09-12T22:53:53.345753466Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-vzktr,Uid:1a47fb8c-98ea-48aa-b27d-2fd1d5e34b1c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"13ede584a9ef3c08718d8219cdfdee1c8877993382ea6e50882d3bbb726aa9ff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:53:53.345957 kubelet[2938]: E0912 22:53:53.345938 2938 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"13ede584a9ef3c08718d8219cdfdee1c8877993382ea6e50882d3bbb726aa9ff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:53:53.346066 kubelet[2938]: E0912 22:53:53.345962 2938 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"13ede584a9ef3c08718d8219cdfdee1c8877993382ea6e50882d3bbb726aa9ff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-vzktr" Sep 12 22:53:53.346066 kubelet[2938]: E0912 22:53:53.345972 2938 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"13ede584a9ef3c08718d8219cdfdee1c8877993382ea6e50882d3bbb726aa9ff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-vzktr" Sep 12 22:53:53.346066 kubelet[2938]: E0912 22:53:53.345989 2938 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7988f88666-vzktr_calico-system(1a47fb8c-98ea-48aa-b27d-2fd1d5e34b1c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7988f88666-vzktr_calico-system(1a47fb8c-98ea-48aa-b27d-2fd1d5e34b1c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"13ede584a9ef3c08718d8219cdfdee1c8877993382ea6e50882d3bbb726aa9ff\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7988f88666-vzktr" podUID="1a47fb8c-98ea-48aa-b27d-2fd1d5e34b1c" Sep 12 22:53:53.651833 systemd[1]: Created slice kubepods-besteffort-pod25ac81e8_ddd8_420b_9cd2_f9ab5f554e5f.slice - libcontainer container kubepods-besteffort-pod25ac81e8_ddd8_420b_9cd2_f9ab5f554e5f.slice. 
Sep 12 22:53:53.653476 containerd[1640]: time="2025-09-12T22:53:53.653455801Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7nf98,Uid:25ac81e8-ddd8-420b-9cd2-f9ab5f554e5f,Namespace:calico-system,Attempt:0,}" Sep 12 22:53:53.691671 containerd[1640]: time="2025-09-12T22:53:53.691625485Z" level=error msg="Failed to destroy network for sandbox \"25f4d542057c598e1f66eb4c288ea13dbf0bdd0316e48119ed188b6506d7f8f5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:53:53.692149 containerd[1640]: time="2025-09-12T22:53:53.692112376Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7nf98,Uid:25ac81e8-ddd8-420b-9cd2-f9ab5f554e5f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"25f4d542057c598e1f66eb4c288ea13dbf0bdd0316e48119ed188b6506d7f8f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:53:53.692315 kubelet[2938]: E0912 22:53:53.692292 2938 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"25f4d542057c598e1f66eb4c288ea13dbf0bdd0316e48119ed188b6506d7f8f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:53:53.692350 kubelet[2938]: E0912 22:53:53.692327 2938 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"25f4d542057c598e1f66eb4c288ea13dbf0bdd0316e48119ed188b6506d7f8f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7nf98" Sep 12 22:53:53.692350 kubelet[2938]: E0912 22:53:53.692346 2938 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"25f4d542057c598e1f66eb4c288ea13dbf0bdd0316e48119ed188b6506d7f8f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7nf98" Sep 12 22:53:53.692400 kubelet[2938]: E0912 22:53:53.692375 2938 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-7nf98_calico-system(25ac81e8-ddd8-420b-9cd2-f9ab5f554e5f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-7nf98_calico-system(25ac81e8-ddd8-420b-9cd2-f9ab5f554e5f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"25f4d542057c598e1f66eb4c288ea13dbf0bdd0316e48119ed188b6506d7f8f5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-7nf98" podUID="25ac81e8-ddd8-420b-9cd2-f9ab5f554e5f" Sep 12 22:53:54.144190 systemd[1]: run-netns-cni\x2dae2bbf9d\x2dd72a\x2d2d30\x2d46a0\x2dbcf06d1dc882.mount: Deactivated successfully. Sep 12 22:53:54.144271 systemd[1]: run-netns-cni\x2d288acaba\x2dfc84\x2d9c9e\x2d8f57\x2d1b3dd6804535.mount: Deactivated successfully. Sep 12 22:53:54.144314 systemd[1]: run-netns-cni\x2d69873bc9\x2d6658\x2dc3f6\x2d75bd\x2d2e8c52973a9d.mount: Deactivated successfully. Sep 12 22:53:58.735336 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1055253367.mount: Deactivated successfully. 
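Every sandbox failure above reports the same precondition: the Calico CNI plugin stats `/var/lib/calico/nodename`, a file that calico/node writes only once it is running with that host path mounted. A minimal sketch of that check (the `calico-system` namespace and `k8s-app=calico-node` label are assumptions based on the error text, not taken from this log):

```shell
# Hypothetical troubleshooting sketch for the repeated CNI error above.
# The plugin fails because /var/lib/calico/nodename does not exist yet;
# calico/node creates it after it starts with /var/lib/calico mounted.
check_calico_nodename() {
    nodename_file="${1:-/var/lib/calico/nodename}"
    if [ -f "$nodename_file" ]; then
        echo "nodename present: $(cat "$nodename_file")"
    else
        echo "nodename missing: $nodename_file"
        # Assumed namespace/label; adjust for your install method.
        echo "is calico-node running? try: kubectl -n calico-system get pods -l k8s-app=calico-node"
    fi
}

check_calico_nodename "$@"
```

In this log the condition resolves itself once the calico-node image finishes pulling and the container starts (see the `PullImage`/`StartContainer` entries that follow), so the errors here are transient startup ordering rather than a persistent misconfiguration.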
Sep 12 22:53:58.821684 containerd[1640]: time="2025-09-12T22:53:58.813493320Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:53:58.824213 containerd[1640]: time="2025-09-12T22:53:58.824198237Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:53:58.826926 containerd[1640]: time="2025-09-12T22:53:58.826366909Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 12 22:53:58.827980 containerd[1640]: time="2025-09-12T22:53:58.827962249Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:53:58.829548 containerd[1640]: time="2025-09-12T22:53:58.829528806Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 5.8061053s" Sep 12 22:53:58.829586 containerd[1640]: time="2025-09-12T22:53:58.829548944Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 12 22:53:58.842102 containerd[1640]: time="2025-09-12T22:53:58.841930494Z" level=info msg="CreateContainer within sandbox \"47a2f1c559ba16647a87cf941a233aaee637761090312ea839659d28b8a3124a\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 12 22:53:58.875057 containerd[1640]: time="2025-09-12T22:53:58.873913128Z" level=info msg="Container 
dfc288d9eeb26cb6405bbc5084dca45b6ba259b6a99c604854e28134cf15c289: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:53:58.874996 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2635375031.mount: Deactivated successfully. Sep 12 22:53:58.904709 containerd[1640]: time="2025-09-12T22:53:58.904629126Z" level=info msg="CreateContainer within sandbox \"47a2f1c559ba16647a87cf941a233aaee637761090312ea839659d28b8a3124a\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"dfc288d9eeb26cb6405bbc5084dca45b6ba259b6a99c604854e28134cf15c289\"" Sep 12 22:53:58.905025 containerd[1640]: time="2025-09-12T22:53:58.905007438Z" level=info msg="StartContainer for \"dfc288d9eeb26cb6405bbc5084dca45b6ba259b6a99c604854e28134cf15c289\"" Sep 12 22:53:58.909020 containerd[1640]: time="2025-09-12T22:53:58.908994858Z" level=info msg="connecting to shim dfc288d9eeb26cb6405bbc5084dca45b6ba259b6a99c604854e28134cf15c289" address="unix:///run/containerd/s/ff522ef6110611581dc1b0b56e2ba8edad2f1c7b4158e8b6d414d88a19d46d39" protocol=ttrpc version=3 Sep 12 22:53:58.996164 systemd[1]: Started cri-containerd-dfc288d9eeb26cb6405bbc5084dca45b6ba259b6a99c604854e28134cf15c289.scope - libcontainer container dfc288d9eeb26cb6405bbc5084dca45b6ba259b6a99c604854e28134cf15c289. Sep 12 22:53:59.041590 containerd[1640]: time="2025-09-12T22:53:59.041527584Z" level=info msg="StartContainer for \"dfc288d9eeb26cb6405bbc5084dca45b6ba259b6a99c604854e28134cf15c289\" returns successfully" Sep 12 22:53:59.120161 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 12 22:53:59.123341 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Sep 12 22:53:59.563683 kubelet[2938]: I0912 22:53:59.563451 2938 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/035f49fb-08d5-4057-94ce-b42b7f9e85b8-whisker-backend-key-pair\") pod \"035f49fb-08d5-4057-94ce-b42b7f9e85b8\" (UID: \"035f49fb-08d5-4057-94ce-b42b7f9e85b8\") " Sep 12 22:53:59.563683 kubelet[2938]: I0912 22:53:59.563478 2938 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/035f49fb-08d5-4057-94ce-b42b7f9e85b8-whisker-ca-bundle\") pod \"035f49fb-08d5-4057-94ce-b42b7f9e85b8\" (UID: \"035f49fb-08d5-4057-94ce-b42b7f9e85b8\") " Sep 12 22:53:59.563683 kubelet[2938]: I0912 22:53:59.563493 2938 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qx7b9\" (UniqueName: \"kubernetes.io/projected/035f49fb-08d5-4057-94ce-b42b7f9e85b8-kube-api-access-qx7b9\") pod \"035f49fb-08d5-4057-94ce-b42b7f9e85b8\" (UID: \"035f49fb-08d5-4057-94ce-b42b7f9e85b8\") " Sep 12 22:53:59.566127 kubelet[2938]: I0912 22:53:59.566096 2938 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/035f49fb-08d5-4057-94ce-b42b7f9e85b8-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "035f49fb-08d5-4057-94ce-b42b7f9e85b8" (UID: "035f49fb-08d5-4057-94ce-b42b7f9e85b8"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 12 22:53:59.566552 kubelet[2938]: I0912 22:53:59.566536 2938 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/035f49fb-08d5-4057-94ce-b42b7f9e85b8-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "035f49fb-08d5-4057-94ce-b42b7f9e85b8" (UID: "035f49fb-08d5-4057-94ce-b42b7f9e85b8"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 12 22:53:59.566706 kubelet[2938]: I0912 22:53:59.566686 2938 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/035f49fb-08d5-4057-94ce-b42b7f9e85b8-kube-api-access-qx7b9" (OuterVolumeSpecName: "kube-api-access-qx7b9") pod "035f49fb-08d5-4057-94ce-b42b7f9e85b8" (UID: "035f49fb-08d5-4057-94ce-b42b7f9e85b8"). InnerVolumeSpecName "kube-api-access-qx7b9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 12 22:53:59.652253 systemd[1]: Removed slice kubepods-besteffort-pod035f49fb_08d5_4057_94ce_b42b7f9e85b8.slice - libcontainer container kubepods-besteffort-pod035f49fb_08d5_4057_94ce_b42b7f9e85b8.slice. Sep 12 22:53:59.664346 kubelet[2938]: I0912 22:53:59.664314 2938 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qx7b9\" (UniqueName: \"kubernetes.io/projected/035f49fb-08d5-4057-94ce-b42b7f9e85b8-kube-api-access-qx7b9\") on node \"localhost\" DevicePath \"\"" Sep 12 22:53:59.664346 kubelet[2938]: I0912 22:53:59.664331 2938 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/035f49fb-08d5-4057-94ce-b42b7f9e85b8-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Sep 12 22:53:59.664346 kubelet[2938]: I0912 22:53:59.664336 2938 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/035f49fb-08d5-4057-94ce-b42b7f9e85b8-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Sep 12 22:53:59.736936 systemd[1]: var-lib-kubelet-pods-035f49fb\x2d08d5\x2d4057\x2d94ce\x2db42b7f9e85b8-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dqx7b9.mount: Deactivated successfully. Sep 12 22:53:59.736994 systemd[1]: var-lib-kubelet-pods-035f49fb\x2d08d5\x2d4057\x2d94ce\x2db42b7f9e85b8-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
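The systemd mount and netns unit names above (e.g. `run-netns-cni\x2dd5292c0b...`) use systemd's unit-name escaping: `/` separators become `-`, and literal bytes that would be ambiguous, such as `-` itself, are encoded as `\xNN`. A simplified decoding sketch, assuming only the `-`-to-`/` mapping and `\xNN` escapes seen in this log (full systemd escaping has additional rules, e.g. for leading dots):

```python
import re

def systemd_unescape(unit: str) -> str:
    """Decode a systemd unit name back into a path (simplified sketch).

    '-' separates path components, and '\\xNN' encodes a literal byte,
    so '\\x2d' is a literal '-'. Strip any '.mount' suffix before calling.
    """
    path = "/" + unit.replace("-", "/")
    return re.sub(r"\\x([0-9a-fA-F]{2})",
                  lambda m: chr(int(m.group(1), 16)), path)
```

For example, `run-netns-cni\x2dd5292c0b` decodes to `/run/netns/cni-d5292c0b`, matching the per-sandbox network namespaces being cleaned up in the entries above.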
Sep 12 22:54:00.050328 kubelet[2938]: I0912 22:54:00.049856 2938 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-ps95x" podStartSLOduration=2.640531024 podStartE2EDuration="20.0498424s" podCreationTimestamp="2025-09-12 22:53:40 +0000 UTC" firstStartedPulling="2025-09-12 22:53:41.420694655 +0000 UTC m=+20.153199228" lastFinishedPulling="2025-09-12 22:53:58.830006031 +0000 UTC m=+37.562510604" observedRunningTime="2025-09-12 22:54:00.049260735 +0000 UTC m=+38.781765322" watchObservedRunningTime="2025-09-12 22:54:00.0498424 +0000 UTC m=+38.782346990" Sep 12 22:54:00.116452 systemd[1]: Created slice kubepods-besteffort-pod987b1aa3_a530_429e_9ef0_4092dcb8be9f.slice - libcontainer container kubepods-besteffort-pod987b1aa3_a530_429e_9ef0_4092dcb8be9f.slice. Sep 12 22:54:00.188289 containerd[1640]: time="2025-09-12T22:54:00.188251189Z" level=info msg="TaskExit event in podsandbox handler container_id:\"dfc288d9eeb26cb6405bbc5084dca45b6ba259b6a99c604854e28134cf15c289\" id:\"4871d179893a95c73f7c9825fb8cf65ad4fdcc478cef1e04e3a7b90a1433335c\" pid:4055 exit_status:1 exited_at:{seconds:1757717640 nanos:184986231}" Sep 12 22:54:00.268704 kubelet[2938]: I0912 22:54:00.268678 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/987b1aa3-a530-429e-9ef0-4092dcb8be9f-whisker-backend-key-pair\") pod \"whisker-6f44c4d45f-gmvz4\" (UID: \"987b1aa3-a530-429e-9ef0-4092dcb8be9f\") " pod="calico-system/whisker-6f44c4d45f-gmvz4" Sep 12 22:54:00.268704 kubelet[2938]: I0912 22:54:00.268704 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwg76\" (UniqueName: \"kubernetes.io/projected/987b1aa3-a530-429e-9ef0-4092dcb8be9f-kube-api-access-cwg76\") pod \"whisker-6f44c4d45f-gmvz4\" (UID: \"987b1aa3-a530-429e-9ef0-4092dcb8be9f\") " 
pod="calico-system/whisker-6f44c4d45f-gmvz4" Sep 12 22:54:00.268850 kubelet[2938]: I0912 22:54:00.268724 2938 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/987b1aa3-a530-429e-9ef0-4092dcb8be9f-whisker-ca-bundle\") pod \"whisker-6f44c4d45f-gmvz4\" (UID: \"987b1aa3-a530-429e-9ef0-4092dcb8be9f\") " pod="calico-system/whisker-6f44c4d45f-gmvz4" Sep 12 22:54:00.420457 containerd[1640]: time="2025-09-12T22:54:00.420380314Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6f44c4d45f-gmvz4,Uid:987b1aa3-a530-429e-9ef0-4092dcb8be9f,Namespace:calico-system,Attempt:0,}" Sep 12 22:54:01.111012 containerd[1640]: time="2025-09-12T22:54:01.110987499Z" level=info msg="TaskExit event in podsandbox handler container_id:\"dfc288d9eeb26cb6405bbc5084dca45b6ba259b6a99c604854e28134cf15c289\" id:\"84106361788e323f27dacd529a1595fff9c868842f2a388e1bc18420edafe3d4\" pid:4184 exit_status:1 exited_at:{seconds:1757717641 nanos:110688871}" Sep 12 22:54:01.394333 systemd-networkd[1321]: cali49034ea7add: Link UP Sep 12 22:54:01.394447 systemd-networkd[1321]: cali49034ea7add: Gained carrier Sep 12 22:54:01.406219 containerd[1640]: 2025-09-12 22:54:00.452 [INFO][4070] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 22:54:01.406219 containerd[1640]: 2025-09-12 22:54:00.987 [INFO][4070] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--6f44c4d45f--gmvz4-eth0 whisker-6f44c4d45f- calico-system 987b1aa3-a530-429e-9ef0-4092dcb8be9f 874 0 2025-09-12 22:54:00 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:6f44c4d45f projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-6f44c4d45f-gmvz4 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] 
cali49034ea7add [] [] }} ContainerID="2063b01dc1d005795ec6bed8fb49f5e10e2a20a56a195446caf18d73db95d800" Namespace="calico-system" Pod="whisker-6f44c4d45f-gmvz4" WorkloadEndpoint="localhost-k8s-whisker--6f44c4d45f--gmvz4-" Sep 12 22:54:01.406219 containerd[1640]: 2025-09-12 22:54:00.987 [INFO][4070] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2063b01dc1d005795ec6bed8fb49f5e10e2a20a56a195446caf18d73db95d800" Namespace="calico-system" Pod="whisker-6f44c4d45f-gmvz4" WorkloadEndpoint="localhost-k8s-whisker--6f44c4d45f--gmvz4-eth0" Sep 12 22:54:01.406219 containerd[1640]: 2025-09-12 22:54:01.340 [INFO][4172] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2063b01dc1d005795ec6bed8fb49f5e10e2a20a56a195446caf18d73db95d800" HandleID="k8s-pod-network.2063b01dc1d005795ec6bed8fb49f5e10e2a20a56a195446caf18d73db95d800" Workload="localhost-k8s-whisker--6f44c4d45f--gmvz4-eth0" Sep 12 22:54:01.406587 containerd[1640]: 2025-09-12 22:54:01.343 [INFO][4172] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2063b01dc1d005795ec6bed8fb49f5e10e2a20a56a195446caf18d73db95d800" HandleID="k8s-pod-network.2063b01dc1d005795ec6bed8fb49f5e10e2a20a56a195446caf18d73db95d800" Workload="localhost-k8s-whisker--6f44c4d45f--gmvz4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003a01a0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-6f44c4d45f-gmvz4", "timestamp":"2025-09-12 22:54:01.340584267 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:54:01.406587 containerd[1640]: 2025-09-12 22:54:01.343 [INFO][4172] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 12 22:54:01.406587 containerd[1640]: 2025-09-12 22:54:01.343 [INFO][4172] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 22:54:01.406587 containerd[1640]: 2025-09-12 22:54:01.344 [INFO][4172] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 22:54:01.406587 containerd[1640]: 2025-09-12 22:54:01.364 [INFO][4172] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2063b01dc1d005795ec6bed8fb49f5e10e2a20a56a195446caf18d73db95d800" host="localhost" Sep 12 22:54:01.406587 containerd[1640]: 2025-09-12 22:54:01.371 [INFO][4172] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 22:54:01.406587 containerd[1640]: 2025-09-12 22:54:01.373 [INFO][4172] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 22:54:01.406587 containerd[1640]: 2025-09-12 22:54:01.374 [INFO][4172] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 22:54:01.406587 containerd[1640]: 2025-09-12 22:54:01.376 [INFO][4172] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 22:54:01.406587 containerd[1640]: 2025-09-12 22:54:01.376 [INFO][4172] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.2063b01dc1d005795ec6bed8fb49f5e10e2a20a56a195446caf18d73db95d800" host="localhost" Sep 12 22:54:01.407448 containerd[1640]: 2025-09-12 22:54:01.376 [INFO][4172] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.2063b01dc1d005795ec6bed8fb49f5e10e2a20a56a195446caf18d73db95d800 Sep 12 22:54:01.407448 containerd[1640]: 2025-09-12 22:54:01.379 [INFO][4172] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.2063b01dc1d005795ec6bed8fb49f5e10e2a20a56a195446caf18d73db95d800" host="localhost" Sep 12 22:54:01.407448 containerd[1640]: 2025-09-12 22:54:01.381 [INFO][4172] 
ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.2063b01dc1d005795ec6bed8fb49f5e10e2a20a56a195446caf18d73db95d800" host="localhost" Sep 12 22:54:01.407448 containerd[1640]: 2025-09-12 22:54:01.382 [INFO][4172] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.2063b01dc1d005795ec6bed8fb49f5e10e2a20a56a195446caf18d73db95d800" host="localhost" Sep 12 22:54:01.407448 containerd[1640]: 2025-09-12 22:54:01.382 [INFO][4172] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 22:54:01.407448 containerd[1640]: 2025-09-12 22:54:01.382 [INFO][4172] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="2063b01dc1d005795ec6bed8fb49f5e10e2a20a56a195446caf18d73db95d800" HandleID="k8s-pod-network.2063b01dc1d005795ec6bed8fb49f5e10e2a20a56a195446caf18d73db95d800" Workload="localhost-k8s-whisker--6f44c4d45f--gmvz4-eth0" Sep 12 22:54:01.407601 containerd[1640]: 2025-09-12 22:54:01.383 [INFO][4070] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2063b01dc1d005795ec6bed8fb49f5e10e2a20a56a195446caf18d73db95d800" Namespace="calico-system" Pod="whisker-6f44c4d45f-gmvz4" WorkloadEndpoint="localhost-k8s-whisker--6f44c4d45f--gmvz4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--6f44c4d45f--gmvz4-eth0", GenerateName:"whisker-6f44c4d45f-", Namespace:"calico-system", SelfLink:"", UID:"987b1aa3-a530-429e-9ef0-4092dcb8be9f", ResourceVersion:"874", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 54, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6f44c4d45f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-6f44c4d45f-gmvz4", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali49034ea7add", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:54:01.407601 containerd[1640]: 2025-09-12 22:54:01.383 [INFO][4070] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="2063b01dc1d005795ec6bed8fb49f5e10e2a20a56a195446caf18d73db95d800" Namespace="calico-system" Pod="whisker-6f44c4d45f-gmvz4" WorkloadEndpoint="localhost-k8s-whisker--6f44c4d45f--gmvz4-eth0" Sep 12 22:54:01.407698 containerd[1640]: 2025-09-12 22:54:01.383 [INFO][4070] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali49034ea7add ContainerID="2063b01dc1d005795ec6bed8fb49f5e10e2a20a56a195446caf18d73db95d800" Namespace="calico-system" Pod="whisker-6f44c4d45f-gmvz4" WorkloadEndpoint="localhost-k8s-whisker--6f44c4d45f--gmvz4-eth0" Sep 12 22:54:01.407698 containerd[1640]: 2025-09-12 22:54:01.395 [INFO][4070] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2063b01dc1d005795ec6bed8fb49f5e10e2a20a56a195446caf18d73db95d800" Namespace="calico-system" Pod="whisker-6f44c4d45f-gmvz4" WorkloadEndpoint="localhost-k8s-whisker--6f44c4d45f--gmvz4-eth0" Sep 12 22:54:01.407813 containerd[1640]: 2025-09-12 22:54:01.395 [INFO][4070] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2063b01dc1d005795ec6bed8fb49f5e10e2a20a56a195446caf18d73db95d800" 
Namespace="calico-system" Pod="whisker-6f44c4d45f-gmvz4" WorkloadEndpoint="localhost-k8s-whisker--6f44c4d45f--gmvz4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--6f44c4d45f--gmvz4-eth0", GenerateName:"whisker-6f44c4d45f-", Namespace:"calico-system", SelfLink:"", UID:"987b1aa3-a530-429e-9ef0-4092dcb8be9f", ResourceVersion:"874", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 54, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6f44c4d45f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"2063b01dc1d005795ec6bed8fb49f5e10e2a20a56a195446caf18d73db95d800", Pod:"whisker-6f44c4d45f-gmvz4", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali49034ea7add", MAC:"1e:e5:b0:03:f1:39", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:54:01.407884 containerd[1640]: 2025-09-12 22:54:01.404 [INFO][4070] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2063b01dc1d005795ec6bed8fb49f5e10e2a20a56a195446caf18d73db95d800" Namespace="calico-system" Pod="whisker-6f44c4d45f-gmvz4" WorkloadEndpoint="localhost-k8s-whisker--6f44c4d45f--gmvz4-eth0" Sep 12 22:54:01.500497 containerd[1640]: 
time="2025-09-12T22:54:01.500442456Z" level=info msg="connecting to shim 2063b01dc1d005795ec6bed8fb49f5e10e2a20a56a195446caf18d73db95d800" address="unix:///run/containerd/s/54bfa1b92d41183ae1cb924c2a06eaac38e7f294e5bea6177a7242f815558818" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:54:01.520129 systemd[1]: Started cri-containerd-2063b01dc1d005795ec6bed8fb49f5e10e2a20a56a195446caf18d73db95d800.scope - libcontainer container 2063b01dc1d005795ec6bed8fb49f5e10e2a20a56a195446caf18d73db95d800. Sep 12 22:54:01.527874 systemd-resolved[1560]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 22:54:01.557893 containerd[1640]: time="2025-09-12T22:54:01.557859686Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6f44c4d45f-gmvz4,Uid:987b1aa3-a530-429e-9ef0-4092dcb8be9f,Namespace:calico-system,Attempt:0,} returns sandbox id \"2063b01dc1d005795ec6bed8fb49f5e10e2a20a56a195446caf18d73db95d800\"" Sep 12 22:54:01.560573 containerd[1640]: time="2025-09-12T22:54:01.560552552Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 12 22:54:01.649851 kubelet[2938]: I0912 22:54:01.649403 2938 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="035f49fb-08d5-4057-94ce-b42b7f9e85b8" path="/var/lib/kubelet/pods/035f49fb-08d5-4057-94ce-b42b7f9e85b8/volumes" Sep 12 22:54:02.591783 systemd-networkd[1321]: cali49034ea7add: Gained IPv6LL Sep 12 22:54:02.811263 containerd[1640]: time="2025-09-12T22:54:02.811230636Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:54:02.811676 containerd[1640]: time="2025-09-12T22:54:02.811619110Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 12 22:54:02.811939 containerd[1640]: time="2025-09-12T22:54:02.811885813Z" level=info msg="ImageCreate event 
name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:54:02.813028 containerd[1640]: time="2025-09-12T22:54:02.813013691Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:54:02.813570 containerd[1640]: time="2025-09-12T22:54:02.813535206Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 1.252881824s" Sep 12 22:54:02.813570 containerd[1640]: time="2025-09-12T22:54:02.813554136Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 12 22:54:02.814780 containerd[1640]: time="2025-09-12T22:54:02.814765510Z" level=info msg="CreateContainer within sandbox \"2063b01dc1d005795ec6bed8fb49f5e10e2a20a56a195446caf18d73db95d800\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 12 22:54:02.821919 containerd[1640]: time="2025-09-12T22:54:02.821524023Z" level=info msg="Container d18df42bc063156816989c6b9a32f00ebfae1b365e13fe189046e594532d1085: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:54:02.823708 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1155297774.mount: Deactivated successfully. 
Sep 12 22:54:02.829279 containerd[1640]: time="2025-09-12T22:54:02.829255116Z" level=info msg="CreateContainer within sandbox \"2063b01dc1d005795ec6bed8fb49f5e10e2a20a56a195446caf18d73db95d800\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"d18df42bc063156816989c6b9a32f00ebfae1b365e13fe189046e594532d1085\"" Sep 12 22:54:02.830062 containerd[1640]: time="2025-09-12T22:54:02.829903737Z" level=info msg="StartContainer for \"d18df42bc063156816989c6b9a32f00ebfae1b365e13fe189046e594532d1085\"" Sep 12 22:54:02.830755 containerd[1640]: time="2025-09-12T22:54:02.830738890Z" level=info msg="connecting to shim d18df42bc063156816989c6b9a32f00ebfae1b365e13fe189046e594532d1085" address="unix:///run/containerd/s/54bfa1b92d41183ae1cb924c2a06eaac38e7f294e5bea6177a7242f815558818" protocol=ttrpc version=3 Sep 12 22:54:02.846146 systemd[1]: Started cri-containerd-d18df42bc063156816989c6b9a32f00ebfae1b365e13fe189046e594532d1085.scope - libcontainer container d18df42bc063156816989c6b9a32f00ebfae1b365e13fe189046e594532d1085. 
Sep 12 22:54:02.885624 containerd[1640]: time="2025-09-12T22:54:02.885598929Z" level=info msg="StartContainer for \"d18df42bc063156816989c6b9a32f00ebfae1b365e13fe189046e594532d1085\" returns successfully" Sep 12 22:54:02.886673 containerd[1640]: time="2025-09-12T22:54:02.886621948Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 12 22:54:03.728510 containerd[1640]: time="2025-09-12T22:54:03.728482186Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57c58cd6b4-k9wjk,Uid:c5490633-0f26-4c96-a32c-8a889e99267c,Namespace:calico-apiserver,Attempt:0,}" Sep 12 22:54:03.851228 systemd-networkd[1321]: calibdb09e24010: Link UP Sep 12 22:54:03.851711 systemd-networkd[1321]: calibdb09e24010: Gained carrier Sep 12 22:54:03.863282 containerd[1640]: 2025-09-12 22:54:03.769 [INFO][4333] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 22:54:03.863282 containerd[1640]: 2025-09-12 22:54:03.798 [INFO][4333] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--57c58cd6b4--k9wjk-eth0 calico-apiserver-57c58cd6b4- calico-apiserver c5490633-0f26-4c96-a32c-8a889e99267c 811 0 2025-09-12 22:53:38 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:57c58cd6b4 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-57c58cd6b4-k9wjk eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calibdb09e24010 [] [] }} ContainerID="238cb7098dbcc6bdfc4a51981df43ff5ce6aac3c8be5bf976ce9002b4b6a0455" Namespace="calico-apiserver" Pod="calico-apiserver-57c58cd6b4-k9wjk" WorkloadEndpoint="localhost-k8s-calico--apiserver--57c58cd6b4--k9wjk-" Sep 12 22:54:03.863282 containerd[1640]: 2025-09-12 22:54:03.798 [INFO][4333] 
cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="238cb7098dbcc6bdfc4a51981df43ff5ce6aac3c8be5bf976ce9002b4b6a0455" Namespace="calico-apiserver" Pod="calico-apiserver-57c58cd6b4-k9wjk" WorkloadEndpoint="localhost-k8s-calico--apiserver--57c58cd6b4--k9wjk-eth0" Sep 12 22:54:03.863282 containerd[1640]: 2025-09-12 22:54:03.827 [INFO][4345] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="238cb7098dbcc6bdfc4a51981df43ff5ce6aac3c8be5bf976ce9002b4b6a0455" HandleID="k8s-pod-network.238cb7098dbcc6bdfc4a51981df43ff5ce6aac3c8be5bf976ce9002b4b6a0455" Workload="localhost-k8s-calico--apiserver--57c58cd6b4--k9wjk-eth0" Sep 12 22:54:03.863674 containerd[1640]: 2025-09-12 22:54:03.827 [INFO][4345] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="238cb7098dbcc6bdfc4a51981df43ff5ce6aac3c8be5bf976ce9002b4b6a0455" HandleID="k8s-pod-network.238cb7098dbcc6bdfc4a51981df43ff5ce6aac3c8be5bf976ce9002b4b6a0455" Workload="localhost-k8s-calico--apiserver--57c58cd6b4--k9wjk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d56c0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-57c58cd6b4-k9wjk", "timestamp":"2025-09-12 22:54:03.827334075 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:54:03.863674 containerd[1640]: 2025-09-12 22:54:03.827 [INFO][4345] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:54:03.863674 containerd[1640]: 2025-09-12 22:54:03.827 [INFO][4345] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 22:54:03.863674 containerd[1640]: 2025-09-12 22:54:03.827 [INFO][4345] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 22:54:03.863674 containerd[1640]: 2025-09-12 22:54:03.833 [INFO][4345] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.238cb7098dbcc6bdfc4a51981df43ff5ce6aac3c8be5bf976ce9002b4b6a0455" host="localhost" Sep 12 22:54:03.863674 containerd[1640]: 2025-09-12 22:54:03.836 [INFO][4345] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 22:54:03.863674 containerd[1640]: 2025-09-12 22:54:03.839 [INFO][4345] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 22:54:03.863674 containerd[1640]: 2025-09-12 22:54:03.840 [INFO][4345] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 22:54:03.863674 containerd[1640]: 2025-09-12 22:54:03.841 [INFO][4345] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 22:54:03.863674 containerd[1640]: 2025-09-12 22:54:03.841 [INFO][4345] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.238cb7098dbcc6bdfc4a51981df43ff5ce6aac3c8be5bf976ce9002b4b6a0455" host="localhost" Sep 12 22:54:03.865529 containerd[1640]: 2025-09-12 22:54:03.842 [INFO][4345] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.238cb7098dbcc6bdfc4a51981df43ff5ce6aac3c8be5bf976ce9002b4b6a0455 Sep 12 22:54:03.865529 containerd[1640]: 2025-09-12 22:54:03.844 [INFO][4345] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.238cb7098dbcc6bdfc4a51981df43ff5ce6aac3c8be5bf976ce9002b4b6a0455" host="localhost" Sep 12 22:54:03.865529 containerd[1640]: 2025-09-12 22:54:03.846 [INFO][4345] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 
handle="k8s-pod-network.238cb7098dbcc6bdfc4a51981df43ff5ce6aac3c8be5bf976ce9002b4b6a0455" host="localhost" Sep 12 22:54:03.865529 containerd[1640]: 2025-09-12 22:54:03.847 [INFO][4345] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.238cb7098dbcc6bdfc4a51981df43ff5ce6aac3c8be5bf976ce9002b4b6a0455" host="localhost" Sep 12 22:54:03.865529 containerd[1640]: 2025-09-12 22:54:03.847 [INFO][4345] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 22:54:03.865529 containerd[1640]: 2025-09-12 22:54:03.847 [INFO][4345] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="238cb7098dbcc6bdfc4a51981df43ff5ce6aac3c8be5bf976ce9002b4b6a0455" HandleID="k8s-pod-network.238cb7098dbcc6bdfc4a51981df43ff5ce6aac3c8be5bf976ce9002b4b6a0455" Workload="localhost-k8s-calico--apiserver--57c58cd6b4--k9wjk-eth0" Sep 12 22:54:03.865679 containerd[1640]: 2025-09-12 22:54:03.849 [INFO][4333] cni-plugin/k8s.go 418: Populated endpoint ContainerID="238cb7098dbcc6bdfc4a51981df43ff5ce6aac3c8be5bf976ce9002b4b6a0455" Namespace="calico-apiserver" Pod="calico-apiserver-57c58cd6b4-k9wjk" WorkloadEndpoint="localhost-k8s-calico--apiserver--57c58cd6b4--k9wjk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--57c58cd6b4--k9wjk-eth0", GenerateName:"calico-apiserver-57c58cd6b4-", Namespace:"calico-apiserver", SelfLink:"", UID:"c5490633-0f26-4c96-a32c-8a889e99267c", ResourceVersion:"811", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 53, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"57c58cd6b4", "projectcalico.org/namespace":"calico-apiserver", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-57c58cd6b4-k9wjk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calibdb09e24010", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:54:03.865730 containerd[1640]: 2025-09-12 22:54:03.849 [INFO][4333] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="238cb7098dbcc6bdfc4a51981df43ff5ce6aac3c8be5bf976ce9002b4b6a0455" Namespace="calico-apiserver" Pod="calico-apiserver-57c58cd6b4-k9wjk" WorkloadEndpoint="localhost-k8s-calico--apiserver--57c58cd6b4--k9wjk-eth0" Sep 12 22:54:03.865730 containerd[1640]: 2025-09-12 22:54:03.849 [INFO][4333] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibdb09e24010 ContainerID="238cb7098dbcc6bdfc4a51981df43ff5ce6aac3c8be5bf976ce9002b4b6a0455" Namespace="calico-apiserver" Pod="calico-apiserver-57c58cd6b4-k9wjk" WorkloadEndpoint="localhost-k8s-calico--apiserver--57c58cd6b4--k9wjk-eth0" Sep 12 22:54:03.865730 containerd[1640]: 2025-09-12 22:54:03.852 [INFO][4333] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="238cb7098dbcc6bdfc4a51981df43ff5ce6aac3c8be5bf976ce9002b4b6a0455" Namespace="calico-apiserver" Pod="calico-apiserver-57c58cd6b4-k9wjk" WorkloadEndpoint="localhost-k8s-calico--apiserver--57c58cd6b4--k9wjk-eth0" Sep 12 22:54:03.867119 containerd[1640]: 2025-09-12 22:54:03.852 [INFO][4333] cni-plugin/k8s.go 446: Added 
Mac, interface name, and active container ID to endpoint ContainerID="238cb7098dbcc6bdfc4a51981df43ff5ce6aac3c8be5bf976ce9002b4b6a0455" Namespace="calico-apiserver" Pod="calico-apiserver-57c58cd6b4-k9wjk" WorkloadEndpoint="localhost-k8s-calico--apiserver--57c58cd6b4--k9wjk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--57c58cd6b4--k9wjk-eth0", GenerateName:"calico-apiserver-57c58cd6b4-", Namespace:"calico-apiserver", SelfLink:"", UID:"c5490633-0f26-4c96-a32c-8a889e99267c", ResourceVersion:"811", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 53, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"57c58cd6b4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"238cb7098dbcc6bdfc4a51981df43ff5ce6aac3c8be5bf976ce9002b4b6a0455", Pod:"calico-apiserver-57c58cd6b4-k9wjk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calibdb09e24010", MAC:"a2:a3:fd:eb:0d:a7", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:54:03.867172 containerd[1640]: 2025-09-12 22:54:03.860 [INFO][4333] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="238cb7098dbcc6bdfc4a51981df43ff5ce6aac3c8be5bf976ce9002b4b6a0455" Namespace="calico-apiserver" Pod="calico-apiserver-57c58cd6b4-k9wjk" WorkloadEndpoint="localhost-k8s-calico--apiserver--57c58cd6b4--k9wjk-eth0" Sep 12 22:54:03.889385 containerd[1640]: time="2025-09-12T22:54:03.889345585Z" level=info msg="connecting to shim 238cb7098dbcc6bdfc4a51981df43ff5ce6aac3c8be5bf976ce9002b4b6a0455" address="unix:///run/containerd/s/aab51383ea1cb36f2b1060f92391a01477357312e3dfda66e26d7685c28c8dae" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:54:03.906131 systemd[1]: Started cri-containerd-238cb7098dbcc6bdfc4a51981df43ff5ce6aac3c8be5bf976ce9002b4b6a0455.scope - libcontainer container 238cb7098dbcc6bdfc4a51981df43ff5ce6aac3c8be5bf976ce9002b4b6a0455. Sep 12 22:54:03.914883 systemd-resolved[1560]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 22:54:03.942641 containerd[1640]: time="2025-09-12T22:54:03.942616372Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57c58cd6b4-k9wjk,Uid:c5490633-0f26-4c96-a32c-8a889e99267c,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"238cb7098dbcc6bdfc4a51981df43ff5ce6aac3c8be5bf976ce9002b4b6a0455\"" Sep 12 22:54:04.646508 containerd[1640]: time="2025-09-12T22:54:04.646480215Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57c58cd6b4-bbb9r,Uid:8e984712-7649-4b52-afb2-10ca87cddfeb,Namespace:calico-apiserver,Attempt:0,}" Sep 12 22:54:04.785145 systemd-networkd[1321]: cali12592886f78: Link UP Sep 12 22:54:04.785842 systemd-networkd[1321]: cali12592886f78: Gained carrier Sep 12 22:54:04.799131 containerd[1640]: 2025-09-12 22:54:04.684 [INFO][4429] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 22:54:04.799131 containerd[1640]: 2025-09-12 22:54:04.696 [INFO][4429] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{localhost-k8s-calico--apiserver--57c58cd6b4--bbb9r-eth0 calico-apiserver-57c58cd6b4- calico-apiserver 8e984712-7649-4b52-afb2-10ca87cddfeb 813 0 2025-09-12 22:53:38 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:57c58cd6b4 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-57c58cd6b4-bbb9r eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali12592886f78 [] [] }} ContainerID="c01ad2ecd706dd5101147ea5c33d8dff3bd800490e54270f24979e00e4a8b745" Namespace="calico-apiserver" Pod="calico-apiserver-57c58cd6b4-bbb9r" WorkloadEndpoint="localhost-k8s-calico--apiserver--57c58cd6b4--bbb9r-" Sep 12 22:54:04.799131 containerd[1640]: 2025-09-12 22:54:04.696 [INFO][4429] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c01ad2ecd706dd5101147ea5c33d8dff3bd800490e54270f24979e00e4a8b745" Namespace="calico-apiserver" Pod="calico-apiserver-57c58cd6b4-bbb9r" WorkloadEndpoint="localhost-k8s-calico--apiserver--57c58cd6b4--bbb9r-eth0" Sep 12 22:54:04.799131 containerd[1640]: 2025-09-12 22:54:04.738 [INFO][4443] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c01ad2ecd706dd5101147ea5c33d8dff3bd800490e54270f24979e00e4a8b745" HandleID="k8s-pod-network.c01ad2ecd706dd5101147ea5c33d8dff3bd800490e54270f24979e00e4a8b745" Workload="localhost-k8s-calico--apiserver--57c58cd6b4--bbb9r-eth0" Sep 12 22:54:04.799727 containerd[1640]: 2025-09-12 22:54:04.738 [INFO][4443] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c01ad2ecd706dd5101147ea5c33d8dff3bd800490e54270f24979e00e4a8b745" HandleID="k8s-pod-network.c01ad2ecd706dd5101147ea5c33d8dff3bd800490e54270f24979e00e4a8b745" Workload="localhost-k8s-calico--apiserver--57c58cd6b4--bbb9r-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, 
HandleID:(*string)(0xc0002d56c0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-57c58cd6b4-bbb9r", "timestamp":"2025-09-12 22:54:04.738708363 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:54:04.799727 containerd[1640]: 2025-09-12 22:54:04.738 [INFO][4443] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:54:04.799727 containerd[1640]: 2025-09-12 22:54:04.739 [INFO][4443] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 22:54:04.799727 containerd[1640]: 2025-09-12 22:54:04.739 [INFO][4443] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 22:54:04.799727 containerd[1640]: 2025-09-12 22:54:04.748 [INFO][4443] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c01ad2ecd706dd5101147ea5c33d8dff3bd800490e54270f24979e00e4a8b745" host="localhost" Sep 12 22:54:04.799727 containerd[1640]: 2025-09-12 22:54:04.751 [INFO][4443] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 22:54:04.799727 containerd[1640]: 2025-09-12 22:54:04.759 [INFO][4443] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 22:54:04.799727 containerd[1640]: 2025-09-12 22:54:04.760 [INFO][4443] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 22:54:04.799727 containerd[1640]: 2025-09-12 22:54:04.762 [INFO][4443] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 22:54:04.799727 containerd[1640]: 2025-09-12 22:54:04.762 [INFO][4443] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 
handle="k8s-pod-network.c01ad2ecd706dd5101147ea5c33d8dff3bd800490e54270f24979e00e4a8b745" host="localhost" Sep 12 22:54:04.800023 containerd[1640]: 2025-09-12 22:54:04.763 [INFO][4443] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c01ad2ecd706dd5101147ea5c33d8dff3bd800490e54270f24979e00e4a8b745 Sep 12 22:54:04.800023 containerd[1640]: 2025-09-12 22:54:04.772 [INFO][4443] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.c01ad2ecd706dd5101147ea5c33d8dff3bd800490e54270f24979e00e4a8b745" host="localhost" Sep 12 22:54:04.800023 containerd[1640]: 2025-09-12 22:54:04.777 [INFO][4443] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.c01ad2ecd706dd5101147ea5c33d8dff3bd800490e54270f24979e00e4a8b745" host="localhost" Sep 12 22:54:04.800023 containerd[1640]: 2025-09-12 22:54:04.777 [INFO][4443] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.c01ad2ecd706dd5101147ea5c33d8dff3bd800490e54270f24979e00e4a8b745" host="localhost" Sep 12 22:54:04.800023 containerd[1640]: 2025-09-12 22:54:04.777 [INFO][4443] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 22:54:04.800023 containerd[1640]: 2025-09-12 22:54:04.777 [INFO][4443] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="c01ad2ecd706dd5101147ea5c33d8dff3bd800490e54270f24979e00e4a8b745" HandleID="k8s-pod-network.c01ad2ecd706dd5101147ea5c33d8dff3bd800490e54270f24979e00e4a8b745" Workload="localhost-k8s-calico--apiserver--57c58cd6b4--bbb9r-eth0" Sep 12 22:54:04.800384 containerd[1640]: 2025-09-12 22:54:04.780 [INFO][4429] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c01ad2ecd706dd5101147ea5c33d8dff3bd800490e54270f24979e00e4a8b745" Namespace="calico-apiserver" Pod="calico-apiserver-57c58cd6b4-bbb9r" WorkloadEndpoint="localhost-k8s-calico--apiserver--57c58cd6b4--bbb9r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--57c58cd6b4--bbb9r-eth0", GenerateName:"calico-apiserver-57c58cd6b4-", Namespace:"calico-apiserver", SelfLink:"", UID:"8e984712-7649-4b52-afb2-10ca87cddfeb", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 53, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"57c58cd6b4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-57c58cd6b4-bbb9r", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali12592886f78", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:54:04.800436 containerd[1640]: 2025-09-12 22:54:04.781 [INFO][4429] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="c01ad2ecd706dd5101147ea5c33d8dff3bd800490e54270f24979e00e4a8b745" Namespace="calico-apiserver" Pod="calico-apiserver-57c58cd6b4-bbb9r" WorkloadEndpoint="localhost-k8s-calico--apiserver--57c58cd6b4--bbb9r-eth0" Sep 12 22:54:04.800436 containerd[1640]: 2025-09-12 22:54:04.781 [INFO][4429] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali12592886f78 ContainerID="c01ad2ecd706dd5101147ea5c33d8dff3bd800490e54270f24979e00e4a8b745" Namespace="calico-apiserver" Pod="calico-apiserver-57c58cd6b4-bbb9r" WorkloadEndpoint="localhost-k8s-calico--apiserver--57c58cd6b4--bbb9r-eth0" Sep 12 22:54:04.800436 containerd[1640]: 2025-09-12 22:54:04.786 [INFO][4429] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c01ad2ecd706dd5101147ea5c33d8dff3bd800490e54270f24979e00e4a8b745" Namespace="calico-apiserver" Pod="calico-apiserver-57c58cd6b4-bbb9r" WorkloadEndpoint="localhost-k8s-calico--apiserver--57c58cd6b4--bbb9r-eth0" Sep 12 22:54:04.800977 containerd[1640]: 2025-09-12 22:54:04.786 [INFO][4429] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c01ad2ecd706dd5101147ea5c33d8dff3bd800490e54270f24979e00e4a8b745" Namespace="calico-apiserver" Pod="calico-apiserver-57c58cd6b4-bbb9r" WorkloadEndpoint="localhost-k8s-calico--apiserver--57c58cd6b4--bbb9r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--57c58cd6b4--bbb9r-eth0", 
GenerateName:"calico-apiserver-57c58cd6b4-", Namespace:"calico-apiserver", SelfLink:"", UID:"8e984712-7649-4b52-afb2-10ca87cddfeb", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 53, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"57c58cd6b4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"c01ad2ecd706dd5101147ea5c33d8dff3bd800490e54270f24979e00e4a8b745", Pod:"calico-apiserver-57c58cd6b4-bbb9r", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali12592886f78", MAC:"5a:06:d5:f3:7c:fa", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:54:04.802166 containerd[1640]: 2025-09-12 22:54:04.796 [INFO][4429] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c01ad2ecd706dd5101147ea5c33d8dff3bd800490e54270f24979e00e4a8b745" Namespace="calico-apiserver" Pod="calico-apiserver-57c58cd6b4-bbb9r" WorkloadEndpoint="localhost-k8s-calico--apiserver--57c58cd6b4--bbb9r-eth0" Sep 12 22:54:04.820695 containerd[1640]: time="2025-09-12T22:54:04.820625777Z" level=info msg="connecting to shim c01ad2ecd706dd5101147ea5c33d8dff3bd800490e54270f24979e00e4a8b745" 
address="unix:///run/containerd/s/e383898855dfd8728e65a241f74b6a5504c895f066005a734f7e8017376b9c8b" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:54:04.840285 systemd[1]: Started cri-containerd-c01ad2ecd706dd5101147ea5c33d8dff3bd800490e54270f24979e00e4a8b745.scope - libcontainer container c01ad2ecd706dd5101147ea5c33d8dff3bd800490e54270f24979e00e4a8b745. Sep 12 22:54:04.849868 systemd-resolved[1560]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 22:54:04.890453 containerd[1640]: time="2025-09-12T22:54:04.890432523Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57c58cd6b4-bbb9r,Uid:8e984712-7649-4b52-afb2-10ca87cddfeb,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"c01ad2ecd706dd5101147ea5c33d8dff3bd800490e54270f24979e00e4a8b745\"" Sep 12 22:54:04.932741 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2810429158.mount: Deactivated successfully. Sep 12 22:54:04.939553 containerd[1640]: time="2025-09-12T22:54:04.939516131Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:54:04.939940 containerd[1640]: time="2025-09-12T22:54:04.939922912Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 12 22:54:04.940302 containerd[1640]: time="2025-09-12T22:54:04.940129879Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:54:04.941380 containerd[1640]: time="2025-09-12T22:54:04.941363405Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:54:04.942074 containerd[1640]: 
time="2025-09-12T22:54:04.942024070Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 2.055383522s" Sep 12 22:54:04.942074 containerd[1640]: time="2025-09-12T22:54:04.942052161Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 12 22:54:04.943936 containerd[1640]: time="2025-09-12T22:54:04.943916426Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 22:54:04.945176 containerd[1640]: time="2025-09-12T22:54:04.945154924Z" level=info msg="CreateContainer within sandbox \"2063b01dc1d005795ec6bed8fb49f5e10e2a20a56a195446caf18d73db95d800\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 12 22:54:04.951268 containerd[1640]: time="2025-09-12T22:54:04.950804580Z" level=info msg="Container 92444dab44f334e0768c12c81ea31bcf7816a41e3410224285c1a9d61063d79e: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:54:04.967214 containerd[1640]: time="2025-09-12T22:54:04.967191540Z" level=info msg="CreateContainer within sandbox \"2063b01dc1d005795ec6bed8fb49f5e10e2a20a56a195446caf18d73db95d800\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"92444dab44f334e0768c12c81ea31bcf7816a41e3410224285c1a9d61063d79e\"" Sep 12 22:54:04.967650 containerd[1640]: time="2025-09-12T22:54:04.967527842Z" level=info msg="StartContainer for \"92444dab44f334e0768c12c81ea31bcf7816a41e3410224285c1a9d61063d79e\"" Sep 12 22:54:04.968569 containerd[1640]: time="2025-09-12T22:54:04.968503372Z" level=info msg="connecting to shim 
92444dab44f334e0768c12c81ea31bcf7816a41e3410224285c1a9d61063d79e" address="unix:///run/containerd/s/54bfa1b92d41183ae1cb924c2a06eaac38e7f294e5bea6177a7242f815558818" protocol=ttrpc version=3 Sep 12 22:54:04.988286 systemd[1]: Started cri-containerd-92444dab44f334e0768c12c81ea31bcf7816a41e3410224285c1a9d61063d79e.scope - libcontainer container 92444dab44f334e0768c12c81ea31bcf7816a41e3410224285c1a9d61063d79e. Sep 12 22:54:05.031263 containerd[1640]: time="2025-09-12T22:54:05.031230976Z" level=info msg="StartContainer for \"92444dab44f334e0768c12c81ea31bcf7816a41e3410224285c1a9d61063d79e\" returns successfully" Sep 12 22:54:05.111955 kubelet[2938]: I0912 22:54:05.111915 2938 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-6f44c4d45f-gmvz4" podStartSLOduration=1.727251275 podStartE2EDuration="5.111897135s" podCreationTimestamp="2025-09-12 22:54:00 +0000 UTC" firstStartedPulling="2025-09-12 22:54:01.558695939 +0000 UTC m=+40.291200512" lastFinishedPulling="2025-09-12 22:54:04.943341799 +0000 UTC m=+43.675846372" observedRunningTime="2025-09-12 22:54:05.111457748 +0000 UTC m=+43.843962322" watchObservedRunningTime="2025-09-12 22:54:05.111897135 +0000 UTC m=+43.844401710" Sep 12 22:54:05.150122 systemd-networkd[1321]: calibdb09e24010: Gained IPv6LL Sep 12 22:54:05.646438 containerd[1640]: time="2025-09-12T22:54:05.646129582Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-2hz5l,Uid:ad3f9ce4-1c91-4cc6-92be-f6e9a6ca4e2d,Namespace:kube-system,Attempt:0,}" Sep 12 22:54:05.685578 containerd[1640]: time="2025-09-12T22:54:05.685378361Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-vzktr,Uid:1a47fb8c-98ea-48aa-b27d-2fd1d5e34b1c,Namespace:calico-system,Attempt:0,}" Sep 12 22:54:05.685578 containerd[1640]: time="2025-09-12T22:54:05.685406682Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-7c65d6cfc9-8nsqz,Uid:2f3a9207-5186-4087-a436-20d5942dd273,Namespace:kube-system,Attempt:0,}" Sep 12 22:54:05.786864 systemd-networkd[1321]: calib504ca6f309: Link UP Sep 12 22:54:05.787423 systemd-networkd[1321]: calib504ca6f309: Gained carrier Sep 12 22:54:05.796918 containerd[1640]: 2025-09-12 22:54:05.683 [INFO][4556] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 22:54:05.796918 containerd[1640]: 2025-09-12 22:54:05.701 [INFO][4556] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7c65d6cfc9--2hz5l-eth0 coredns-7c65d6cfc9- kube-system ad3f9ce4-1c91-4cc6-92be-f6e9a6ca4e2d 808 0 2025-09-12 22:53:28 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7c65d6cfc9-2hz5l eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calib504ca6f309 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="2fea74f1a8be79812193d9d9ca3bc2d4033643e33e0f1350a86dc0dc9d147bca" Namespace="kube-system" Pod="coredns-7c65d6cfc9-2hz5l" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--2hz5l-" Sep 12 22:54:05.796918 containerd[1640]: 2025-09-12 22:54:05.701 [INFO][4556] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2fea74f1a8be79812193d9d9ca3bc2d4033643e33e0f1350a86dc0dc9d147bca" Namespace="kube-system" Pod="coredns-7c65d6cfc9-2hz5l" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--2hz5l-eth0" Sep 12 22:54:05.796918 containerd[1640]: 2025-09-12 22:54:05.738 [INFO][4589] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2fea74f1a8be79812193d9d9ca3bc2d4033643e33e0f1350a86dc0dc9d147bca" HandleID="k8s-pod-network.2fea74f1a8be79812193d9d9ca3bc2d4033643e33e0f1350a86dc0dc9d147bca" 
Workload="localhost-k8s-coredns--7c65d6cfc9--2hz5l-eth0" Sep 12 22:54:05.797635 containerd[1640]: 2025-09-12 22:54:05.738 [INFO][4589] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2fea74f1a8be79812193d9d9ca3bc2d4033643e33e0f1350a86dc0dc9d147bca" HandleID="k8s-pod-network.2fea74f1a8be79812193d9d9ca3bc2d4033643e33e0f1350a86dc0dc9d147bca" Workload="localhost-k8s-coredns--7c65d6cfc9--2hz5l-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00031b4b0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7c65d6cfc9-2hz5l", "timestamp":"2025-09-12 22:54:05.738027881 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:54:05.797635 containerd[1640]: 2025-09-12 22:54:05.738 [INFO][4589] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:54:05.797635 containerd[1640]: 2025-09-12 22:54:05.738 [INFO][4589] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 22:54:05.797635 containerd[1640]: 2025-09-12 22:54:05.738 [INFO][4589] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 22:54:05.797635 containerd[1640]: 2025-09-12 22:54:05.748 [INFO][4589] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2fea74f1a8be79812193d9d9ca3bc2d4033643e33e0f1350a86dc0dc9d147bca" host="localhost" Sep 12 22:54:05.797635 containerd[1640]: 2025-09-12 22:54:05.753 [INFO][4589] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 22:54:05.797635 containerd[1640]: 2025-09-12 22:54:05.769 [INFO][4589] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 22:54:05.797635 containerd[1640]: 2025-09-12 22:54:05.772 [INFO][4589] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 22:54:05.797635 containerd[1640]: 2025-09-12 22:54:05.774 [INFO][4589] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 22:54:05.797635 containerd[1640]: 2025-09-12 22:54:05.774 [INFO][4589] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.2fea74f1a8be79812193d9d9ca3bc2d4033643e33e0f1350a86dc0dc9d147bca" host="localhost" Sep 12 22:54:05.797990 containerd[1640]: 2025-09-12 22:54:05.775 [INFO][4589] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.2fea74f1a8be79812193d9d9ca3bc2d4033643e33e0f1350a86dc0dc9d147bca Sep 12 22:54:05.797990 containerd[1640]: 2025-09-12 22:54:05.777 [INFO][4589] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.2fea74f1a8be79812193d9d9ca3bc2d4033643e33e0f1350a86dc0dc9d147bca" host="localhost" Sep 12 22:54:05.797990 containerd[1640]: 2025-09-12 22:54:05.781 [INFO][4589] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 
handle="k8s-pod-network.2fea74f1a8be79812193d9d9ca3bc2d4033643e33e0f1350a86dc0dc9d147bca" host="localhost" Sep 12 22:54:05.797990 containerd[1640]: 2025-09-12 22:54:05.781 [INFO][4589] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.2fea74f1a8be79812193d9d9ca3bc2d4033643e33e0f1350a86dc0dc9d147bca" host="localhost" Sep 12 22:54:05.797990 containerd[1640]: 2025-09-12 22:54:05.781 [INFO][4589] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 22:54:05.797990 containerd[1640]: 2025-09-12 22:54:05.781 [INFO][4589] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="2fea74f1a8be79812193d9d9ca3bc2d4033643e33e0f1350a86dc0dc9d147bca" HandleID="k8s-pod-network.2fea74f1a8be79812193d9d9ca3bc2d4033643e33e0f1350a86dc0dc9d147bca" Workload="localhost-k8s-coredns--7c65d6cfc9--2hz5l-eth0" Sep 12 22:54:05.798383 containerd[1640]: 2025-09-12 22:54:05.782 [INFO][4556] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2fea74f1a8be79812193d9d9ca3bc2d4033643e33e0f1350a86dc0dc9d147bca" Namespace="kube-system" Pod="coredns-7c65d6cfc9-2hz5l" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--2hz5l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--2hz5l-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"ad3f9ce4-1c91-4cc6-92be-f6e9a6ca4e2d", ResourceVersion:"808", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 53, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7c65d6cfc9-2hz5l", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib504ca6f309", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:54:05.798444 containerd[1640]: 2025-09-12 22:54:05.782 [INFO][4556] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="2fea74f1a8be79812193d9d9ca3bc2d4033643e33e0f1350a86dc0dc9d147bca" Namespace="kube-system" Pod="coredns-7c65d6cfc9-2hz5l" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--2hz5l-eth0" Sep 12 22:54:05.798444 containerd[1640]: 2025-09-12 22:54:05.782 [INFO][4556] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib504ca6f309 ContainerID="2fea74f1a8be79812193d9d9ca3bc2d4033643e33e0f1350a86dc0dc9d147bca" Namespace="kube-system" Pod="coredns-7c65d6cfc9-2hz5l" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--2hz5l-eth0" Sep 12 22:54:05.798444 containerd[1640]: 2025-09-12 22:54:05.787 [INFO][4556] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2fea74f1a8be79812193d9d9ca3bc2d4033643e33e0f1350a86dc0dc9d147bca" Namespace="kube-system" Pod="coredns-7c65d6cfc9-2hz5l" 
WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--2hz5l-eth0" Sep 12 22:54:05.798498 containerd[1640]: 2025-09-12 22:54:05.787 [INFO][4556] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2fea74f1a8be79812193d9d9ca3bc2d4033643e33e0f1350a86dc0dc9d147bca" Namespace="kube-system" Pod="coredns-7c65d6cfc9-2hz5l" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--2hz5l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--2hz5l-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"ad3f9ce4-1c91-4cc6-92be-f6e9a6ca4e2d", ResourceVersion:"808", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 53, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"2fea74f1a8be79812193d9d9ca3bc2d4033643e33e0f1350a86dc0dc9d147bca", Pod:"coredns-7c65d6cfc9-2hz5l", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib504ca6f309", MAC:"b6:fc:4a:20:28:cf", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:54:05.798498 containerd[1640]: 2025-09-12 22:54:05.793 [INFO][4556] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2fea74f1a8be79812193d9d9ca3bc2d4033643e33e0f1350a86dc0dc9d147bca" Namespace="kube-system" Pod="coredns-7c65d6cfc9-2hz5l" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--2hz5l-eth0" Sep 12 22:54:05.812601 containerd[1640]: time="2025-09-12T22:54:05.812565371Z" level=info msg="connecting to shim 2fea74f1a8be79812193d9d9ca3bc2d4033643e33e0f1350a86dc0dc9d147bca" address="unix:///run/containerd/s/fd2e0de6a0aca789b59fcf70864ffbdd0a12f72106916c16a40e030050274871" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:54:05.862234 systemd[1]: Started cri-containerd-2fea74f1a8be79812193d9d9ca3bc2d4033643e33e0f1350a86dc0dc9d147bca.scope - libcontainer container 2fea74f1a8be79812193d9d9ca3bc2d4033643e33e0f1350a86dc0dc9d147bca. 
Sep 12 22:54:05.874752 systemd-resolved[1560]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 22:54:05.890761 systemd-networkd[1321]: cali755a5e7e8b9: Link UP Sep 12 22:54:05.890861 systemd-networkd[1321]: cali755a5e7e8b9: Gained carrier Sep 12 22:54:05.902378 containerd[1640]: 2025-09-12 22:54:05.728 [INFO][4567] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 22:54:05.902378 containerd[1640]: 2025-09-12 22:54:05.739 [INFO][4567] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7c65d6cfc9--8nsqz-eth0 coredns-7c65d6cfc9- kube-system 2f3a9207-5186-4087-a436-20d5942dd273 810 0 2025-09-12 22:53:28 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7c65d6cfc9-8nsqz eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali755a5e7e8b9 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="d7af6cd3aa7826d64e119042aedb30dd7c8c28861a7c74197599d3f0937bd92c" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8nsqz" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--8nsqz-" Sep 12 22:54:05.902378 containerd[1640]: 2025-09-12 22:54:05.739 [INFO][4567] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d7af6cd3aa7826d64e119042aedb30dd7c8c28861a7c74197599d3f0937bd92c" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8nsqz" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--8nsqz-eth0" Sep 12 22:54:05.902378 containerd[1640]: 2025-09-12 22:54:05.767 [INFO][4601] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d7af6cd3aa7826d64e119042aedb30dd7c8c28861a7c74197599d3f0937bd92c" HandleID="k8s-pod-network.d7af6cd3aa7826d64e119042aedb30dd7c8c28861a7c74197599d3f0937bd92c" 
Workload="localhost-k8s-coredns--7c65d6cfc9--8nsqz-eth0" Sep 12 22:54:05.902378 containerd[1640]: 2025-09-12 22:54:05.769 [INFO][4601] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d7af6cd3aa7826d64e119042aedb30dd7c8c28861a7c74197599d3f0937bd92c" HandleID="k8s-pod-network.d7af6cd3aa7826d64e119042aedb30dd7c8c28861a7c74197599d3f0937bd92c" Workload="localhost-k8s-coredns--7c65d6cfc9--8nsqz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5950), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7c65d6cfc9-8nsqz", "timestamp":"2025-09-12 22:54:05.767823898 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:54:05.902378 containerd[1640]: 2025-09-12 22:54:05.769 [INFO][4601] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:54:05.902378 containerd[1640]: 2025-09-12 22:54:05.781 [INFO][4601] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 22:54:05.902378 containerd[1640]: 2025-09-12 22:54:05.781 [INFO][4601] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 22:54:05.902378 containerd[1640]: 2025-09-12 22:54:05.848 [INFO][4601] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d7af6cd3aa7826d64e119042aedb30dd7c8c28861a7c74197599d3f0937bd92c" host="localhost" Sep 12 22:54:05.902378 containerd[1640]: 2025-09-12 22:54:05.853 [INFO][4601] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 22:54:05.902378 containerd[1640]: 2025-09-12 22:54:05.860 [INFO][4601] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 22:54:05.902378 containerd[1640]: 2025-09-12 22:54:05.861 [INFO][4601] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 22:54:05.902378 containerd[1640]: 2025-09-12 22:54:05.863 [INFO][4601] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 22:54:05.902378 containerd[1640]: 2025-09-12 22:54:05.863 [INFO][4601] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.d7af6cd3aa7826d64e119042aedb30dd7c8c28861a7c74197599d3f0937bd92c" host="localhost" Sep 12 22:54:05.902378 containerd[1640]: 2025-09-12 22:54:05.867 [INFO][4601] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d7af6cd3aa7826d64e119042aedb30dd7c8c28861a7c74197599d3f0937bd92c Sep 12 22:54:05.902378 containerd[1640]: 2025-09-12 22:54:05.873 [INFO][4601] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.d7af6cd3aa7826d64e119042aedb30dd7c8c28861a7c74197599d3f0937bd92c" host="localhost" Sep 12 22:54:05.902378 containerd[1640]: 2025-09-12 22:54:05.880 [INFO][4601] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 
handle="k8s-pod-network.d7af6cd3aa7826d64e119042aedb30dd7c8c28861a7c74197599d3f0937bd92c" host="localhost" Sep 12 22:54:05.902378 containerd[1640]: 2025-09-12 22:54:05.881 [INFO][4601] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.d7af6cd3aa7826d64e119042aedb30dd7c8c28861a7c74197599d3f0937bd92c" host="localhost" Sep 12 22:54:05.902378 containerd[1640]: 2025-09-12 22:54:05.881 [INFO][4601] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 22:54:05.902378 containerd[1640]: 2025-09-12 22:54:05.881 [INFO][4601] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="d7af6cd3aa7826d64e119042aedb30dd7c8c28861a7c74197599d3f0937bd92c" HandleID="k8s-pod-network.d7af6cd3aa7826d64e119042aedb30dd7c8c28861a7c74197599d3f0937bd92c" Workload="localhost-k8s-coredns--7c65d6cfc9--8nsqz-eth0" Sep 12 22:54:05.904629 containerd[1640]: 2025-09-12 22:54:05.884 [INFO][4567] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d7af6cd3aa7826d64e119042aedb30dd7c8c28861a7c74197599d3f0937bd92c" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8nsqz" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--8nsqz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--8nsqz-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"2f3a9207-5186-4087-a436-20d5942dd273", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 53, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7c65d6cfc9-8nsqz", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali755a5e7e8b9", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:54:05.904629 containerd[1640]: 2025-09-12 22:54:05.884 [INFO][4567] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="d7af6cd3aa7826d64e119042aedb30dd7c8c28861a7c74197599d3f0937bd92c" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8nsqz" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--8nsqz-eth0" Sep 12 22:54:05.904629 containerd[1640]: 2025-09-12 22:54:05.884 [INFO][4567] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali755a5e7e8b9 ContainerID="d7af6cd3aa7826d64e119042aedb30dd7c8c28861a7c74197599d3f0937bd92c" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8nsqz" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--8nsqz-eth0" Sep 12 22:54:05.904629 containerd[1640]: 2025-09-12 22:54:05.889 [INFO][4567] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d7af6cd3aa7826d64e119042aedb30dd7c8c28861a7c74197599d3f0937bd92c" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8nsqz" 
WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--8nsqz-eth0" Sep 12 22:54:05.904629 containerd[1640]: 2025-09-12 22:54:05.889 [INFO][4567] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d7af6cd3aa7826d64e119042aedb30dd7c8c28861a7c74197599d3f0937bd92c" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8nsqz" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--8nsqz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--8nsqz-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"2f3a9207-5186-4087-a436-20d5942dd273", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 53, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"d7af6cd3aa7826d64e119042aedb30dd7c8c28861a7c74197599d3f0937bd92c", Pod:"coredns-7c65d6cfc9-8nsqz", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali755a5e7e8b9", MAC:"02:c8:03:4c:6f:e9", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:54:05.904629 containerd[1640]: 2025-09-12 22:54:05.897 [INFO][4567] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d7af6cd3aa7826d64e119042aedb30dd7c8c28861a7c74197599d3f0937bd92c" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8nsqz" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--8nsqz-eth0" Sep 12 22:54:05.917570 containerd[1640]: time="2025-09-12T22:54:05.917546861Z" level=info msg="connecting to shim d7af6cd3aa7826d64e119042aedb30dd7c8c28861a7c74197599d3f0937bd92c" address="unix:///run/containerd/s/1eda6cb6c083d86094fbb882214cc1231c52ad01e9c8d3bf7587f3d422f7b255" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:54:05.931255 containerd[1640]: time="2025-09-12T22:54:05.931235794Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-2hz5l,Uid:ad3f9ce4-1c91-4cc6-92be-f6e9a6ca4e2d,Namespace:kube-system,Attempt:0,} returns sandbox id \"2fea74f1a8be79812193d9d9ca3bc2d4033643e33e0f1350a86dc0dc9d147bca\"" Sep 12 22:54:05.933586 containerd[1640]: time="2025-09-12T22:54:05.933243238Z" level=info msg="CreateContainer within sandbox \"2fea74f1a8be79812193d9d9ca3bc2d4033643e33e0f1350a86dc0dc9d147bca\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 22:54:05.946850 containerd[1640]: time="2025-09-12T22:54:05.946791202Z" level=info msg="Container e596b74a20a9cfc2cda14451cf94af54ab2b48944d18bec19a8e0b729f1fb0a4: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:54:05.947221 systemd[1]: Started cri-containerd-d7af6cd3aa7826d64e119042aedb30dd7c8c28861a7c74197599d3f0937bd92c.scope - libcontainer container d7af6cd3aa7826d64e119042aedb30dd7c8c28861a7c74197599d3f0937bd92c. 
Sep 12 22:54:05.948981 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4074265832.mount: Deactivated successfully. Sep 12 22:54:05.955312 containerd[1640]: time="2025-09-12T22:54:05.955287009Z" level=info msg="CreateContainer within sandbox \"2fea74f1a8be79812193d9d9ca3bc2d4033643e33e0f1350a86dc0dc9d147bca\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e596b74a20a9cfc2cda14451cf94af54ab2b48944d18bec19a8e0b729f1fb0a4\"" Sep 12 22:54:05.955701 containerd[1640]: time="2025-09-12T22:54:05.955677242Z" level=info msg="StartContainer for \"e596b74a20a9cfc2cda14451cf94af54ab2b48944d18bec19a8e0b729f1fb0a4\"" Sep 12 22:54:05.956671 containerd[1640]: time="2025-09-12T22:54:05.956651917Z" level=info msg="connecting to shim e596b74a20a9cfc2cda14451cf94af54ab2b48944d18bec19a8e0b729f1fb0a4" address="unix:///run/containerd/s/fd2e0de6a0aca789b59fcf70864ffbdd0a12f72106916c16a40e030050274871" protocol=ttrpc version=3 Sep 12 22:54:05.966458 systemd-resolved[1560]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 22:54:05.982380 systemd[1]: Started cri-containerd-e596b74a20a9cfc2cda14451cf94af54ab2b48944d18bec19a8e0b729f1fb0a4.scope - libcontainer container e596b74a20a9cfc2cda14451cf94af54ab2b48944d18bec19a8e0b729f1fb0a4. 
Sep 12 22:54:05.992747 systemd-networkd[1321]: cali5a9a40d10cd: Link UP Sep 12 22:54:05.993645 systemd-networkd[1321]: cali5a9a40d10cd: Gained carrier Sep 12 22:54:06.013356 containerd[1640]: 2025-09-12 22:54:05.728 [INFO][4571] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 22:54:06.013356 containerd[1640]: 2025-09-12 22:54:05.744 [INFO][4571] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--7988f88666--vzktr-eth0 goldmane-7988f88666- calico-system 1a47fb8c-98ea-48aa-b27d-2fd1d5e34b1c 807 0 2025-09-12 22:53:40 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7988f88666 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-7988f88666-vzktr eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali5a9a40d10cd [] [] }} ContainerID="764fd924bbbe4386d3dd2524f949e80730aeea46f7ad9ca1933cd53e752e21e6" Namespace="calico-system" Pod="goldmane-7988f88666-vzktr" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--vzktr-" Sep 12 22:54:06.013356 containerd[1640]: 2025-09-12 22:54:05.745 [INFO][4571] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="764fd924bbbe4386d3dd2524f949e80730aeea46f7ad9ca1933cd53e752e21e6" Namespace="calico-system" Pod="goldmane-7988f88666-vzktr" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--vzktr-eth0" Sep 12 22:54:06.013356 containerd[1640]: 2025-09-12 22:54:05.771 [INFO][4606] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="764fd924bbbe4386d3dd2524f949e80730aeea46f7ad9ca1933cd53e752e21e6" HandleID="k8s-pod-network.764fd924bbbe4386d3dd2524f949e80730aeea46f7ad9ca1933cd53e752e21e6" Workload="localhost-k8s-goldmane--7988f88666--vzktr-eth0" Sep 12 22:54:06.013356 containerd[1640]: 2025-09-12 22:54:05.771 [INFO][4606] ipam/ipam_plugin.go 265: 
Auto assigning IP ContainerID="764fd924bbbe4386d3dd2524f949e80730aeea46f7ad9ca1933cd53e752e21e6" HandleID="k8s-pod-network.764fd924bbbe4386d3dd2524f949e80730aeea46f7ad9ca1933cd53e752e21e6" Workload="localhost-k8s-goldmane--7988f88666--vzktr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5010), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-7988f88666-vzktr", "timestamp":"2025-09-12 22:54:05.771666569 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:54:06.013356 containerd[1640]: 2025-09-12 22:54:05.771 [INFO][4606] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:54:06.013356 containerd[1640]: 2025-09-12 22:54:05.881 [INFO][4606] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 22:54:06.013356 containerd[1640]: 2025-09-12 22:54:05.881 [INFO][4606] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 22:54:06.013356 containerd[1640]: 2025-09-12 22:54:05.948 [INFO][4606] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.764fd924bbbe4386d3dd2524f949e80730aeea46f7ad9ca1933cd53e752e21e6" host="localhost" Sep 12 22:54:06.013356 containerd[1640]: 2025-09-12 22:54:05.953 [INFO][4606] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 22:54:06.013356 containerd[1640]: 2025-09-12 22:54:05.968 [INFO][4606] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 22:54:06.013356 containerd[1640]: 2025-09-12 22:54:05.971 [INFO][4606] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 22:54:06.013356 containerd[1640]: 2025-09-12 22:54:05.974 [INFO][4606] ipam/ipam.go 235: Affinity is confirmed and block has been loaded 
cidr=192.168.88.128/26 host="localhost" Sep 12 22:54:06.013356 containerd[1640]: 2025-09-12 22:54:05.974 [INFO][4606] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.764fd924bbbe4386d3dd2524f949e80730aeea46f7ad9ca1933cd53e752e21e6" host="localhost" Sep 12 22:54:06.013356 containerd[1640]: 2025-09-12 22:54:05.976 [INFO][4606] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.764fd924bbbe4386d3dd2524f949e80730aeea46f7ad9ca1933cd53e752e21e6 Sep 12 22:54:06.013356 containerd[1640]: 2025-09-12 22:54:05.980 [INFO][4606] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.764fd924bbbe4386d3dd2524f949e80730aeea46f7ad9ca1933cd53e752e21e6" host="localhost" Sep 12 22:54:06.013356 containerd[1640]: 2025-09-12 22:54:05.985 [INFO][4606] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.764fd924bbbe4386d3dd2524f949e80730aeea46f7ad9ca1933cd53e752e21e6" host="localhost" Sep 12 22:54:06.013356 containerd[1640]: 2025-09-12 22:54:05.985 [INFO][4606] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.764fd924bbbe4386d3dd2524f949e80730aeea46f7ad9ca1933cd53e752e21e6" host="localhost" Sep 12 22:54:06.013356 containerd[1640]: 2025-09-12 22:54:05.985 [INFO][4606] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 22:54:06.013356 containerd[1640]: 2025-09-12 22:54:05.985 [INFO][4606] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="764fd924bbbe4386d3dd2524f949e80730aeea46f7ad9ca1933cd53e752e21e6" HandleID="k8s-pod-network.764fd924bbbe4386d3dd2524f949e80730aeea46f7ad9ca1933cd53e752e21e6" Workload="localhost-k8s-goldmane--7988f88666--vzktr-eth0" Sep 12 22:54:06.014327 containerd[1640]: 2025-09-12 22:54:05.988 [INFO][4571] cni-plugin/k8s.go 418: Populated endpoint ContainerID="764fd924bbbe4386d3dd2524f949e80730aeea46f7ad9ca1933cd53e752e21e6" Namespace="calico-system" Pod="goldmane-7988f88666-vzktr" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--vzktr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--7988f88666--vzktr-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"1a47fb8c-98ea-48aa-b27d-2fd1d5e34b1c", ResourceVersion:"807", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 53, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-7988f88666-vzktr", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali5a9a40d10cd", 
MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:54:06.014327 containerd[1640]: 2025-09-12 22:54:05.988 [INFO][4571] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="764fd924bbbe4386d3dd2524f949e80730aeea46f7ad9ca1933cd53e752e21e6" Namespace="calico-system" Pod="goldmane-7988f88666-vzktr" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--vzktr-eth0" Sep 12 22:54:06.014327 containerd[1640]: 2025-09-12 22:54:05.988 [INFO][4571] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5a9a40d10cd ContainerID="764fd924bbbe4386d3dd2524f949e80730aeea46f7ad9ca1933cd53e752e21e6" Namespace="calico-system" Pod="goldmane-7988f88666-vzktr" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--vzktr-eth0" Sep 12 22:54:06.014327 containerd[1640]: 2025-09-12 22:54:05.994 [INFO][4571] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="764fd924bbbe4386d3dd2524f949e80730aeea46f7ad9ca1933cd53e752e21e6" Namespace="calico-system" Pod="goldmane-7988f88666-vzktr" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--vzktr-eth0" Sep 12 22:54:06.014327 containerd[1640]: 2025-09-12 22:54:05.994 [INFO][4571] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="764fd924bbbe4386d3dd2524f949e80730aeea46f7ad9ca1933cd53e752e21e6" Namespace="calico-system" Pod="goldmane-7988f88666-vzktr" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--vzktr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--7988f88666--vzktr-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"1a47fb8c-98ea-48aa-b27d-2fd1d5e34b1c", ResourceVersion:"807", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 53, 40, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"764fd924bbbe4386d3dd2524f949e80730aeea46f7ad9ca1933cd53e752e21e6", Pod:"goldmane-7988f88666-vzktr", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali5a9a40d10cd", MAC:"d2:32:34:3f:78:44", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:54:06.014327 containerd[1640]: 2025-09-12 22:54:06.007 [INFO][4571] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="764fd924bbbe4386d3dd2524f949e80730aeea46f7ad9ca1933cd53e752e21e6" Namespace="calico-system" Pod="goldmane-7988f88666-vzktr" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--vzktr-eth0" Sep 12 22:54:06.047528 containerd[1640]: time="2025-09-12T22:54:06.047495073Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-8nsqz,Uid:2f3a9207-5186-4087-a436-20d5942dd273,Namespace:kube-system,Attempt:0,} returns sandbox id \"d7af6cd3aa7826d64e119042aedb30dd7c8c28861a7c74197599d3f0937bd92c\"" Sep 12 22:54:06.048346 containerd[1640]: time="2025-09-12T22:54:06.048330417Z" level=info msg="StartContainer for \"e596b74a20a9cfc2cda14451cf94af54ab2b48944d18bec19a8e0b729f1fb0a4\" returns successfully" Sep 12 22:54:06.049824 containerd[1640]: time="2025-09-12T22:54:06.049804940Z" level=info 
msg="CreateContainer within sandbox \"d7af6cd3aa7826d64e119042aedb30dd7c8c28861a7c74197599d3f0937bd92c\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 22:54:06.071900 containerd[1640]: time="2025-09-12T22:54:06.071402589Z" level=info msg="Container 5d3b22a29741593d63fce87da8618030d8305bf7ad0dc55820a20313f3717677: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:54:06.077912 containerd[1640]: time="2025-09-12T22:54:06.077881973Z" level=info msg="connecting to shim 764fd924bbbe4386d3dd2524f949e80730aeea46f7ad9ca1933cd53e752e21e6" address="unix:///run/containerd/s/b4e27d0179dd670934ed95065254630e3d56bf2337a611b3b321bf145313d85d" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:54:06.080420 containerd[1640]: time="2025-09-12T22:54:06.080007903Z" level=info msg="CreateContainer within sandbox \"d7af6cd3aa7826d64e119042aedb30dd7c8c28861a7c74197599d3f0937bd92c\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"5d3b22a29741593d63fce87da8618030d8305bf7ad0dc55820a20313f3717677\"" Sep 12 22:54:06.082221 containerd[1640]: time="2025-09-12T22:54:06.082203097Z" level=info msg="StartContainer for \"5d3b22a29741593d63fce87da8618030d8305bf7ad0dc55820a20313f3717677\"" Sep 12 22:54:06.085057 containerd[1640]: time="2025-09-12T22:54:06.085003214Z" level=info msg="connecting to shim 5d3b22a29741593d63fce87da8618030d8305bf7ad0dc55820a20313f3717677" address="unix:///run/containerd/s/1eda6cb6c083d86094fbb882214cc1231c52ad01e9c8d3bf7587f3d422f7b255" protocol=ttrpc version=3 Sep 12 22:54:06.100189 systemd[1]: Started cri-containerd-764fd924bbbe4386d3dd2524f949e80730aeea46f7ad9ca1933cd53e752e21e6.scope - libcontainer container 764fd924bbbe4386d3dd2524f949e80730aeea46f7ad9ca1933cd53e752e21e6. Sep 12 22:54:06.109190 systemd[1]: Started cri-containerd-5d3b22a29741593d63fce87da8618030d8305bf7ad0dc55820a20313f3717677.scope - libcontainer container 5d3b22a29741593d63fce87da8618030d8305bf7ad0dc55820a20313f3717677. 
Sep 12 22:54:06.117684 systemd-resolved[1560]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 22:54:06.137966 containerd[1640]: time="2025-09-12T22:54:06.137892773Z" level=info msg="StartContainer for \"5d3b22a29741593d63fce87da8618030d8305bf7ad0dc55820a20313f3717677\" returns successfully" Sep 12 22:54:06.161188 containerd[1640]: time="2025-09-12T22:54:06.161117969Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-vzktr,Uid:1a47fb8c-98ea-48aa-b27d-2fd1d5e34b1c,Namespace:calico-system,Attempt:0,} returns sandbox id \"764fd924bbbe4386d3dd2524f949e80730aeea46f7ad9ca1933cd53e752e21e6\"" Sep 12 22:54:06.814181 systemd-networkd[1321]: cali12592886f78: Gained IPv6LL Sep 12 22:54:07.094108 kubelet[2938]: I0912 22:54:07.074379 2938 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-2hz5l" podStartSLOduration=39.074363128 podStartE2EDuration="39.074363128s" podCreationTimestamp="2025-09-12 22:53:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:54:06.093398434 +0000 UTC m=+44.825903017" watchObservedRunningTime="2025-09-12 22:54:07.074363128 +0000 UTC m=+45.806867705" Sep 12 22:54:07.094108 kubelet[2938]: I0912 22:54:07.093628 2938 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-8nsqz" podStartSLOduration=39.093617235 podStartE2EDuration="39.093617235s" podCreationTimestamp="2025-09-12 22:53:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:54:07.093596088 +0000 UTC m=+45.826100662" watchObservedRunningTime="2025-09-12 22:54:07.093617235 +0000 UTC m=+45.826121828" Sep 12 22:54:07.518145 systemd-networkd[1321]: cali755a5e7e8b9: Gained IPv6LL Sep 12 22:54:07.532370 
containerd[1640]: time="2025-09-12T22:54:07.531903508Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:54:07.534573 containerd[1640]: time="2025-09-12T22:54:07.533254889Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 12 22:54:07.540416 containerd[1640]: time="2025-09-12T22:54:07.533996714Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:54:07.540520 containerd[1640]: time="2025-09-12T22:54:07.536657875Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 2.59258622s" Sep 12 22:54:07.540520 containerd[1640]: time="2025-09-12T22:54:07.540468136Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 12 22:54:07.541793 containerd[1640]: time="2025-09-12T22:54:07.541448081Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:54:07.542438 containerd[1640]: time="2025-09-12T22:54:07.542421252Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 22:54:07.544133 containerd[1640]: time="2025-09-12T22:54:07.543876699Z" level=info msg="CreateContainer within sandbox \"238cb7098dbcc6bdfc4a51981df43ff5ce6aac3c8be5bf976ce9002b4b6a0455\" for 
container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 22:54:07.553056 containerd[1640]: time="2025-09-12T22:54:07.551103459Z" level=info msg="Container 016ed7afac09734f413dd3769556eea421f6bdc0d0db85b7b781d7b3ec141787: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:54:07.571305 containerd[1640]: time="2025-09-12T22:54:07.571278883Z" level=info msg="CreateContainer within sandbox \"238cb7098dbcc6bdfc4a51981df43ff5ce6aac3c8be5bf976ce9002b4b6a0455\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"016ed7afac09734f413dd3769556eea421f6bdc0d0db85b7b781d7b3ec141787\"" Sep 12 22:54:07.572142 containerd[1640]: time="2025-09-12T22:54:07.572125456Z" level=info msg="StartContainer for \"016ed7afac09734f413dd3769556eea421f6bdc0d0db85b7b781d7b3ec141787\"" Sep 12 22:54:07.572707 containerd[1640]: time="2025-09-12T22:54:07.572692477Z" level=info msg="connecting to shim 016ed7afac09734f413dd3769556eea421f6bdc0d0db85b7b781d7b3ec141787" address="unix:///run/containerd/s/aab51383ea1cb36f2b1060f92391a01477357312e3dfda66e26d7685c28c8dae" protocol=ttrpc version=3 Sep 12 22:54:07.599156 systemd[1]: Started cri-containerd-016ed7afac09734f413dd3769556eea421f6bdc0d0db85b7b781d7b3ec141787.scope - libcontainer container 016ed7afac09734f413dd3769556eea421f6bdc0d0db85b7b781d7b3ec141787. 
Sep 12 22:54:07.646258 systemd-networkd[1321]: calib504ca6f309: Gained IPv6LL Sep 12 22:54:07.646439 systemd-networkd[1321]: cali5a9a40d10cd: Gained IPv6LL Sep 12 22:54:07.650825 containerd[1640]: time="2025-09-12T22:54:07.650797641Z" level=info msg="StartContainer for \"016ed7afac09734f413dd3769556eea421f6bdc0d0db85b7b781d7b3ec141787\" returns successfully" Sep 12 22:54:07.656294 containerd[1640]: time="2025-09-12T22:54:07.656258518Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7nf98,Uid:25ac81e8-ddd8-420b-9cd2-f9ab5f554e5f,Namespace:calico-system,Attempt:0,}" Sep 12 22:54:07.799100 systemd-networkd[1321]: calif30872df818: Link UP Sep 12 22:54:07.800028 systemd-networkd[1321]: calif30872df818: Gained carrier Sep 12 22:54:07.823365 containerd[1640]: 2025-09-12 22:54:07.681 [INFO][4915] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 22:54:07.823365 containerd[1640]: 2025-09-12 22:54:07.689 [INFO][4915] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--7nf98-eth0 csi-node-driver- calico-system 25ac81e8-ddd8-420b-9cd2-f9ab5f554e5f 683 0 2025-09-12 22:53:40 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:856c6b598f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-7nf98 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calif30872df818 [] [] }} ContainerID="6d38db528f811b2aff79d34c1b06a7ed8e16a3b2070f01266ccc4393e4e94cfc" Namespace="calico-system" Pod="csi-node-driver-7nf98" WorkloadEndpoint="localhost-k8s-csi--node--driver--7nf98-" Sep 12 22:54:07.823365 containerd[1640]: 2025-09-12 22:54:07.689 [INFO][4915] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="6d38db528f811b2aff79d34c1b06a7ed8e16a3b2070f01266ccc4393e4e94cfc" Namespace="calico-system" Pod="csi-node-driver-7nf98" WorkloadEndpoint="localhost-k8s-csi--node--driver--7nf98-eth0" Sep 12 22:54:07.823365 containerd[1640]: 2025-09-12 22:54:07.722 [INFO][4926] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6d38db528f811b2aff79d34c1b06a7ed8e16a3b2070f01266ccc4393e4e94cfc" HandleID="k8s-pod-network.6d38db528f811b2aff79d34c1b06a7ed8e16a3b2070f01266ccc4393e4e94cfc" Workload="localhost-k8s-csi--node--driver--7nf98-eth0" Sep 12 22:54:07.823365 containerd[1640]: 2025-09-12 22:54:07.724 [INFO][4926] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6d38db528f811b2aff79d34c1b06a7ed8e16a3b2070f01266ccc4393e4e94cfc" HandleID="k8s-pod-network.6d38db528f811b2aff79d34c1b06a7ed8e16a3b2070f01266ccc4393e4e94cfc" Workload="localhost-k8s-csi--node--driver--7nf98-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024eff0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-7nf98", "timestamp":"2025-09-12 22:54:07.722439583 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:54:07.823365 containerd[1640]: 2025-09-12 22:54:07.724 [INFO][4926] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:54:07.823365 containerd[1640]: 2025-09-12 22:54:07.724 [INFO][4926] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 22:54:07.823365 containerd[1640]: 2025-09-12 22:54:07.724 [INFO][4926] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 22:54:07.823365 containerd[1640]: 2025-09-12 22:54:07.736 [INFO][4926] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6d38db528f811b2aff79d34c1b06a7ed8e16a3b2070f01266ccc4393e4e94cfc" host="localhost" Sep 12 22:54:07.823365 containerd[1640]: 2025-09-12 22:54:07.751 [INFO][4926] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 22:54:07.823365 containerd[1640]: 2025-09-12 22:54:07.760 [INFO][4926] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 22:54:07.823365 containerd[1640]: 2025-09-12 22:54:07.765 [INFO][4926] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 22:54:07.823365 containerd[1640]: 2025-09-12 22:54:07.767 [INFO][4926] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 22:54:07.823365 containerd[1640]: 2025-09-12 22:54:07.767 [INFO][4926] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.6d38db528f811b2aff79d34c1b06a7ed8e16a3b2070f01266ccc4393e4e94cfc" host="localhost" Sep 12 22:54:07.823365 containerd[1640]: 2025-09-12 22:54:07.777 [INFO][4926] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.6d38db528f811b2aff79d34c1b06a7ed8e16a3b2070f01266ccc4393e4e94cfc Sep 12 22:54:07.823365 containerd[1640]: 2025-09-12 22:54:07.785 [INFO][4926] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.6d38db528f811b2aff79d34c1b06a7ed8e16a3b2070f01266ccc4393e4e94cfc" host="localhost" Sep 12 22:54:07.823365 containerd[1640]: 2025-09-12 22:54:07.794 [INFO][4926] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 
handle="k8s-pod-network.6d38db528f811b2aff79d34c1b06a7ed8e16a3b2070f01266ccc4393e4e94cfc" host="localhost" Sep 12 22:54:07.823365 containerd[1640]: 2025-09-12 22:54:07.794 [INFO][4926] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.6d38db528f811b2aff79d34c1b06a7ed8e16a3b2070f01266ccc4393e4e94cfc" host="localhost" Sep 12 22:54:07.823365 containerd[1640]: 2025-09-12 22:54:07.794 [INFO][4926] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 22:54:07.823365 containerd[1640]: 2025-09-12 22:54:07.794 [INFO][4926] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="6d38db528f811b2aff79d34c1b06a7ed8e16a3b2070f01266ccc4393e4e94cfc" HandleID="k8s-pod-network.6d38db528f811b2aff79d34c1b06a7ed8e16a3b2070f01266ccc4393e4e94cfc" Workload="localhost-k8s-csi--node--driver--7nf98-eth0" Sep 12 22:54:07.824920 containerd[1640]: 2025-09-12 22:54:07.796 [INFO][4915] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6d38db528f811b2aff79d34c1b06a7ed8e16a3b2070f01266ccc4393e4e94cfc" Namespace="calico-system" Pod="csi-node-driver-7nf98" WorkloadEndpoint="localhost-k8s-csi--node--driver--7nf98-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--7nf98-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"25ac81e8-ddd8-420b-9cd2-f9ab5f554e5f", ResourceVersion:"683", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 53, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-7nf98", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif30872df818", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:54:07.824920 containerd[1640]: 2025-09-12 22:54:07.796 [INFO][4915] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="6d38db528f811b2aff79d34c1b06a7ed8e16a3b2070f01266ccc4393e4e94cfc" Namespace="calico-system" Pod="csi-node-driver-7nf98" WorkloadEndpoint="localhost-k8s-csi--node--driver--7nf98-eth0" Sep 12 22:54:07.824920 containerd[1640]: 2025-09-12 22:54:07.796 [INFO][4915] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif30872df818 ContainerID="6d38db528f811b2aff79d34c1b06a7ed8e16a3b2070f01266ccc4393e4e94cfc" Namespace="calico-system" Pod="csi-node-driver-7nf98" WorkloadEndpoint="localhost-k8s-csi--node--driver--7nf98-eth0" Sep 12 22:54:07.824920 containerd[1640]: 2025-09-12 22:54:07.799 [INFO][4915] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6d38db528f811b2aff79d34c1b06a7ed8e16a3b2070f01266ccc4393e4e94cfc" Namespace="calico-system" Pod="csi-node-driver-7nf98" WorkloadEndpoint="localhost-k8s-csi--node--driver--7nf98-eth0" Sep 12 22:54:07.824920 containerd[1640]: 2025-09-12 22:54:07.801 [INFO][4915] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6d38db528f811b2aff79d34c1b06a7ed8e16a3b2070f01266ccc4393e4e94cfc" 
Namespace="calico-system" Pod="csi-node-driver-7nf98" WorkloadEndpoint="localhost-k8s-csi--node--driver--7nf98-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--7nf98-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"25ac81e8-ddd8-420b-9cd2-f9ab5f554e5f", ResourceVersion:"683", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 53, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"6d38db528f811b2aff79d34c1b06a7ed8e16a3b2070f01266ccc4393e4e94cfc", Pod:"csi-node-driver-7nf98", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif30872df818", MAC:"12:d8:14:6f:57:b2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:54:07.824920 containerd[1640]: 2025-09-12 22:54:07.814 [INFO][4915] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6d38db528f811b2aff79d34c1b06a7ed8e16a3b2070f01266ccc4393e4e94cfc" Namespace="calico-system" Pod="csi-node-driver-7nf98" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--7nf98-eth0" Sep 12 22:54:07.871730 containerd[1640]: time="2025-09-12T22:54:07.871699725Z" level=info msg="connecting to shim 6d38db528f811b2aff79d34c1b06a7ed8e16a3b2070f01266ccc4393e4e94cfc" address="unix:///run/containerd/s/ba50f186f49219d0c61fe768381890c0c3e816e818710b130cc7a01783574e8f" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:54:07.887174 systemd[1]: Started cri-containerd-6d38db528f811b2aff79d34c1b06a7ed8e16a3b2070f01266ccc4393e4e94cfc.scope - libcontainer container 6d38db528f811b2aff79d34c1b06a7ed8e16a3b2070f01266ccc4393e4e94cfc. Sep 12 22:54:07.900596 systemd-resolved[1560]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 22:54:07.914955 containerd[1640]: time="2025-09-12T22:54:07.914928630Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7nf98,Uid:25ac81e8-ddd8-420b-9cd2-f9ab5f554e5f,Namespace:calico-system,Attempt:0,} returns sandbox id \"6d38db528f811b2aff79d34c1b06a7ed8e16a3b2070f01266ccc4393e4e94cfc\"" Sep 12 22:54:07.938609 kubelet[2938]: I0912 22:54:07.938584 2938 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 22:54:07.957932 containerd[1640]: time="2025-09-12T22:54:07.957900895Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:54:07.959435 containerd[1640]: time="2025-09-12T22:54:07.959420194Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 12 22:54:07.960232 containerd[1640]: time="2025-09-12T22:54:07.960217486Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest 
\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 417.778363ms" Sep 12 22:54:07.960265 containerd[1640]: time="2025-09-12T22:54:07.960236926Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 12 22:54:07.961918 containerd[1640]: time="2025-09-12T22:54:07.961899424Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 12 22:54:07.963582 containerd[1640]: time="2025-09-12T22:54:07.963562660Z" level=info msg="CreateContainer within sandbox \"c01ad2ecd706dd5101147ea5c33d8dff3bd800490e54270f24979e00e4a8b745\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 22:54:07.971607 containerd[1640]: time="2025-09-12T22:54:07.971581043Z" level=info msg="Container d6701be6b3a52ce46b6006b85ef5bed5dcd98e70c26b118225c1581ae724f99f: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:54:07.975963 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount499568786.mount: Deactivated successfully. 
Sep 12 22:54:07.980284 containerd[1640]: time="2025-09-12T22:54:07.980261770Z" level=info msg="CreateContainer within sandbox \"c01ad2ecd706dd5101147ea5c33d8dff3bd800490e54270f24979e00e4a8b745\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"d6701be6b3a52ce46b6006b85ef5bed5dcd98e70c26b118225c1581ae724f99f\"" Sep 12 22:54:07.980791 containerd[1640]: time="2025-09-12T22:54:07.980776405Z" level=info msg="StartContainer for \"d6701be6b3a52ce46b6006b85ef5bed5dcd98e70c26b118225c1581ae724f99f\"" Sep 12 22:54:07.982538 containerd[1640]: time="2025-09-12T22:54:07.982468486Z" level=info msg="connecting to shim d6701be6b3a52ce46b6006b85ef5bed5dcd98e70c26b118225c1581ae724f99f" address="unix:///run/containerd/s/e383898855dfd8728e65a241f74b6a5504c895f066005a734f7e8017376b9c8b" protocol=ttrpc version=3 Sep 12 22:54:08.000355 systemd[1]: Started cri-containerd-d6701be6b3a52ce46b6006b85ef5bed5dcd98e70c26b118225c1581ae724f99f.scope - libcontainer container d6701be6b3a52ce46b6006b85ef5bed5dcd98e70c26b118225c1581ae724f99f. 
Sep 12 22:54:08.079493 containerd[1640]: time="2025-09-12T22:54:08.079419496Z" level=info msg="StartContainer for \"d6701be6b3a52ce46b6006b85ef5bed5dcd98e70c26b118225c1581ae724f99f\" returns successfully" Sep 12 22:54:08.111002 kubelet[2938]: I0912 22:54:08.110905 2938 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-57c58cd6b4-k9wjk" podStartSLOduration=26.51233666 podStartE2EDuration="30.110892045s" podCreationTimestamp="2025-09-12 22:53:38 +0000 UTC" firstStartedPulling="2025-09-12 22:54:03.943424701 +0000 UTC m=+42.675929274" lastFinishedPulling="2025-09-12 22:54:07.541980086 +0000 UTC m=+46.274484659" observedRunningTime="2025-09-12 22:54:08.101287681 +0000 UTC m=+46.833792256" watchObservedRunningTime="2025-09-12 22:54:08.110892045 +0000 UTC m=+46.843396628" Sep 12 22:54:08.111499 kubelet[2938]: I0912 22:54:08.111379 2938 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-57c58cd6b4-bbb9r" podStartSLOduration=27.042159664 podStartE2EDuration="30.111372495s" podCreationTimestamp="2025-09-12 22:53:38 +0000 UTC" firstStartedPulling="2025-09-12 22:54:04.891535352 +0000 UTC m=+43.624039925" lastFinishedPulling="2025-09-12 22:54:07.960748181 +0000 UTC m=+46.693252756" observedRunningTime="2025-09-12 22:54:08.109297345 +0000 UTC m=+46.841801928" watchObservedRunningTime="2025-09-12 22:54:08.111372495 +0000 UTC m=+46.843877077" Sep 12 22:54:08.646743 containerd[1640]: time="2025-09-12T22:54:08.646703971Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-c4b5b494-fw8d5,Uid:96c365fa-2018-4f8d-9c8f-e2c34613731b,Namespace:calico-system,Attempt:0,}" Sep 12 22:54:08.967642 systemd-networkd[1321]: cali5a31e12da8d: Link UP Sep 12 22:54:08.968308 systemd-networkd[1321]: cali5a31e12da8d: Gained carrier Sep 12 22:54:09.091337 containerd[1640]: 2025-09-12 22:54:08.779 [INFO][5071] cni-plugin/plugin.go 340: Calico CNI found existing 
endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--c4b5b494--fw8d5-eth0 calico-kube-controllers-c4b5b494- calico-system 96c365fa-2018-4f8d-9c8f-e2c34613731b 809 0 2025-09-12 22:53:40 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:c4b5b494 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-c4b5b494-fw8d5 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali5a31e12da8d [] [] }} ContainerID="f6263bb900aa92d4b59c04c822b31c0ac6ee685ec2cd304db3f37117a119497c" Namespace="calico-system" Pod="calico-kube-controllers-c4b5b494-fw8d5" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--c4b5b494--fw8d5-" Sep 12 22:54:09.091337 containerd[1640]: 2025-09-12 22:54:08.779 [INFO][5071] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f6263bb900aa92d4b59c04c822b31c0ac6ee685ec2cd304db3f37117a119497c" Namespace="calico-system" Pod="calico-kube-controllers-c4b5b494-fw8d5" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--c4b5b494--fw8d5-eth0" Sep 12 22:54:09.091337 containerd[1640]: 2025-09-12 22:54:08.879 [INFO][5084] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f6263bb900aa92d4b59c04c822b31c0ac6ee685ec2cd304db3f37117a119497c" HandleID="k8s-pod-network.f6263bb900aa92d4b59c04c822b31c0ac6ee685ec2cd304db3f37117a119497c" Workload="localhost-k8s-calico--kube--controllers--c4b5b494--fw8d5-eth0" Sep 12 22:54:09.091337 containerd[1640]: 2025-09-12 22:54:08.879 [INFO][5084] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f6263bb900aa92d4b59c04c822b31c0ac6ee685ec2cd304db3f37117a119497c" HandleID="k8s-pod-network.f6263bb900aa92d4b59c04c822b31c0ac6ee685ec2cd304db3f37117a119497c" 
Workload="localhost-k8s-calico--kube--controllers--c4b5b494--fw8d5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5600), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-c4b5b494-fw8d5", "timestamp":"2025-09-12 22:54:08.879354305 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:54:09.091337 containerd[1640]: 2025-09-12 22:54:08.879 [INFO][5084] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:54:09.091337 containerd[1640]: 2025-09-12 22:54:08.879 [INFO][5084] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 22:54:09.091337 containerd[1640]: 2025-09-12 22:54:08.879 [INFO][5084] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 22:54:09.091337 containerd[1640]: 2025-09-12 22:54:08.905 [INFO][5084] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f6263bb900aa92d4b59c04c822b31c0ac6ee685ec2cd304db3f37117a119497c" host="localhost" Sep 12 22:54:09.091337 containerd[1640]: 2025-09-12 22:54:08.916 [INFO][5084] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 22:54:09.091337 containerd[1640]: 2025-09-12 22:54:08.923 [INFO][5084] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 22:54:09.091337 containerd[1640]: 2025-09-12 22:54:08.928 [INFO][5084] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 22:54:09.091337 containerd[1640]: 2025-09-12 22:54:08.932 [INFO][5084] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 22:54:09.091337 containerd[1640]: 2025-09-12 22:54:08.932 [INFO][5084] ipam/ipam.go 1220: Attempting to assign 1 addresses 
from block block=192.168.88.128/26 handle="k8s-pod-network.f6263bb900aa92d4b59c04c822b31c0ac6ee685ec2cd304db3f37117a119497c" host="localhost" Sep 12 22:54:09.091337 containerd[1640]: 2025-09-12 22:54:08.933 [INFO][5084] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f6263bb900aa92d4b59c04c822b31c0ac6ee685ec2cd304db3f37117a119497c Sep 12 22:54:09.091337 containerd[1640]: 2025-09-12 22:54:08.938 [INFO][5084] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.f6263bb900aa92d4b59c04c822b31c0ac6ee685ec2cd304db3f37117a119497c" host="localhost" Sep 12 22:54:09.091337 containerd[1640]: 2025-09-12 22:54:08.958 [INFO][5084] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 handle="k8s-pod-network.f6263bb900aa92d4b59c04c822b31c0ac6ee685ec2cd304db3f37117a119497c" host="localhost" Sep 12 22:54:09.091337 containerd[1640]: 2025-09-12 22:54:08.959 [INFO][5084] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.f6263bb900aa92d4b59c04c822b31c0ac6ee685ec2cd304db3f37117a119497c" host="localhost" Sep 12 22:54:09.091337 containerd[1640]: 2025-09-12 22:54:08.959 [INFO][5084] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 22:54:09.091337 containerd[1640]: 2025-09-12 22:54:08.959 [INFO][5084] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="f6263bb900aa92d4b59c04c822b31c0ac6ee685ec2cd304db3f37117a119497c" HandleID="k8s-pod-network.f6263bb900aa92d4b59c04c822b31c0ac6ee685ec2cd304db3f37117a119497c" Workload="localhost-k8s-calico--kube--controllers--c4b5b494--fw8d5-eth0" Sep 12 22:54:09.091817 containerd[1640]: 2025-09-12 22:54:08.962 [INFO][5071] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f6263bb900aa92d4b59c04c822b31c0ac6ee685ec2cd304db3f37117a119497c" Namespace="calico-system" Pod="calico-kube-controllers-c4b5b494-fw8d5" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--c4b5b494--fw8d5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--c4b5b494--fw8d5-eth0", GenerateName:"calico-kube-controllers-c4b5b494-", Namespace:"calico-system", SelfLink:"", UID:"96c365fa-2018-4f8d-9c8f-e2c34613731b", ResourceVersion:"809", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 53, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"c4b5b494", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-c4b5b494-fw8d5", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.136/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali5a31e12da8d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:54:09.091817 containerd[1640]: 2025-09-12 22:54:08.962 [INFO][5071] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="f6263bb900aa92d4b59c04c822b31c0ac6ee685ec2cd304db3f37117a119497c" Namespace="calico-system" Pod="calico-kube-controllers-c4b5b494-fw8d5" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--c4b5b494--fw8d5-eth0" Sep 12 22:54:09.091817 containerd[1640]: 2025-09-12 22:54:08.962 [INFO][5071] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5a31e12da8d ContainerID="f6263bb900aa92d4b59c04c822b31c0ac6ee685ec2cd304db3f37117a119497c" Namespace="calico-system" Pod="calico-kube-controllers-c4b5b494-fw8d5" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--c4b5b494--fw8d5-eth0" Sep 12 22:54:09.091817 containerd[1640]: 2025-09-12 22:54:08.968 [INFO][5071] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f6263bb900aa92d4b59c04c822b31c0ac6ee685ec2cd304db3f37117a119497c" Namespace="calico-system" Pod="calico-kube-controllers-c4b5b494-fw8d5" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--c4b5b494--fw8d5-eth0" Sep 12 22:54:09.091817 containerd[1640]: 2025-09-12 22:54:08.968 [INFO][5071] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f6263bb900aa92d4b59c04c822b31c0ac6ee685ec2cd304db3f37117a119497c" Namespace="calico-system" Pod="calico-kube-controllers-c4b5b494-fw8d5" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--c4b5b494--fw8d5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, 
ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--c4b5b494--fw8d5-eth0", GenerateName:"calico-kube-controllers-c4b5b494-", Namespace:"calico-system", SelfLink:"", UID:"96c365fa-2018-4f8d-9c8f-e2c34613731b", ResourceVersion:"809", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 53, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"c4b5b494", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"f6263bb900aa92d4b59c04c822b31c0ac6ee685ec2cd304db3f37117a119497c", Pod:"calico-kube-controllers-c4b5b494-fw8d5", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali5a31e12da8d", MAC:"da:86:92:54:d0:eb", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:54:09.091817 containerd[1640]: 2025-09-12 22:54:08.983 [INFO][5071] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f6263bb900aa92d4b59c04c822b31c0ac6ee685ec2cd304db3f37117a119497c" Namespace="calico-system" Pod="calico-kube-controllers-c4b5b494-fw8d5" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--c4b5b494--fw8d5-eth0" Sep 12 22:54:09.117995 kubelet[2938]: I0912 22:54:09.117974 2938 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 
22:54:09.118888 kubelet[2938]: I0912 22:54:09.118177 2938 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 22:54:09.159187 containerd[1640]: time="2025-09-12T22:54:09.159127465Z" level=info msg="connecting to shim f6263bb900aa92d4b59c04c822b31c0ac6ee685ec2cd304db3f37117a119497c" address="unix:///run/containerd/s/e4664a9267e9963b92bf4e8fe46a8b05d879837a439946de883c75c1e1c9ec68" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:54:09.186341 systemd[1]: Started cri-containerd-f6263bb900aa92d4b59c04c822b31c0ac6ee685ec2cd304db3f37117a119497c.scope - libcontainer container f6263bb900aa92d4b59c04c822b31c0ac6ee685ec2cd304db3f37117a119497c. Sep 12 22:54:09.264305 systemd-resolved[1560]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 22:54:09.379563 containerd[1640]: time="2025-09-12T22:54:09.379033340Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-c4b5b494-fw8d5,Uid:96c365fa-2018-4f8d-9c8f-e2c34613731b,Namespace:calico-system,Attempt:0,} returns sandbox id \"f6263bb900aa92d4b59c04c822b31c0ac6ee685ec2cd304db3f37117a119497c\"" Sep 12 22:54:09.400777 systemd-networkd[1321]: vxlan.calico: Link UP Sep 12 22:54:09.401804 systemd-networkd[1321]: vxlan.calico: Gained carrier Sep 12 22:54:09.566182 systemd-networkd[1321]: calif30872df818: Gained IPv6LL Sep 12 22:54:10.718261 systemd-networkd[1321]: cali5a31e12da8d: Gained IPv6LL Sep 12 22:54:10.782240 systemd-networkd[1321]: vxlan.calico: Gained IPv6LL Sep 12 22:54:11.979344 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3032899951.mount: Deactivated successfully. 
Sep 12 22:54:13.403306 containerd[1640]: time="2025-09-12T22:54:13.403197059Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:54:13.426669 containerd[1640]: time="2025-09-12T22:54:13.426499006Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 12 22:54:13.429949 containerd[1640]: time="2025-09-12T22:54:13.429929697Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:54:13.431730 containerd[1640]: time="2025-09-12T22:54:13.431704462Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:54:13.434483 containerd[1640]: time="2025-09-12T22:54:13.434459031Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 5.470570946s" Sep 12 22:54:13.435180 containerd[1640]: time="2025-09-12T22:54:13.434488054Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 12 22:54:13.439311 containerd[1640]: time="2025-09-12T22:54:13.439163490Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 12 22:54:13.440138 containerd[1640]: time="2025-09-12T22:54:13.439170935Z" level=info msg="CreateContainer within sandbox \"764fd924bbbe4386d3dd2524f949e80730aeea46f7ad9ca1933cd53e752e21e6\" 
for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 12 22:54:13.468350 containerd[1640]: time="2025-09-12T22:54:13.468324016Z" level=info msg="Container e0f09bf51bc0c9d4cf6d90391393673ad48ef389849ed284aec4738a82142fcc: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:54:13.474292 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3645159549.mount: Deactivated successfully. Sep 12 22:54:13.495624 containerd[1640]: time="2025-09-12T22:54:13.495602437Z" level=info msg="CreateContainer within sandbox \"764fd924bbbe4386d3dd2524f949e80730aeea46f7ad9ca1933cd53e752e21e6\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"e0f09bf51bc0c9d4cf6d90391393673ad48ef389849ed284aec4738a82142fcc\"" Sep 12 22:54:13.497563 containerd[1640]: time="2025-09-12T22:54:13.496149521Z" level=info msg="StartContainer for \"e0f09bf51bc0c9d4cf6d90391393673ad48ef389849ed284aec4738a82142fcc\"" Sep 12 22:54:13.500101 containerd[1640]: time="2025-09-12T22:54:13.500030050Z" level=info msg="connecting to shim e0f09bf51bc0c9d4cf6d90391393673ad48ef389849ed284aec4738a82142fcc" address="unix:///run/containerd/s/b4e27d0179dd670934ed95065254630e3d56bf2337a611b3b321bf145313d85d" protocol=ttrpc version=3 Sep 12 22:54:13.620122 systemd[1]: Started cri-containerd-e0f09bf51bc0c9d4cf6d90391393673ad48ef389849ed284aec4738a82142fcc.scope - libcontainer container e0f09bf51bc0c9d4cf6d90391393673ad48ef389849ed284aec4738a82142fcc. 
Sep 12 22:54:13.679104 containerd[1640]: time="2025-09-12T22:54:13.678442690Z" level=info msg="StartContainer for \"e0f09bf51bc0c9d4cf6d90391393673ad48ef389849ed284aec4738a82142fcc\" returns successfully"
Sep 12 22:54:14.246080 kubelet[2938]: I0912 22:54:14.246022 2938 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-7988f88666-vzktr" podStartSLOduration=26.976050559 podStartE2EDuration="34.246009298s" podCreationTimestamp="2025-09-12 22:53:40 +0000 UTC" firstStartedPulling="2025-09-12 22:54:06.165467218 +0000 UTC m=+44.897971791" lastFinishedPulling="2025-09-12 22:54:13.435425957 +0000 UTC m=+52.167930530" observedRunningTime="2025-09-12 22:54:14.233144941 +0000 UTC m=+52.965649515" watchObservedRunningTime="2025-09-12 22:54:14.246009298 +0000 UTC m=+52.978513875"
Sep 12 22:54:14.305479 containerd[1640]: time="2025-09-12T22:54:14.305453425Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e0f09bf51bc0c9d4cf6d90391393673ad48ef389849ed284aec4738a82142fcc\" id:\"a57e42207fa3e1479a9f62e852c57234bd7311841ae7a851057cd166f2ff5db3\" pid:5316 exit_status:1 exited_at:{seconds:1757717654 nanos:299150891}"
Sep 12 22:54:14.786499 containerd[1640]: time="2025-09-12T22:54:14.786472641Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:54:14.787293 containerd[1640]: time="2025-09-12T22:54:14.787261210Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527"
Sep 12 22:54:14.788316 containerd[1640]: time="2025-09-12T22:54:14.787433859Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:54:14.788873 containerd[1640]: time="2025-09-12T22:54:14.788754527Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:54:14.789581 containerd[1640]: time="2025-09-12T22:54:14.789517227Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 1.350327178s"
Sep 12 22:54:14.789581 containerd[1640]: time="2025-09-12T22:54:14.789534132Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\""
Sep 12 22:54:14.801232 containerd[1640]: time="2025-09-12T22:54:14.801204807Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\""
Sep 12 22:54:14.802203 containerd[1640]: time="2025-09-12T22:54:14.802111853Z" level=info msg="CreateContainer within sandbox \"6d38db528f811b2aff79d34c1b06a7ed8e16a3b2070f01266ccc4393e4e94cfc\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
Sep 12 22:54:14.814543 containerd[1640]: time="2025-09-12T22:54:14.814526300Z" level=info msg="Container 6a94140b51855893c2fde1b0fe16edba03810cf955a49e387c39a98277d8d735: CDI devices from CRI Config.CDIDevices: []"
Sep 12 22:54:14.833230 containerd[1640]: time="2025-09-12T22:54:14.833166575Z" level=info msg="CreateContainer within sandbox \"6d38db528f811b2aff79d34c1b06a7ed8e16a3b2070f01266ccc4393e4e94cfc\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"6a94140b51855893c2fde1b0fe16edba03810cf955a49e387c39a98277d8d735\""
Sep 12 22:54:14.834207 containerd[1640]: time="2025-09-12T22:54:14.834196557Z" level=info msg="StartContainer for \"6a94140b51855893c2fde1b0fe16edba03810cf955a49e387c39a98277d8d735\""
Sep 12 22:54:14.835660 containerd[1640]: time="2025-09-12T22:54:14.835648424Z" level=info msg="connecting to shim 6a94140b51855893c2fde1b0fe16edba03810cf955a49e387c39a98277d8d735" address="unix:///run/containerd/s/ba50f186f49219d0c61fe768381890c0c3e816e818710b130cc7a01783574e8f" protocol=ttrpc version=3
Sep 12 22:54:14.854178 systemd[1]: Started cri-containerd-6a94140b51855893c2fde1b0fe16edba03810cf955a49e387c39a98277d8d735.scope - libcontainer container 6a94140b51855893c2fde1b0fe16edba03810cf955a49e387c39a98277d8d735.
Sep 12 22:54:14.885333 containerd[1640]: time="2025-09-12T22:54:14.885298262Z" level=info msg="StartContainer for \"6a94140b51855893c2fde1b0fe16edba03810cf955a49e387c39a98277d8d735\" returns successfully"
Sep 12 22:54:15.244134 containerd[1640]: time="2025-09-12T22:54:15.244096700Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e0f09bf51bc0c9d4cf6d90391393673ad48ef389849ed284aec4738a82142fcc\" id:\"f015e812d979424ceb33d6796913a4dba4c445710de4e2cb18c6d95474cb0ba5\" pid:5382 exit_status:1 exited_at:{seconds:1757717655 nanos:243860669}"
Sep 12 22:54:16.264691 containerd[1640]: time="2025-09-12T22:54:16.264665316Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e0f09bf51bc0c9d4cf6d90391393673ad48ef389849ed284aec4738a82142fcc\" id:\"3c3cbeae9915c297b87d083e2ecb6421cde370bb87de2b4ae4d8373b9da70d28\" pid:5405 exited_at:{seconds:1757717656 nanos:264353978}"
Sep 12 22:54:18.563999 containerd[1640]: time="2025-09-12T22:54:18.563958123Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:54:18.568479 containerd[1640]: time="2025-09-12T22:54:18.568457564Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746"
Sep 12 22:54:18.570618 containerd[1640]: time="2025-09-12T22:54:18.570090671Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:54:18.572101 containerd[1640]: time="2025-09-12T22:54:18.572080665Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:54:18.572735 containerd[1640]: time="2025-09-12T22:54:18.572717111Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 3.771483673s"
Sep 12 22:54:18.572819 containerd[1640]: time="2025-09-12T22:54:18.572807517Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\""
Sep 12 22:54:18.576372 containerd[1640]: time="2025-09-12T22:54:18.576352949Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\""
Sep 12 22:54:18.671193 containerd[1640]: time="2025-09-12T22:54:18.670822270Z" level=info msg="CreateContainer within sandbox \"f6263bb900aa92d4b59c04c822b31c0ac6ee685ec2cd304db3f37117a119497c\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Sep 12 22:54:18.675229 containerd[1640]: time="2025-09-12T22:54:18.675212041Z" level=info msg="Container a8aded671c30bc0ad86d95bb69384bf6c10a6bda93ec2cfa309c0039cc8f10c3: CDI devices from CRI Config.CDIDevices: []"
Sep 12 22:54:18.678957 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3262116238.mount: Deactivated successfully.
Sep 12 22:54:18.689053 containerd[1640]: time="2025-09-12T22:54:18.688958032Z" level=info msg="CreateContainer within sandbox \"f6263bb900aa92d4b59c04c822b31c0ac6ee685ec2cd304db3f37117a119497c\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"a8aded671c30bc0ad86d95bb69384bf6c10a6bda93ec2cfa309c0039cc8f10c3\""
Sep 12 22:54:18.689781 containerd[1640]: time="2025-09-12T22:54:18.689416362Z" level=info msg="StartContainer for \"a8aded671c30bc0ad86d95bb69384bf6c10a6bda93ec2cfa309c0039cc8f10c3\""
Sep 12 22:54:18.689999 containerd[1640]: time="2025-09-12T22:54:18.689982349Z" level=info msg="connecting to shim a8aded671c30bc0ad86d95bb69384bf6c10a6bda93ec2cfa309c0039cc8f10c3" address="unix:///run/containerd/s/e4664a9267e9963b92bf4e8fe46a8b05d879837a439946de883c75c1e1c9ec68" protocol=ttrpc version=3
Sep 12 22:54:18.714139 systemd[1]: Started cri-containerd-a8aded671c30bc0ad86d95bb69384bf6c10a6bda93ec2cfa309c0039cc8f10c3.scope - libcontainer container a8aded671c30bc0ad86d95bb69384bf6c10a6bda93ec2cfa309c0039cc8f10c3.
Sep 12 22:54:18.771710 containerd[1640]: time="2025-09-12T22:54:18.771648898Z" level=info msg="StartContainer for \"a8aded671c30bc0ad86d95bb69384bf6c10a6bda93ec2cfa309c0039cc8f10c3\" returns successfully"
Sep 12 22:54:19.528084 kubelet[2938]: I0912 22:54:19.526307 2938 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-c4b5b494-fw8d5" podStartSLOduration=30.324501461 podStartE2EDuration="39.520034846s" podCreationTimestamp="2025-09-12 22:53:40 +0000 UTC" firstStartedPulling="2025-09-12 22:54:09.380651092 +0000 UTC m=+48.113155665" lastFinishedPulling="2025-09-12 22:54:18.576184472 +0000 UTC m=+57.308689050" observedRunningTime="2025-09-12 22:54:19.517320828 +0000 UTC m=+58.249825411" watchObservedRunningTime="2025-09-12 22:54:19.520034846 +0000 UTC m=+58.252539421"
Sep 12 22:54:19.556790 containerd[1640]: time="2025-09-12T22:54:19.556738190Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a8aded671c30bc0ad86d95bb69384bf6c10a6bda93ec2cfa309c0039cc8f10c3\" id:\"feef3397a21270b16ef7fa4eeaf8d41f657f1f4a7c5e05c180cfb999c025f0aa\" pid:5478 exited_at:{seconds:1757717659 nanos:556266782}"
Sep 12 22:54:20.356192 containerd[1640]: time="2025-09-12T22:54:20.356155768Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:54:20.359299 containerd[1640]: time="2025-09-12T22:54:20.359177919Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542"
Sep 12 22:54:20.361486 containerd[1640]: time="2025-09-12T22:54:20.361470306Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:54:20.365750 containerd[1640]: time="2025-09-12T22:54:20.365734337Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:54:20.366110 containerd[1640]: time="2025-09-12T22:54:20.366095011Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 1.789720471s"
Sep 12 22:54:20.366164 containerd[1640]: time="2025-09-12T22:54:20.366154672Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\""
Sep 12 22:54:20.370604 containerd[1640]: time="2025-09-12T22:54:20.368144700Z" level=info msg="CreateContainer within sandbox \"6d38db528f811b2aff79d34c1b06a7ed8e16a3b2070f01266ccc4393e4e94cfc\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Sep 12 22:54:20.403067 containerd[1640]: time="2025-09-12T22:54:20.402089432Z" level=info msg="Container 5a7e3468d37ca0dbe63be4ce128966a7b8dda9655ee07cfb455274a63ee82d5d: CDI devices from CRI Config.CDIDevices: []"
Sep 12 22:54:20.448859 containerd[1640]: time="2025-09-12T22:54:20.448793193Z" level=info msg="CreateContainer within sandbox \"6d38db528f811b2aff79d34c1b06a7ed8e16a3b2070f01266ccc4393e4e94cfc\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"5a7e3468d37ca0dbe63be4ce128966a7b8dda9655ee07cfb455274a63ee82d5d\""
Sep 12 22:54:20.450841 containerd[1640]: time="2025-09-12T22:54:20.449271726Z" level=info msg="StartContainer for \"5a7e3468d37ca0dbe63be4ce128966a7b8dda9655ee07cfb455274a63ee82d5d\""
Sep 12 22:54:20.450841 containerd[1640]: time="2025-09-12T22:54:20.450275713Z" level=info msg="connecting to shim 5a7e3468d37ca0dbe63be4ce128966a7b8dda9655ee07cfb455274a63ee82d5d" address="unix:///run/containerd/s/ba50f186f49219d0c61fe768381890c0c3e816e818710b130cc7a01783574e8f" protocol=ttrpc version=3
Sep 12 22:54:20.469140 systemd[1]: Started cri-containerd-5a7e3468d37ca0dbe63be4ce128966a7b8dda9655ee07cfb455274a63ee82d5d.scope - libcontainer container 5a7e3468d37ca0dbe63be4ce128966a7b8dda9655ee07cfb455274a63ee82d5d.
Sep 12 22:54:20.525534 containerd[1640]: time="2025-09-12T22:54:20.525503035Z" level=info msg="StartContainer for \"5a7e3468d37ca0dbe63be4ce128966a7b8dda9655ee07cfb455274a63ee82d5d\" returns successfully"
Sep 12 22:54:20.912697 containerd[1640]: time="2025-09-12T22:54:20.912640271Z" level=info msg="TaskExit event in podsandbox handler container_id:\"dfc288d9eeb26cb6405bbc5084dca45b6ba259b6a99c604854e28134cf15c289\" id:\"f9395b9c61a273c3231f1401e859c063840e0539c8f71e67fb52fe5fd02011a4\" pid:5542 exited_at:{seconds:1757717660 nanos:912273688}"
Sep 12 22:54:20.927769 kubelet[2938]: I0912 22:54:20.911229 2938 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Sep 12 22:54:20.930083 kubelet[2938]: I0912 22:54:20.929686 2938 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Sep 12 22:54:22.928827 containerd[1640]: time="2025-09-12T22:54:22.928800315Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a8aded671c30bc0ad86d95bb69384bf6c10a6bda93ec2cfa309c0039cc8f10c3\" id:\"f656813924d5bc274f27666bd017315a5d53d88f452ed400f8588fa46bdbddd8\" pid:5568 exited_at:{seconds:1757717662 nanos:928604783}"
Sep 12 22:54:23.046562 containerd[1640]: time="2025-09-12T22:54:23.046507354Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e0f09bf51bc0c9d4cf6d90391393673ad48ef389849ed284aec4738a82142fcc\" id:\"d5ad3c8315b503a598ea86e51b56d29d36887d77c3771e009dbf81a42722ba09\" pid:5591 exited_at:{seconds:1757717663 nanos:46206147}"
Sep 12 22:54:33.049824 containerd[1640]: time="2025-09-12T22:54:33.049794410Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a8aded671c30bc0ad86d95bb69384bf6c10a6bda93ec2cfa309c0039cc8f10c3\" id:\"f68b85f612a898a477fbf8c4197f3b6688b9eb1150c426ef38a1cdac00769e76\" pid:5627 exited_at:{seconds:1757717673 nanos:43822740}"
Sep 12 22:54:39.802896 systemd[1]: Started sshd@7-139.178.70.110:22-147.75.109.163:49786.service - OpenSSH per-connection server daemon (147.75.109.163:49786).
Sep 12 22:54:40.035707 sshd[5643]: Accepted publickey for core from 147.75.109.163 port 49786 ssh2: RSA SHA256:tW2vbxPK/u/o8XyjCJxMeIgPoVf9em1FudQ+OeUFsoY
Sep 12 22:54:40.038910 sshd-session[5643]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:54:40.048078 systemd-logind[1620]: New session 10 of user core.
Sep 12 22:54:40.053023 systemd[1]: Started session-10.scope - Session 10 of User core.
Sep 12 22:54:40.806156 sshd[5646]: Connection closed by 147.75.109.163 port 49786
Sep 12 22:54:40.807593 sshd-session[5643]: pam_unix(sshd:session): session closed for user core
Sep 12 22:54:40.812809 systemd[1]: sshd@7-139.178.70.110:22-147.75.109.163:49786.service: Deactivated successfully.
Sep 12 22:54:40.814964 systemd[1]: session-10.scope: Deactivated successfully.
Sep 12 22:54:40.820462 systemd-logind[1620]: Session 10 logged out. Waiting for processes to exit.
Sep 12 22:54:40.821670 systemd-logind[1620]: Removed session 10.
Sep 12 22:54:43.092914 kubelet[2938]: I0912 22:54:43.074361 2938 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 12 22:54:43.388475 kubelet[2938]: I0912 22:54:43.388364 2938 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-7nf98" podStartSLOduration=50.929428659 podStartE2EDuration="1m3.3802668s" podCreationTimestamp="2025-09-12 22:53:40 +0000 UTC" firstStartedPulling="2025-09-12 22:54:07.915750234 +0000 UTC m=+46.648254807" lastFinishedPulling="2025-09-12 22:54:20.366588373 +0000 UTC m=+59.099092948" observedRunningTime="2025-09-12 22:54:21.512665856 +0000 UTC m=+60.245170433" watchObservedRunningTime="2025-09-12 22:54:43.3802668 +0000 UTC m=+82.112771382"
Sep 12 22:54:43.552172 kubelet[2938]: I0912 22:54:43.552138 2938 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 12 22:54:45.824305 systemd[1]: Started sshd@8-139.178.70.110:22-147.75.109.163:53704.service - OpenSSH per-connection server daemon (147.75.109.163:53704).
Sep 12 22:54:45.924812 sshd[5671]: Accepted publickey for core from 147.75.109.163 port 53704 ssh2: RSA SHA256:tW2vbxPK/u/o8XyjCJxMeIgPoVf9em1FudQ+OeUFsoY
Sep 12 22:54:45.926064 sshd-session[5671]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:54:45.929073 systemd-logind[1620]: New session 11 of user core.
Sep 12 22:54:45.934129 systemd[1]: Started session-11.scope - Session 11 of User core.
Sep 12 22:54:46.218647 sshd[5674]: Connection closed by 147.75.109.163 port 53704
Sep 12 22:54:46.221198 sshd-session[5671]: pam_unix(sshd:session): session closed for user core
Sep 12 22:54:46.225052 systemd[1]: sshd@8-139.178.70.110:22-147.75.109.163:53704.service: Deactivated successfully.
Sep 12 22:54:46.226300 systemd[1]: session-11.scope: Deactivated successfully.
Sep 12 22:54:46.226844 systemd-logind[1620]: Session 11 logged out. Waiting for processes to exit.
Sep 12 22:54:46.227618 systemd-logind[1620]: Removed session 11.
Sep 12 22:54:51.233427 systemd[1]: Started sshd@9-139.178.70.110:22-147.75.109.163:42812.service - OpenSSH per-connection server daemon (147.75.109.163:42812).
Sep 12 22:54:51.588023 sshd[5719]: Accepted publickey for core from 147.75.109.163 port 42812 ssh2: RSA SHA256:tW2vbxPK/u/o8XyjCJxMeIgPoVf9em1FudQ+OeUFsoY
Sep 12 22:54:51.590236 sshd-session[5719]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:54:51.601680 systemd-logind[1620]: New session 12 of user core.
Sep 12 22:54:51.606197 systemd[1]: Started session-12.scope - Session 12 of User core.
Sep 12 22:54:51.901233 containerd[1640]: time="2025-09-12T22:54:51.901106426Z" level=info msg="TaskExit event in podsandbox handler container_id:\"dfc288d9eeb26cb6405bbc5084dca45b6ba259b6a99c604854e28134cf15c289\" id:\"90dce03b95cac5eb362fd94d359e5496ef9c7985377645a8e0d18af0c064978b\" pid:5708 exited_at:{seconds:1757717691 nanos:801144916}"
Sep 12 22:54:52.015374 sshd[5722]: Connection closed by 147.75.109.163 port 42812
Sep 12 22:54:52.016424 sshd-session[5719]: pam_unix(sshd:session): session closed for user core
Sep 12 22:54:52.024022 systemd[1]: Started sshd@10-139.178.70.110:22-147.75.109.163:42826.service - OpenSSH per-connection server daemon (147.75.109.163:42826).
Sep 12 22:54:52.024572 systemd[1]: sshd@9-139.178.70.110:22-147.75.109.163:42812.service: Deactivated successfully.
Sep 12 22:54:52.027014 systemd[1]: session-12.scope: Deactivated successfully.
Sep 12 22:54:52.028602 systemd-logind[1620]: Session 12 logged out. Waiting for processes to exit.
Sep 12 22:54:52.030748 systemd-logind[1620]: Removed session 12.
Sep 12 22:54:52.107665 sshd[5733]: Accepted publickey for core from 147.75.109.163 port 42826 ssh2: RSA SHA256:tW2vbxPK/u/o8XyjCJxMeIgPoVf9em1FudQ+OeUFsoY
Sep 12 22:54:52.108464 sshd-session[5733]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:54:52.111227 systemd-logind[1620]: New session 13 of user core.
Sep 12 22:54:52.114120 systemd[1]: Started session-13.scope - Session 13 of User core.
Sep 12 22:54:52.285902 sshd[5739]: Connection closed by 147.75.109.163 port 42826
Sep 12 22:54:52.287079 sshd-session[5733]: pam_unix(sshd:session): session closed for user core
Sep 12 22:54:52.294007 systemd[1]: sshd@10-139.178.70.110:22-147.75.109.163:42826.service: Deactivated successfully.
Sep 12 22:54:52.297435 systemd[1]: session-13.scope: Deactivated successfully.
Sep 12 22:54:52.299447 systemd-logind[1620]: Session 13 logged out. Waiting for processes to exit.
Sep 12 22:54:52.303868 systemd[1]: Started sshd@11-139.178.70.110:22-147.75.109.163:42834.service - OpenSSH per-connection server daemon (147.75.109.163:42834).
Sep 12 22:54:52.306354 systemd-logind[1620]: Removed session 13.
Sep 12 22:54:52.379653 sshd[5748]: Accepted publickey for core from 147.75.109.163 port 42834 ssh2: RSA SHA256:tW2vbxPK/u/o8XyjCJxMeIgPoVf9em1FudQ+OeUFsoY
Sep 12 22:54:52.382298 sshd-session[5748]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:54:52.391918 systemd-logind[1620]: New session 14 of user core.
Sep 12 22:54:52.397175 systemd[1]: Started session-14.scope - Session 14 of User core.
Sep 12 22:54:52.582956 sshd[5751]: Connection closed by 147.75.109.163 port 42834
Sep 12 22:54:52.582350 sshd-session[5748]: pam_unix(sshd:session): session closed for user core
Sep 12 22:54:52.584966 systemd-logind[1620]: Session 14 logged out. Waiting for processes to exit.
Sep 12 22:54:52.585128 systemd[1]: sshd@11-139.178.70.110:22-147.75.109.163:42834.service: Deactivated successfully.
Sep 12 22:54:52.586603 systemd[1]: session-14.scope: Deactivated successfully.
Sep 12 22:54:52.589144 systemd-logind[1620]: Removed session 14.
Sep 12 22:54:52.979752 containerd[1640]: time="2025-09-12T22:54:52.979724760Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a8aded671c30bc0ad86d95bb69384bf6c10a6bda93ec2cfa309c0039cc8f10c3\" id:\"0f5ed3febaacac026b9f465834b9d651b203c33614d433f8890e22d71d71cd79\" pid:5774 exited_at:{seconds:1757717692 nanos:979401262}"
Sep 12 22:54:54.565903 containerd[1640]: time="2025-09-12T22:54:54.565850553Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e0f09bf51bc0c9d4cf6d90391393673ad48ef389849ed284aec4738a82142fcc\" id:\"d8d2804562958ba83270e37ef640cce0cec1f66603993ae7bccd9b0c47d889d1\" pid:5793 exited_at:{seconds:1757717694 nanos:565622188}"
Sep 12 22:54:56.514401 containerd[1640]: time="2025-09-12T22:54:56.514367655Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e0f09bf51bc0c9d4cf6d90391393673ad48ef389849ed284aec4738a82142fcc\" id:\"df6bb6f45d66b6d107b77ebf69a2c69cd4926f8cb4aafdce7c014c1b88203184\" pid:5821 exited_at:{seconds:1757717696 nanos:514072207}"
Sep 12 22:54:57.594654 systemd[1]: Started sshd@12-139.178.70.110:22-147.75.109.163:42842.service - OpenSSH per-connection server daemon (147.75.109.163:42842).
Sep 12 22:54:57.922373 sshd[5836]: Accepted publickey for core from 147.75.109.163 port 42842 ssh2: RSA SHA256:tW2vbxPK/u/o8XyjCJxMeIgPoVf9em1FudQ+OeUFsoY
Sep 12 22:54:57.925349 sshd-session[5836]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:54:57.930551 systemd-logind[1620]: New session 15 of user core.
Sep 12 22:54:57.935186 systemd[1]: Started session-15.scope - Session 15 of User core.
Sep 12 22:54:58.901221 sshd[5839]: Connection closed by 147.75.109.163 port 42842
Sep 12 22:54:58.904590 sshd-session[5836]: pam_unix(sshd:session): session closed for user core
Sep 12 22:54:58.927589 systemd-logind[1620]: Session 15 logged out. Waiting for processes to exit.
Sep 12 22:54:58.927613 systemd[1]: sshd@12-139.178.70.110:22-147.75.109.163:42842.service: Deactivated successfully.
Sep 12 22:54:58.928901 systemd[1]: session-15.scope: Deactivated successfully.
Sep 12 22:54:58.931112 systemd-logind[1620]: Removed session 15.
Sep 12 22:55:03.910547 systemd[1]: Started sshd@13-139.178.70.110:22-147.75.109.163:46264.service - OpenSSH per-connection server daemon (147.75.109.163:46264).
Sep 12 22:55:04.392344 sshd[5855]: Accepted publickey for core from 147.75.109.163 port 46264 ssh2: RSA SHA256:tW2vbxPK/u/o8XyjCJxMeIgPoVf9em1FudQ+OeUFsoY
Sep 12 22:55:04.417798 sshd-session[5855]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:55:04.425185 systemd-logind[1620]: New session 16 of user core.
Sep 12 22:55:04.429199 systemd[1]: Started session-16.scope - Session 16 of User core.
Sep 12 22:55:05.589627 sshd[5858]: Connection closed by 147.75.109.163 port 46264
Sep 12 22:55:05.590062 sshd-session[5855]: pam_unix(sshd:session): session closed for user core
Sep 12 22:55:05.594490 systemd[1]: sshd@13-139.178.70.110:22-147.75.109.163:46264.service: Deactivated successfully.
Sep 12 22:55:05.597679 systemd[1]: session-16.scope: Deactivated successfully.
Sep 12 22:55:05.598793 systemd-logind[1620]: Session 16 logged out. Waiting for processes to exit.
Sep 12 22:55:05.599841 systemd-logind[1620]: Removed session 16.
Sep 12 22:55:10.601687 systemd[1]: Started sshd@14-139.178.70.110:22-147.75.109.163:43490.service - OpenSSH per-connection server daemon (147.75.109.163:43490).
Sep 12 22:55:10.680483 sshd[5873]: Accepted publickey for core from 147.75.109.163 port 43490 ssh2: RSA SHA256:tW2vbxPK/u/o8XyjCJxMeIgPoVf9em1FudQ+OeUFsoY
Sep 12 22:55:10.682363 sshd-session[5873]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:55:10.689125 systemd-logind[1620]: New session 17 of user core.
Sep 12 22:55:10.695404 systemd[1]: Started session-17.scope - Session 17 of User core.
Sep 12 22:55:10.901611 sshd[5876]: Connection closed by 147.75.109.163 port 43490
Sep 12 22:55:10.902137 sshd-session[5873]: pam_unix(sshd:session): session closed for user core
Sep 12 22:55:10.910787 systemd[1]: sshd@14-139.178.70.110:22-147.75.109.163:43490.service: Deactivated successfully.
Sep 12 22:55:10.912975 systemd[1]: session-17.scope: Deactivated successfully.
Sep 12 22:55:10.914017 systemd-logind[1620]: Session 17 logged out. Waiting for processes to exit.
Sep 12 22:55:10.916004 systemd-logind[1620]: Removed session 17.
Sep 12 22:55:10.923884 systemd[1]: Started sshd@15-139.178.70.110:22-147.75.109.163:43506.service - OpenSSH per-connection server daemon (147.75.109.163:43506).
Sep 12 22:55:11.010891 sshd[5888]: Accepted publickey for core from 147.75.109.163 port 43506 ssh2: RSA SHA256:tW2vbxPK/u/o8XyjCJxMeIgPoVf9em1FudQ+OeUFsoY
Sep 12 22:55:11.012018 sshd-session[5888]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:55:11.015857 systemd-logind[1620]: New session 18 of user core.
Sep 12 22:55:11.022164 systemd[1]: Started session-18.scope - Session 18 of User core.
Sep 12 22:55:13.415237 sshd[5891]: Connection closed by 147.75.109.163 port 43506
Sep 12 22:55:13.417657 sshd-session[5888]: pam_unix(sshd:session): session closed for user core
Sep 12 22:55:13.425900 systemd[1]: Started sshd@16-139.178.70.110:22-147.75.109.163:43508.service - OpenSSH per-connection server daemon (147.75.109.163:43508).
Sep 12 22:55:13.431869 systemd-logind[1620]: Session 18 logged out. Waiting for processes to exit.
Sep 12 22:55:13.432220 systemd[1]: sshd@15-139.178.70.110:22-147.75.109.163:43506.service: Deactivated successfully.
Sep 12 22:55:13.434937 systemd[1]: session-18.scope: Deactivated successfully.
Sep 12 22:55:13.437124 systemd-logind[1620]: Removed session 18.
Sep 12 22:55:13.657410 sshd[5898]: Accepted publickey for core from 147.75.109.163 port 43508 ssh2: RSA SHA256:tW2vbxPK/u/o8XyjCJxMeIgPoVf9em1FudQ+OeUFsoY
Sep 12 22:55:13.662637 sshd-session[5898]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:55:13.675103 systemd-logind[1620]: New session 19 of user core.
Sep 12 22:55:13.680179 systemd[1]: Started session-19.scope - Session 19 of User core.
Sep 12 22:55:16.506195 sshd[5904]: Connection closed by 147.75.109.163 port 43508
Sep 12 22:55:16.536692 sshd-session[5898]: pam_unix(sshd:session): session closed for user core
Sep 12 22:55:16.598754 systemd[1]: Started sshd@17-139.178.70.110:22-147.75.109.163:43514.service - OpenSSH per-connection server daemon (147.75.109.163:43514).
Sep 12 22:55:16.599698 systemd[1]: sshd@16-139.178.70.110:22-147.75.109.163:43508.service: Deactivated successfully.
Sep 12 22:55:16.601518 systemd[1]: session-19.scope: Deactivated successfully.
Sep 12 22:55:16.601681 systemd[1]: session-19.scope: Consumed 428ms CPU time, 75M memory peak.
Sep 12 22:55:16.604182 systemd-logind[1620]: Session 19 logged out. Waiting for processes to exit.
Sep 12 22:55:16.608502 systemd-logind[1620]: Removed session 19.
Sep 12 22:55:16.811744 sshd[5925]: Accepted publickey for core from 147.75.109.163 port 43514 ssh2: RSA SHA256:tW2vbxPK/u/o8XyjCJxMeIgPoVf9em1FudQ+OeUFsoY
Sep 12 22:55:16.819329 sshd-session[5925]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:55:16.845582 systemd-logind[1620]: New session 20 of user core.
Sep 12 22:55:16.851336 systemd[1]: Started session-20.scope - Session 20 of User core.
Sep 12 22:55:18.865612 kubelet[2938]: E0912 22:55:18.855351 2938 kubelet.go:2512] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="3.001s"
Sep 12 22:55:22.747195 kubelet[2938]: E0912 22:55:21.999607 2938 kubelet.go:2512] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.271s"
Sep 12 22:55:23.789701 containerd[1640]: time="2025-09-12T22:55:23.789597901Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a8aded671c30bc0ad86d95bb69384bf6c10a6bda93ec2cfa309c0039cc8f10c3\" id:\"581a1c8c644390ff085100f3eb85a41c6d6bf075c3072abb577283bc5c6925c9\" pid:5987 exited_at:{seconds:1757717723 nanos:726523565}"
Sep 12 22:55:25.080027 containerd[1640]: time="2025-09-12T22:55:25.079981297Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e0f09bf51bc0c9d4cf6d90391393673ad48ef389849ed284aec4738a82142fcc\" id:\"b1201c2a0c3487cda8b1fd8cae623602edf5dc370ddef9bd62d097abf642cd17\" pid:5985 exited_at:{seconds:1757717725 nanos:51111452}"
Sep 12 22:55:25.195950 containerd[1640]: time="2025-09-12T22:55:25.195889248Z" level=info msg="TaskExit event in podsandbox handler container_id:\"dfc288d9eeb26cb6405bbc5084dca45b6ba259b6a99c604854e28134cf15c289\" id:\"9eba49585a12a3dfdc1118a39d6d628afd01af5ce0f651364d2180b43d12415d\" pid:6000 exited_at:{seconds:1757717725 nanos:195394541}"
Sep 12 22:55:25.411827 sshd[5934]: Connection closed by 147.75.109.163 port 43514
Sep 12 22:55:25.434154 sshd-session[5925]: pam_unix(sshd:session): session closed for user core
Sep 12 22:55:25.487290 systemd[1]: Started sshd@18-139.178.70.110:22-147.75.109.163:46892.service - OpenSSH per-connection server daemon (147.75.109.163:46892).
Sep 12 22:55:25.487603 systemd[1]: sshd@17-139.178.70.110:22-147.75.109.163:43514.service: Deactivated successfully.
Sep 12 22:55:25.490740 systemd[1]: session-20.scope: Deactivated successfully.
Sep 12 22:55:25.490866 systemd[1]: session-20.scope: Consumed 1.051s CPU time, 63.2M memory peak.
Sep 12 22:55:25.493970 systemd-logind[1620]: Session 20 logged out. Waiting for processes to exit.
Sep 12 22:55:25.496828 systemd-logind[1620]: Removed session 20.
Sep 12 22:55:25.671920 sshd[6023]: Accepted publickey for core from 147.75.109.163 port 46892 ssh2: RSA SHA256:tW2vbxPK/u/o8XyjCJxMeIgPoVf9em1FudQ+OeUFsoY
Sep 12 22:55:25.674198 sshd-session[6023]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:55:25.686231 systemd-logind[1620]: New session 21 of user core.
Sep 12 22:55:25.692181 systemd[1]: Started session-21.scope - Session 21 of User core.
Sep 12 22:55:26.309243 sshd[6029]: Connection closed by 147.75.109.163 port 46892
Sep 12 22:55:26.308757 sshd-session[6023]: pam_unix(sshd:session): session closed for user core
Sep 12 22:55:26.311321 systemd[1]: sshd@18-139.178.70.110:22-147.75.109.163:46892.service: Deactivated successfully.
Sep 12 22:55:26.312727 systemd[1]: session-21.scope: Deactivated successfully.
Sep 12 22:55:26.314466 systemd-logind[1620]: Session 21 logged out. Waiting for processes to exit.
Sep 12 22:55:26.315105 systemd-logind[1620]: Removed session 21.
Sep 12 22:55:31.319856 systemd[1]: Started sshd@19-139.178.70.110:22-147.75.109.163:35808.service - OpenSSH per-connection server daemon (147.75.109.163:35808).
Sep 12 22:55:31.593292 sshd[6053]: Accepted publickey for core from 147.75.109.163 port 35808 ssh2: RSA SHA256:tW2vbxPK/u/o8XyjCJxMeIgPoVf9em1FudQ+OeUFsoY
Sep 12 22:55:31.595313 sshd-session[6053]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:55:31.598996 systemd-logind[1620]: New session 22 of user core.
Sep 12 22:55:31.600236 systemd[1]: Started session-22.scope - Session 22 of User core.
Sep 12 22:55:31.843509 sshd[6056]: Connection closed by 147.75.109.163 port 35808
Sep 12 22:55:31.844922 sshd-session[6053]: pam_unix(sshd:session): session closed for user core
Sep 12 22:55:31.847190 systemd[1]: sshd@19-139.178.70.110:22-147.75.109.163:35808.service: Deactivated successfully.
Sep 12 22:55:31.848706 systemd[1]: session-22.scope: Deactivated successfully.
Sep 12 22:55:31.850432 systemd-logind[1620]: Session 22 logged out. Waiting for processes to exit.
Sep 12 22:55:31.851902 systemd-logind[1620]: Removed session 22.
Sep 12 22:55:33.137643 containerd[1640]: time="2025-09-12T22:55:33.137598219Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a8aded671c30bc0ad86d95bb69384bf6c10a6bda93ec2cfa309c0039cc8f10c3\" id:\"9068e90f2c45f89b730f69fa6a98c5ee51cd6b93f56bc9170259ecbbedb0cd2a\" pid:6079 exited_at:{seconds:1757717733 nanos:137148816}"