Sep 9 00:11:44.712049 kernel: Linux version 6.12.45-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Mon Sep 8 22:16:40 -00 2025
Sep 9 00:11:44.712066 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=c495f73c03808403ea4f55eb54c843aae6678d256d64068b1371f8afce28979a
Sep 9 00:11:44.712072 kernel: Disabled fast string operations
Sep 9 00:11:44.712076 kernel: BIOS-provided physical RAM map:
Sep 9 00:11:44.712080 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable
Sep 9 00:11:44.712085 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved
Sep 9 00:11:44.712090 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved
Sep 9 00:11:44.712095 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable
Sep 9 00:11:44.712099 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data
Sep 9 00:11:44.712103 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS
Sep 9 00:11:44.712107 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable
Sep 9 00:11:44.712112 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved
Sep 9 00:11:44.712116 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved
Sep 9 00:11:44.712120 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Sep 9 00:11:44.712126 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved
Sep 9 00:11:44.712131 kernel: NX (Execute Disable) protection: active
Sep 9 00:11:44.712136 kernel: APIC: Static calls initialized
Sep 9 00:11:44.712141 kernel: SMBIOS 2.7 present.
Sep 9 00:11:44.712146 kernel: DMI: VMware, Inc. VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020
Sep 9 00:11:44.712151 kernel: DMI: Memory slots populated: 1/128
Sep 9 00:11:44.712156 kernel: vmware: hypercall mode: 0x00
Sep 9 00:11:44.712161 kernel: Hypervisor detected: VMware
Sep 9 00:11:44.712166 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz
Sep 9 00:11:44.712171 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz
Sep 9 00:11:44.712175 kernel: vmware: using clock offset of 3729756321 ns
Sep 9 00:11:44.712180 kernel: tsc: Detected 3408.000 MHz processor
Sep 9 00:11:44.712185 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Sep 9 00:11:44.712191 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Sep 9 00:11:44.712196 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000
Sep 9 00:11:44.712201 kernel: total RAM covered: 3072M
Sep 9 00:11:44.712207 kernel: Found optimal setting for mtrr clean up
Sep 9 00:11:44.712212 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G
Sep 9 00:11:44.712217 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs
Sep 9 00:11:44.712222 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Sep 9 00:11:44.712227 kernel: Using GB pages for direct mapping
Sep 9 00:11:44.712232 kernel: ACPI: Early table checksum verification disabled
Sep 9 00:11:44.712237 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD )
Sep 9 00:11:44.712242 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272)
Sep 9 00:11:44.712247 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240)
Sep 9 00:11:44.712253 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001)
Sep 9 00:11:44.712260 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Sep 9 00:11:44.712265 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Sep 9 00:11:44.712270 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001)
Sep 9 00:11:44.712276 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? APIC 06040000 LTP 00000000)
Sep 9 00:11:44.712282 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001)
Sep 9 00:11:44.712287 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001)
Sep 9 00:11:44.712292 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001)
Sep 9 00:11:44.712297 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001)
Sep 9 00:11:44.712303 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66]
Sep 9 00:11:44.712308 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72]
Sep 9 00:11:44.712313 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Sep 9 00:11:44.712318 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Sep 9 00:11:44.712323 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54]
Sep 9 00:11:44.712329 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c]
Sep 9 00:11:44.712334 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea]
Sep 9 00:11:44.712339 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe]
Sep 9 00:11:44.712344 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756]
Sep 9 00:11:44.712350 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e]
Sep 9 00:11:44.712355 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Sep 9 00:11:44.712360 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Sep 9 00:11:44.712365 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug
Sep 9 00:11:44.712370 kernel: NUMA: Node 0 [mem 0x00001000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00001000-0x7fffffff]
Sep 9 00:11:44.712375 kernel: NODE_DATA(0) allocated [mem 0x7fff8dc0-0x7fffffff]
Sep 9 00:11:44.712381 kernel: Zone ranges:
Sep 9 00:11:44.712387 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Sep 9 00:11:44.712392 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff]
Sep 9 00:11:44.712397 kernel: Normal empty
Sep 9 00:11:44.712402 kernel: Device empty
Sep 9 00:11:44.712407 kernel: Movable zone start for each node
Sep 9 00:11:44.712412 kernel: Early memory node ranges
Sep 9 00:11:44.712417 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff]
Sep 9 00:11:44.712435 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff]
Sep 9 00:11:44.712448 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff]
Sep 9 00:11:44.712455 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff]
Sep 9 00:11:44.712460 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 9 00:11:44.712466 kernel: On node 0, zone DMA: 98 pages in unavailable ranges
Sep 9 00:11:44.712471 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges
Sep 9 00:11:44.712476 kernel: ACPI: PM-Timer IO Port: 0x1008
Sep 9 00:11:44.712481 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1])
Sep 9 00:11:44.712487 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1])
Sep 9 00:11:44.712492 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1])
Sep 9 00:11:44.712497 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1])
Sep 9 00:11:44.712503 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1])
Sep 9 00:11:44.712508 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1])
Sep 9 00:11:44.712513 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1])
Sep 9 00:11:44.712518 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1])
Sep 9 00:11:44.712523 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1])
Sep 9 00:11:44.712528 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1])
Sep 9 00:11:44.712533 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1])
Sep 9 00:11:44.712538 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1])
Sep 9 00:11:44.712543 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1])
Sep 9 00:11:44.712548 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1])
Sep 9 00:11:44.712554 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1])
Sep 9 00:11:44.712559 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1])
Sep 9 00:11:44.712564 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1])
Sep 9 00:11:44.712573 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1])
Sep 9 00:11:44.712578 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1])
Sep 9 00:11:44.712583 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1])
Sep 9 00:11:44.712588 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1])
Sep 9 00:11:44.712593 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1])
Sep 9 00:11:44.712598 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1])
Sep 9 00:11:44.712607 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1])
Sep 9 00:11:44.712613 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1])
Sep 9 00:11:44.712618 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1])
Sep 9 00:11:44.712623 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1])
Sep 9 00:11:44.712629 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1])
Sep 9 00:11:44.712634 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1])
Sep 9 00:11:44.712639 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1])
Sep 9 00:11:44.712644 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1])
Sep 9 00:11:44.712649 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1])
Sep 9 00:11:44.712654 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1])
Sep 9 00:11:44.712660 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1])
Sep 9 00:11:44.712666 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1])
Sep 9 00:11:44.712671 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1])
Sep 9 00:11:44.712676 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1])
Sep 9 00:11:44.712681 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1])
Sep 9 00:11:44.712686 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1])
Sep 9 00:11:44.712691 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1])
Sep 9 00:11:44.712700 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1])
Sep 9 00:11:44.712706 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1])
Sep 9 00:11:44.712711 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1])
Sep 9 00:11:44.712717 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1])
Sep 9 00:11:44.712723 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1])
Sep 9 00:11:44.712728 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1])
Sep 9 00:11:44.712734 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1])
Sep 9 00:11:44.712739 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1])
Sep 9 00:11:44.712745 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1])
Sep 9 00:11:44.712750 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1])
Sep 9 00:11:44.712755 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1])
Sep 9 00:11:44.712762 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1])
Sep 9 00:11:44.712767 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1])
Sep 9 00:11:44.712772 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1])
Sep 9 00:11:44.712778 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1])
Sep 9 00:11:44.712783 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1])
Sep 9 00:11:44.712789 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1])
Sep 9 00:11:44.712794 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1])
Sep 9 00:11:44.712800 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1])
Sep 9 00:11:44.712805 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1])
Sep 9 00:11:44.712811 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1])
Sep 9 00:11:44.712817 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1])
Sep 9 00:11:44.712823 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1])
Sep 9 00:11:44.712828 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1])
Sep 9 00:11:44.712834 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1])
Sep 9 00:11:44.712839 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1])
Sep 9 00:11:44.712844 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1])
Sep 9 00:11:44.712850 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1])
Sep 9 00:11:44.713394 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1])
Sep 9 00:11:44.713403 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1])
Sep 9 00:11:44.713409 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1])
Sep 9 00:11:44.713417 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1])
Sep 9 00:11:44.713422 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1])
Sep 9 00:11:44.713428 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1])
Sep 9 00:11:44.713433 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1])
Sep 9 00:11:44.713439 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1])
Sep 9 00:11:44.713444 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1])
Sep 9 00:11:44.713450 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1])
Sep 9 00:11:44.713455 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1])
Sep 9 00:11:44.713461 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1])
Sep 9 00:11:44.713466 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1])
Sep 9 00:11:44.713473 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1])
Sep 9 00:11:44.713478 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1])
Sep 9 00:11:44.713484 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1])
Sep 9 00:11:44.713489 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1])
Sep 9 00:11:44.713495 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1])
Sep 9 00:11:44.713500 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1])
Sep 9 00:11:44.713505 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1])
Sep 9 00:11:44.713511 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1])
Sep 9 00:11:44.713516 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1])
Sep 9 00:11:44.713522 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1])
Sep 9 00:11:44.713528 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1])
Sep 9 00:11:44.713534 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1])
Sep 9 00:11:44.713539 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1])
Sep 9 00:11:44.713545 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1])
Sep 9 00:11:44.713550 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1])
Sep 9 00:11:44.713556 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1])
Sep 9 00:11:44.713561 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1])
Sep 9 00:11:44.713567 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1])
Sep 9 00:11:44.713572 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1])
Sep 9 00:11:44.713578 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1])
Sep 9 00:11:44.713584 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1])
Sep 9 00:11:44.713589 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1])
Sep 9 00:11:44.713595 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1])
Sep 9 00:11:44.713600 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1])
Sep 9 00:11:44.713605 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1])
Sep 9 00:11:44.713611 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1])
Sep 9 00:11:44.713616 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1])
Sep 9 00:11:44.713622 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1])
Sep 9 00:11:44.713628 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1])
Sep 9 00:11:44.713639 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1])
Sep 9 00:11:44.713648 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1])
Sep 9 00:11:44.713653 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1])
Sep 9 00:11:44.713659 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1])
Sep 9 00:11:44.713664 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1])
Sep 9 00:11:44.713670 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1])
Sep 9 00:11:44.713675 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1])
Sep 9 00:11:44.713680 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1])
Sep 9 00:11:44.713686 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1])
Sep 9 00:11:44.713691 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1])
Sep 9 00:11:44.713698 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1])
Sep 9 00:11:44.713703 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1])
Sep 9 00:11:44.713709 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1])
Sep 9 00:11:44.713714 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1])
Sep 9 00:11:44.713720 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1])
Sep 9 00:11:44.713725 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1])
Sep 9 00:11:44.713731 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1])
Sep 9 00:11:44.713736 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1])
Sep 9 00:11:44.713741 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23
Sep 9 00:11:44.713747 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge)
Sep 9 00:11:44.713754 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Sep 9 00:11:44.713759 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000
Sep 9 00:11:44.713765 kernel: TSC deadline timer available
Sep 9 00:11:44.713770 kernel: CPU topo: Max. logical packages: 128
Sep 9 00:11:44.713776 kernel: CPU topo: Max. logical dies: 128
Sep 9 00:11:44.713781 kernel: CPU topo: Max. dies per package: 1
Sep 9 00:11:44.713787 kernel: CPU topo: Max. threads per core: 1
Sep 9 00:11:44.713792 kernel: CPU topo: Num. cores per package: 1
Sep 9 00:11:44.713798 kernel: CPU topo: Num. threads per package: 1
Sep 9 00:11:44.713803 kernel: CPU topo: Allowing 2 present CPUs plus 126 hotplug CPUs
Sep 9 00:11:44.713809 kernel: [mem 0x80000000-0xefffffff] available for PCI devices
Sep 9 00:11:44.713815 kernel: Booting paravirtualized kernel on VMware hypervisor
Sep 9 00:11:44.713821 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Sep 9 00:11:44.713826 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1
Sep 9 00:11:44.713832 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144
Sep 9 00:11:44.713838 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152
Sep 9 00:11:44.713843 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007
Sep 9 00:11:44.713849 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015
Sep 9 00:11:44.713870 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023
Sep 9 00:11:44.713877 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031
Sep 9 00:11:44.713883 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039
Sep 9 00:11:44.713888 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047
Sep 9 00:11:44.713893 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055
Sep 9 00:11:44.713899 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063
Sep 9 00:11:44.713905 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071
Sep 9 00:11:44.713910 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079
Sep 9 00:11:44.713916 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087
Sep 9 00:11:44.713923 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095
Sep 9 00:11:44.713929 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103
Sep 9 00:11:44.713934 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111
Sep 9 00:11:44.713939 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119
Sep 9 00:11:44.713945 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127
Sep 9 00:11:44.713951 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=c495f73c03808403ea4f55eb54c843aae6678d256d64068b1371f8afce28979a
Sep 9 00:11:44.713957 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 9 00:11:44.713963 kernel: random: crng init done
Sep 9 00:11:44.713969 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes
Sep 9 00:11:44.713975 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes
Sep 9 00:11:44.713980 kernel: printk: log_buf_len min size: 262144 bytes
Sep 9 00:11:44.713986 kernel: printk: log_buf_len: 1048576 bytes
Sep 9 00:11:44.713991 kernel: printk: early log buf free: 245576(93%)
Sep 9 00:11:44.713997 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 9 00:11:44.714003 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Sep 9 00:11:44.714008 kernel: Fallback order for Node 0: 0
Sep 9 00:11:44.714014 kernel: Built 1 zonelists, mobility grouping on. Total pages: 524157
Sep 9 00:11:44.714021 kernel: Policy zone: DMA32
Sep 9 00:11:44.714026 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 9 00:11:44.714032 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1
Sep 9 00:11:44.714037 kernel: ftrace: allocating 40099 entries in 157 pages
Sep 9 00:11:44.714043 kernel: ftrace: allocated 157 pages with 5 groups
Sep 9 00:11:44.714048 kernel: Dynamic Preempt: voluntary
Sep 9 00:11:44.714054 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 9 00:11:44.714060 kernel: rcu: RCU event tracing is enabled.
Sep 9 00:11:44.714066 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128.
Sep 9 00:11:44.714071 kernel: Trampoline variant of Tasks RCU enabled.
Sep 9 00:11:44.714078 kernel: Rude variant of Tasks RCU enabled.
Sep 9 00:11:44.714084 kernel: Tracing variant of Tasks RCU enabled.
Sep 9 00:11:44.714089 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 9 00:11:44.714095 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128
Sep 9 00:11:44.714100 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Sep 9 00:11:44.714106 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Sep 9 00:11:44.714111 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Sep 9 00:11:44.714117 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16
Sep 9 00:11:44.714123 kernel: rcu: srcu_init: Setting srcu_struct sizes to big.
Sep 9 00:11:44.714129 kernel: Console: colour VGA+ 80x25
Sep 9 00:11:44.714135 kernel: printk: legacy console [tty0] enabled
Sep 9 00:11:44.714140 kernel: printk: legacy console [ttyS0] enabled
Sep 9 00:11:44.714146 kernel: ACPI: Core revision 20240827
Sep 9 00:11:44.714152 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns
Sep 9 00:11:44.714157 kernel: APIC: Switch to symmetric I/O mode setup
Sep 9 00:11:44.714163 kernel: x2apic enabled
Sep 9 00:11:44.714168 kernel: APIC: Switched APIC routing to: physical x2apic
Sep 9 00:11:44.714174 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Sep 9 00:11:44.714181 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns
Sep 9 00:11:44.714187 kernel: Calibrating delay loop (skipped) preset value.. 6816.00 BogoMIPS (lpj=3408000)
Sep 9 00:11:44.714193 kernel: Disabled fast string operations
Sep 9 00:11:44.714198 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Sep 9 00:11:44.714204 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4
Sep 9 00:11:44.714209 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Sep 9 00:11:44.714215 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit
Sep 9 00:11:44.714221 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Sep 9 00:11:44.714227 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Sep 9 00:11:44.714234 kernel: RETBleed: Mitigation: Enhanced IBRS
Sep 9 00:11:44.714239 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Sep 9 00:11:44.714245 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Sep 9 00:11:44.714250 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Sep 9 00:11:44.714256 kernel: SRBDS: Unknown: Dependent on hypervisor status
Sep 9 00:11:44.714262 kernel: GDS: Unknown: Dependent on hypervisor status
Sep 9 00:11:44.714267 kernel: active return thunk: its_return_thunk
Sep 9 00:11:44.714273 kernel: ITS: Mitigation: Aligned branch/return thunks
Sep 9 00:11:44.714279 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Sep 9 00:11:44.714286 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Sep 9 00:11:44.714291 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Sep 9 00:11:44.714297 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Sep 9 00:11:44.714303 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Sep 9 00:11:44.714308 kernel: Freeing SMP alternatives memory: 32K
Sep 9 00:11:44.714314 kernel: pid_max: default: 131072 minimum: 1024
Sep 9 00:11:44.714320 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 9 00:11:44.714325 kernel: landlock: Up and running.
Sep 9 00:11:44.714331 kernel: SELinux: Initializing.
Sep 9 00:11:44.714338 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Sep 9 00:11:44.714343 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Sep 9 00:11:44.714349 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd)
Sep 9 00:11:44.714355 kernel: Performance Events: Skylake events, core PMU driver.
Sep 9 00:11:44.714361 kernel: core: CPUID marked event: 'cpu cycles' unavailable
Sep 9 00:11:44.714366 kernel: core: CPUID marked event: 'instructions' unavailable
Sep 9 00:11:44.714372 kernel: core: CPUID marked event: 'bus cycles' unavailable
Sep 9 00:11:44.714378 kernel: core: CPUID marked event: 'cache references' unavailable
Sep 9 00:11:44.714384 kernel: core: CPUID marked event: 'cache misses' unavailable
Sep 9 00:11:44.714390 kernel: core: CPUID marked event: 'branch instructions' unavailable
Sep 9 00:11:44.714395 kernel: core: CPUID marked event: 'branch misses' unavailable
Sep 9 00:11:44.714401 kernel: ... version: 1
Sep 9 00:11:44.714407 kernel: ... bit width: 48
Sep 9 00:11:44.714412 kernel: ... generic registers: 4
Sep 9 00:11:44.714418 kernel: ... value mask: 0000ffffffffffff
Sep 9 00:11:44.714423 kernel: ... max period: 000000007fffffff
Sep 9 00:11:44.714429 kernel: ... fixed-purpose events: 0
Sep 9 00:11:44.714436 kernel: ... event mask: 000000000000000f
Sep 9 00:11:44.714442 kernel: signal: max sigframe size: 1776
Sep 9 00:11:44.714448 kernel: rcu: Hierarchical SRCU implementation.
Sep 9 00:11:44.714453 kernel: rcu: Max phase no-delay instances is 400.
Sep 9 00:11:44.714459 kernel: Timer migration: 3 hierarchy levels; 8 children per group; 3 crossnode level
Sep 9 00:11:44.714465 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Sep 9 00:11:44.714470 kernel: smp: Bringing up secondary CPUs ...
Sep 9 00:11:44.714476 kernel: smpboot: x86: Booting SMP configuration:
Sep 9 00:11:44.714482 kernel: .... node #0, CPUs: #1
Sep 9 00:11:44.714487 kernel: Disabled fast string operations
Sep 9 00:11:44.714494 kernel: smp: Brought up 1 node, 2 CPUs
Sep 9 00:11:44.714500 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS)
Sep 9 00:11:44.714505 kernel: Memory: 1926288K/2096628K available (14336K kernel code, 2428K rwdata, 9956K rodata, 53832K init, 1088K bss, 158968K reserved, 0K cma-reserved)
Sep 9 00:11:44.714511 kernel: devtmpfs: initialized
Sep 9 00:11:44.714517 kernel: x86/mm: Memory block size: 128MB
Sep 9 00:11:44.714522 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes)
Sep 9 00:11:44.714528 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 9 00:11:44.714534 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear)
Sep 9 00:11:44.714539 kernel: pinctrl core: initialized pinctrl subsystem
Sep 9 00:11:44.714546 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 9 00:11:44.714552 kernel: audit: initializing netlink subsys (disabled)
Sep 9 00:11:44.716663 kernel: audit: type=2000 audit(1757376701.280:1): state=initialized audit_enabled=0 res=1
Sep 9 00:11:44.716673 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 9 00:11:44.716679 kernel: thermal_sys: Registered thermal governor 'user_space'
Sep 9 00:11:44.716684 kernel: cpuidle: using governor menu
Sep 9 00:11:44.716690 kernel: Simple Boot Flag at 0x36 set to 0x80
Sep 9 00:11:44.716696 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 9 00:11:44.716702 kernel: dca service started, version 1.12.1
Sep 9 00:11:44.716716 kernel: PCI: ECAM [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) for domain 0000 [bus 00-7f]
Sep 9 00:11:44.716723 kernel: PCI: Using configuration type 1 for base access
Sep 9 00:11:44.716730 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Sep 9 00:11:44.716736 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 9 00:11:44.716742 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Sep 9 00:11:44.716748 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 9 00:11:44.716754 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Sep 9 00:11:44.716760 kernel: ACPI: Added _OSI(Module Device)
Sep 9 00:11:44.716768 kernel: ACPI: Added _OSI(Processor Device)
Sep 9 00:11:44.716774 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 9 00:11:44.716780 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 9 00:11:44.716786 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored
Sep 9 00:11:44.716792 kernel: ACPI: Interpreter enabled
Sep 9 00:11:44.716798 kernel: ACPI: PM: (supports S0 S1 S5)
Sep 9 00:11:44.716804 kernel: ACPI: Using IOAPIC for interrupt routing
Sep 9 00:11:44.716810 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Sep 9 00:11:44.716816 kernel: PCI: Using E820 reservations for host bridge windows
Sep 9 00:11:44.716823 kernel: ACPI: Enabled 4 GPEs in block 00 to 0F
Sep 9 00:11:44.716829 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f])
Sep 9 00:11:44.716927 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 9 00:11:44.716982 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR]
Sep 9 00:11:44.717032 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability]
Sep 9 00:11:44.717041 kernel: PCI host bridge to bus 0000:00
Sep 9 00:11:44.717094 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Sep 9 00:11:44.717143 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window]
Sep 9 00:11:44.717187 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Sep 9 00:11:44.717231 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Sep 9 00:11:44.717274 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window]
Sep 9 00:11:44.717317 kernel: pci_bus 0000:00: root bus resource [bus 00-7f]
Sep 9 00:11:44.717375 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 conventional PCI endpoint
Sep 9 00:11:44.717437 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 conventional PCI bridge
Sep 9 00:11:44.717489 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Sep 9 00:11:44.717546 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 conventional PCI endpoint
Sep 9 00:11:44.717602 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a conventional PCI endpoint
Sep 9 00:11:44.717667 kernel: pci 0000:00:07.1: BAR 4 [io 0x1060-0x106f]
Sep 9 00:11:44.717720 kernel: pci 0000:00:07.1: BAR 0 [io 0x01f0-0x01f7]: legacy IDE quirk
Sep 9 00:11:44.717770 kernel: pci 0000:00:07.1: BAR 1 [io 0x03f6]: legacy IDE quirk
Sep 9 00:11:44.717819 kernel: pci 0000:00:07.1: BAR 2 [io 0x0170-0x0177]: legacy IDE quirk
Sep 9 00:11:44.717886 kernel: pci 0000:00:07.1: BAR 3 [io 0x0376]: legacy IDE quirk
Sep 9 00:11:44.717943 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Sep 9 00:11:44.717994 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed by PIIX4 ACPI
Sep 9 00:11:44.718047 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB
Sep 9 00:11:44.718102 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000 conventional PCI endpoint
Sep 9 00:11:44.718159 kernel: pci 0000:00:07.7: BAR 0 [io 0x1080-0x10bf]
Sep 9 00:11:44.718220 kernel: pci 0000:00:07.7: BAR 1 [mem 0xfebfe000-0xfebfffff 64bit]
Sep 9 00:11:44.718282 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 conventional PCI endpoint
Sep 9 00:11:44.718333 kernel: pci 0000:00:0f.0: BAR 0 [io 0x1070-0x107f]
Sep 9 00:11:44.718386 kernel: pci 0000:00:0f.0: BAR 1 [mem 0xe8000000-0xefffffff pref]
Sep 9 00:11:44.718435 kernel: pci 0000:00:0f.0: BAR 2 [mem 0xfe000000-0xfe7fffff]
Sep 9 00:11:44.718485 kernel: pci 0000:00:0f.0: ROM [mem 0x00000000-0x00007fff pref]
Sep 9 00:11:44.718533 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Sep 9 00:11:44.718589 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 conventional PCI bridge
Sep 9 00:11:44.718648 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode)
Sep 9 00:11:44.718707 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff]
Sep 9 00:11:44.718760 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff]
Sep 9 00:11:44.718810 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref]
Sep 9 00:11:44.719641 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Sep 9 00:11:44.719737 kernel: pci 0000:00:15.0: PCI bridge to [bus 03]
Sep 9 00:11:44.719826 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff]
Sep 9 00:11:44.719920 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff]
Sep 9 00:11:44.720006 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold
Sep 9 00:11:44.720095 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Sep 9 00:11:44.720182 kernel: pci 0000:00:15.1: PCI bridge to [bus 04]
Sep 9 00:11:44.720263 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff]
Sep 9 00:11:44.720344 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff]
Sep 9 00:11:44.720428 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref]
Sep 9 00:11:44.720509 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold
Sep 9 00:11:44.720603 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Sep 9 00:11:44.720691 kernel: pci 0000:00:15.2: PCI bridge to [bus 05]
Sep 9 00:11:44.720772 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff]
Sep 9 00:11:44.722142 kernel: pci 0000:00:15.2: bridge window [mem
0xfcd00000-0xfcdfffff] Sep 9 00:11:44.722248 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Sep 9 00:11:44.722339 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold Sep 9 00:11:44.722433 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 9 00:11:44.722525 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Sep 9 00:11:44.722611 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Sep 9 00:11:44.722694 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Sep 9 00:11:44.722776 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold Sep 9 00:11:44.722880 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 9 00:11:44.722973 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Sep 9 00:11:44.723072 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Sep 9 00:11:44.723159 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Sep 9 00:11:44.723240 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold Sep 9 00:11:44.723328 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 9 00:11:44.723412 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Sep 9 00:11:44.723492 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Sep 9 00:11:44.723571 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Sep 9 00:11:44.723656 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot D3cold Sep 9 00:11:44.723742 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 9 00:11:44.723826 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Sep 9 00:11:44.725919 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Sep 9 00:11:44.725978 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Sep 9 00:11:44.726032 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold Sep 9 
00:11:44.726099 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 9 00:11:44.726153 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Sep 9 00:11:44.726204 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Sep 9 00:11:44.726258 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Sep 9 00:11:44.726308 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold Sep 9 00:11:44.726367 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 9 00:11:44.726419 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Sep 9 00:11:44.726469 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Sep 9 00:11:44.726519 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Sep 9 00:11:44.726570 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold Sep 9 00:11:44.726627 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 9 00:11:44.726679 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Sep 9 00:11:44.726729 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Sep 9 00:11:44.726779 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Sep 9 00:11:44.726829 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Sep 9 00:11:44.728089 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold Sep 9 00:11:44.728155 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 9 00:11:44.728214 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Sep 9 00:11:44.728271 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Sep 9 00:11:44.728323 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Sep 9 00:11:44.728373 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Sep 9 00:11:44.728423 kernel: pci 0000:00:16.2: PME# supported from D0 D3hot D3cold Sep 9 00:11:44.728478 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 PCIe 
Root Port Sep 9 00:11:44.728530 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Sep 9 00:11:44.728582 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Sep 9 00:11:44.728632 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Sep 9 00:11:44.728682 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold Sep 9 00:11:44.728739 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 9 00:11:44.728790 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Sep 9 00:11:44.728840 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Sep 9 00:11:44.729376 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Sep 9 00:11:44.729434 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold Sep 9 00:11:44.729491 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 9 00:11:44.729544 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Sep 9 00:11:44.729595 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Sep 9 00:11:44.729645 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Sep 9 00:11:44.729695 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold Sep 9 00:11:44.729749 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 9 00:11:44.729802 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Sep 9 00:11:44.729852 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Sep 9 00:11:44.729920 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Sep 9 00:11:44.729973 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold Sep 9 00:11:44.730029 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 9 00:11:44.730080 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Sep 9 00:11:44.730130 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Sep 9 00:11:44.730179 kernel: pci 0000:00:16.7: 
bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Sep 9 00:11:44.730231 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold Sep 9 00:11:44.730285 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 9 00:11:44.730336 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Sep 9 00:11:44.730386 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Sep 9 00:11:44.730437 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Sep 9 00:11:44.732932 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Sep 9 00:11:44.733003 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold Sep 9 00:11:44.733072 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 9 00:11:44.733125 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Sep 9 00:11:44.733176 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Sep 9 00:11:44.733229 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Sep 9 00:11:44.733279 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Sep 9 00:11:44.733329 kernel: pci 0000:00:17.1: PME# supported from D0 D3hot D3cold Sep 9 00:11:44.733384 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 9 00:11:44.733436 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Sep 9 00:11:44.733486 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Sep 9 00:11:44.733535 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Sep 9 00:11:44.733588 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Sep 9 00:11:44.733638 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold Sep 9 00:11:44.733692 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 9 00:11:44.733743 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Sep 9 00:11:44.733793 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Sep 9 00:11:44.733844 
kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Sep 9 00:11:44.733917 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold Sep 9 00:11:44.733973 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 9 00:11:44.734036 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Sep 9 00:11:44.734099 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Sep 9 00:11:44.734152 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Sep 9 00:11:44.734202 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold Sep 9 00:11:44.734263 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 9 00:11:44.734317 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Sep 9 00:11:44.734367 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Sep 9 00:11:44.734420 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Sep 9 00:11:44.734470 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold Sep 9 00:11:44.734526 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 9 00:11:44.734577 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Sep 9 00:11:44.734627 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Sep 9 00:11:44.734678 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Sep 9 00:11:44.734728 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold Sep 9 00:11:44.734786 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 9 00:11:44.734837 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Sep 9 00:11:44.734899 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Sep 9 00:11:44.734950 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Sep 9 00:11:44.735000 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold Sep 9 00:11:44.735056 kernel: pci 0000:00:18.0: [15ad:07a0] 
type 01 class 0x060400 PCIe Root Port Sep 9 00:11:44.735107 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Sep 9 00:11:44.735161 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Sep 9 00:11:44.735211 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Sep 9 00:11:44.735265 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Sep 9 00:11:44.735316 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold Sep 9 00:11:44.735372 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 9 00:11:44.735422 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Sep 9 00:11:44.735472 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Sep 9 00:11:44.735524 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Sep 9 00:11:44.735574 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Sep 9 00:11:44.735623 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold Sep 9 00:11:44.735680 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 9 00:11:44.735732 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Sep 9 00:11:44.735782 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Sep 9 00:11:44.735832 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Sep 9 00:11:44.737907 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold Sep 9 00:11:44.737965 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 9 00:11:44.738030 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Sep 9 00:11:44.738088 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Sep 9 00:11:44.738139 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Sep 9 00:11:44.738200 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold Sep 9 00:11:44.738261 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 9 00:11:44.738321 
kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Sep 9 00:11:44.738906 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Sep 9 00:11:44.738964 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Sep 9 00:11:44.739020 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold Sep 9 00:11:44.739076 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 9 00:11:44.739128 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Sep 9 00:11:44.739178 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Sep 9 00:11:44.739231 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Sep 9 00:11:44.739280 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold Sep 9 00:11:44.739338 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 9 00:11:44.739388 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Sep 9 00:11:44.739439 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Sep 9 00:11:44.739488 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Sep 9 00:11:44.739538 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold Sep 9 00:11:44.739593 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 9 00:11:44.739647 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Sep 9 00:11:44.739698 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Sep 9 00:11:44.739748 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Sep 9 00:11:44.739799 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold Sep 9 00:11:44.739852 kernel: pci_bus 0000:01: extended config space not accessible Sep 9 00:11:44.742198 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Sep 9 00:11:44.742252 kernel: pci_bus 0000:02: extended config space not accessible Sep 9 00:11:44.742265 kernel: acpiphp: Slot [32] registered Sep 9 00:11:44.742271 kernel: acpiphp: Slot 
[33] registered Sep 9 00:11:44.742277 kernel: acpiphp: Slot [34] registered Sep 9 00:11:44.742283 kernel: acpiphp: Slot [35] registered Sep 9 00:11:44.742289 kernel: acpiphp: Slot [36] registered Sep 9 00:11:44.742295 kernel: acpiphp: Slot [37] registered Sep 9 00:11:44.742301 kernel: acpiphp: Slot [38] registered Sep 9 00:11:44.742306 kernel: acpiphp: Slot [39] registered Sep 9 00:11:44.742312 kernel: acpiphp: Slot [40] registered Sep 9 00:11:44.742319 kernel: acpiphp: Slot [41] registered Sep 9 00:11:44.742325 kernel: acpiphp: Slot [42] registered Sep 9 00:11:44.742331 kernel: acpiphp: Slot [43] registered Sep 9 00:11:44.742337 kernel: acpiphp: Slot [44] registered Sep 9 00:11:44.742343 kernel: acpiphp: Slot [45] registered Sep 9 00:11:44.742349 kernel: acpiphp: Slot [46] registered Sep 9 00:11:44.742354 kernel: acpiphp: Slot [47] registered Sep 9 00:11:44.742360 kernel: acpiphp: Slot [48] registered Sep 9 00:11:44.742366 kernel: acpiphp: Slot [49] registered Sep 9 00:11:44.742373 kernel: acpiphp: Slot [50] registered Sep 9 00:11:44.742379 kernel: acpiphp: Slot [51] registered Sep 9 00:11:44.742385 kernel: acpiphp: Slot [52] registered Sep 9 00:11:44.742391 kernel: acpiphp: Slot [53] registered Sep 9 00:11:44.742397 kernel: acpiphp: Slot [54] registered Sep 9 00:11:44.742402 kernel: acpiphp: Slot [55] registered Sep 9 00:11:44.742408 kernel: acpiphp: Slot [56] registered Sep 9 00:11:44.742414 kernel: acpiphp: Slot [57] registered Sep 9 00:11:44.742420 kernel: acpiphp: Slot [58] registered Sep 9 00:11:44.742426 kernel: acpiphp: Slot [59] registered Sep 9 00:11:44.742433 kernel: acpiphp: Slot [60] registered Sep 9 00:11:44.742439 kernel: acpiphp: Slot [61] registered Sep 9 00:11:44.742445 kernel: acpiphp: Slot [62] registered Sep 9 00:11:44.742451 kernel: acpiphp: Slot [63] registered Sep 9 00:11:44.742503 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Sep 9 00:11:44.742554 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff 
window] (subtractive decode) Sep 9 00:11:44.742604 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode) Sep 9 00:11:44.742654 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode) Sep 9 00:11:44.742706 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode) Sep 9 00:11:44.742755 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode) Sep 9 00:11:44.742816 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 PCIe Endpoint Sep 9 00:11:44.742883 kernel: pci 0000:03:00.0: BAR 0 [io 0x4000-0x4007] Sep 9 00:11:44.742938 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfd5f8000-0xfd5fffff 64bit] Sep 9 00:11:44.742989 kernel: pci 0000:03:00.0: ROM [mem 0x00000000-0x0000ffff pref] Sep 9 00:11:44.743040 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Sep 9 00:11:44.743092 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force' Sep 9 00:11:44.743147 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Sep 9 00:11:44.743200 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Sep 9 00:11:44.743252 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Sep 9 00:11:44.743305 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Sep 9 00:11:44.743358 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Sep 9 00:11:44.743412 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Sep 9 00:11:44.743464 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Sep 9 00:11:44.743521 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Sep 9 00:11:44.743578 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 PCIe Endpoint Sep 9 00:11:44.743631 kernel: pci 0000:0b:00.0: BAR 0 [mem 0xfd4fc000-0xfd4fcfff] Sep 9 00:11:44.743683 kernel: pci 0000:0b:00.0: BAR 1 [mem 0xfd4fd000-0xfd4fdfff] Sep 9 00:11:44.743734 kernel: pci 0000:0b:00.0: BAR 2 [mem 0xfd4fe000-0xfd4fffff] Sep 9 00:11:44.743785 kernel: pci 0000:0b:00.0: 
BAR 3 [io 0x5000-0x500f] Sep 9 00:11:44.743835 kernel: pci 0000:0b:00.0: ROM [mem 0x00000000-0x0000ffff pref] Sep 9 00:11:44.743898 kernel: pci 0000:0b:00.0: supports D1 D2 Sep 9 00:11:44.743949 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold Sep 9 00:11:44.744001 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force' Sep 9 00:11:44.744053 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Sep 9 00:11:44.744105 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Sep 9 00:11:44.744158 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Sep 9 00:11:44.744212 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Sep 9 00:11:44.744267 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Sep 9 00:11:44.744323 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Sep 9 00:11:44.744375 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Sep 9 00:11:44.744426 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Sep 9 00:11:44.744477 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Sep 9 00:11:44.744529 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Sep 9 00:11:44.744592 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Sep 9 00:11:44.744644 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Sep 9 00:11:44.744699 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Sep 9 00:11:44.744752 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Sep 9 00:11:44.744803 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Sep 9 00:11:44.744863 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Sep 9 00:11:44.744929 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Sep 9 00:11:44.744992 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Sep 9 00:11:44.745044 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Sep 9 00:11:44.745099 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Sep 9 00:11:44.745149 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Sep 9 00:11:44.745200 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Sep 9 00:11:44.745250 kernel: pci 
0000:00:18.6: PCI bridge to [bus 21] Sep 9 00:11:44.745307 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Sep 9 00:11:44.745316 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9 Sep 9 00:11:44.745322 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0 Sep 9 00:11:44.745329 kernel: ACPI: PCI: Interrupt link LNKB disabled Sep 9 00:11:44.745336 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Sep 9 00:11:44.745342 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10 Sep 9 00:11:44.745349 kernel: iommu: Default domain type: Translated Sep 9 00:11:44.745355 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Sep 9 00:11:44.745361 kernel: PCI: Using ACPI for IRQ routing Sep 9 00:11:44.745367 kernel: PCI: pci_cache_line_size set to 64 bytes Sep 9 00:11:44.745373 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff] Sep 9 00:11:44.745379 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff] Sep 9 00:11:44.745429 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device Sep 9 00:11:44.745481 kernel: pci 0000:00:0f.0: vgaarb: bridge control possible Sep 9 00:11:44.745531 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Sep 9 00:11:44.745540 kernel: vgaarb: loaded Sep 9 00:11:44.745546 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 Sep 9 00:11:44.745552 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter Sep 9 00:11:44.745558 kernel: clocksource: Switched to clocksource tsc-early Sep 9 00:11:44.745564 kernel: VFS: Disk quotas dquot_6.6.0 Sep 9 00:11:44.745570 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Sep 9 00:11:44.745578 kernel: pnp: PnP ACPI init Sep 9 00:11:44.745632 kernel: system 00:00: [io 0x1000-0x103f] has been reserved Sep 9 00:11:44.745680 kernel: system 00:00: [io 0x1040-0x104f] has been reserved Sep 9 00:11:44.745726 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved Sep 9 
00:11:44.745777 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved Sep 9 00:11:44.745891 kernel: pnp 00:06: [dma 2] Sep 9 00:11:44.745946 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved Sep 9 00:11:44.745995 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved Sep 9 00:11:44.746040 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved Sep 9 00:11:44.746049 kernel: pnp: PnP ACPI: found 8 devices Sep 9 00:11:44.746055 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Sep 9 00:11:44.746061 kernel: NET: Registered PF_INET protocol family Sep 9 00:11:44.746067 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Sep 9 00:11:44.746073 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Sep 9 00:11:44.746079 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Sep 9 00:11:44.746087 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Sep 9 00:11:44.746093 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Sep 9 00:11:44.746099 kernel: TCP: Hash tables configured (established 16384 bind 16384) Sep 9 00:11:44.746105 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Sep 9 00:11:44.746111 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Sep 9 00:11:44.746117 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Sep 9 00:11:44.746123 kernel: NET: Registered PF_XDP protocol family Sep 9 00:11:44.746174 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 Sep 9 00:11:44.746227 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Sep 9 00:11:44.746280 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Sep 9 00:11:44.746332 kernel: pci 0000:00:15.5: bridge 
window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Sep 9 00:11:44.746383 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Sep 9 00:11:44.746433 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Sep 9 00:11:44.746484 kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Sep 9 00:11:44.746534 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Sep 9 00:11:44.746585 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Sep 9 00:11:44.746639 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Sep 9 00:11:44.746690 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Sep 9 00:11:44.746741 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Sep 9 00:11:44.746791 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Sep 9 00:11:44.746841 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Sep 9 00:11:44.746909 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Sep 9 00:11:44.746960 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Sep 9 00:11:44.747012 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Sep 9 00:11:44.747064 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Sep 9 00:11:44.747115 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Sep 9 00:11:44.747166 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Sep 9 00:11:44.747216 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Sep 9 00:11:44.747265 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 
21] add_size 1000 Sep 9 00:11:44.747315 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000 Sep 9 00:11:44.747366 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref]: assigned Sep 9 00:11:44.747416 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref]: assigned Sep 9 00:11:44.747468 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space Sep 9 00:11:44.747518 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign Sep 9 00:11:44.747568 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space Sep 9 00:11:44.747618 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign Sep 9 00:11:44.747667 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space Sep 9 00:11:44.747717 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign Sep 9 00:11:44.747766 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: can't assign; no space Sep 9 00:11:44.747815 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign Sep 9 00:11:44.747875 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space Sep 9 00:11:44.747934 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign Sep 9 00:11:44.747986 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space Sep 9 00:11:44.748036 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign Sep 9 00:11:44.748087 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space Sep 9 00:11:44.748136 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign Sep 9 00:11:44.748187 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space Sep 9 00:11:44.748239 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign Sep 9 00:11:44.748293 kernel: pci 0000:00:16.6: 
bridge window [io size 0x1000]: can't assign; no space
Sep 9 00:11:44.748343 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign
Sep 9 00:11:44.748393 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space
Sep 9 00:11:44.748442 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign
Sep 9 00:11:44.748492 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space
Sep 9 00:11:44.748542 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign
Sep 9 00:11:44.748592 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: can't assign; no space
Sep 9 00:11:44.748644 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign
Sep 9 00:11:44.748694 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: can't assign; no space
Sep 9 00:11:44.748744 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: failed to assign
Sep 9 00:11:44.748794 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: can't assign; no space
Sep 9 00:11:44.748843 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign
Sep 9 00:11:44.748901 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space
Sep 9 00:11:44.748952 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign
Sep 9 00:11:44.749002 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space
Sep 9 00:11:44.749054 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign
Sep 9 00:11:44.749104 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space
Sep 9 00:11:44.749154 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign
Sep 9 00:11:44.749204 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space
Sep 9 00:11:44.749253 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign
Sep 9 00:11:44.749303 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: can't assign; no space
Sep 9 00:11:44.749352 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign
Sep 9 00:11:44.749403 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: can't assign; no space
Sep 9 00:11:44.749454 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign
Sep 9 00:11:44.749504 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space
Sep 9 00:11:44.749554 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign
Sep 9 00:11:44.749603 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space
Sep 9 00:11:44.749653 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign
Sep 9 00:11:44.749702 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: can't assign; no space
Sep 9 00:11:44.749752 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign
Sep 9 00:11:44.749801 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: can't assign; no space
Sep 9 00:11:44.749850 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign
Sep 9 00:11:44.750049 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space
Sep 9 00:11:44.750100 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign
Sep 9 00:11:44.750151 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space
Sep 9 00:11:44.750416 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign
Sep 9 00:11:44.750480 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space
Sep 9 00:11:44.750532 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign
Sep 9 00:11:44.752950 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space
Sep 9 00:11:44.753006 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign
Sep 9 00:11:44.753061 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: can't assign; no space
Sep 9 00:11:44.753117 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign
Sep 9 00:11:44.753170 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: can't assign; no space
Sep 9 00:11:44.753221 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: failed to assign
Sep 9 00:11:44.753273 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: can't assign; no space
Sep 9 00:11:44.753324 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign
Sep 9 00:11:44.753376 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space
Sep 9 00:11:44.753426 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign
Sep 9 00:11:44.753477 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space
Sep 9 00:11:44.753528 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign
Sep 9 00:11:44.753579 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: can't assign; no space
Sep 9 00:11:44.753633 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign
Sep 9 00:11:44.753685 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space
Sep 9 00:11:44.753736 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign
Sep 9 00:11:44.753787 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space
Sep 9 00:11:44.753838 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign
Sep 9 00:11:44.753902 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space
Sep 9 00:11:44.753952 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign
Sep 9 00:11:44.754007 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space
Sep 9 00:11:44.754058 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign
Sep 9 00:11:44.754470 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: can't assign; no space
Sep 9 00:11:44.754536 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign
Sep 9 00:11:44.754592 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space
Sep 9 00:11:44.754645 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign
Sep 9 00:11:44.754698 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space
Sep 9 00:11:44.754749 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign
Sep 9 00:11:44.754804 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space
Sep 9 00:11:44.754900 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign
Sep 9 00:11:44.754959 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Sep 9 00:11:44.755011 kernel: pci 0000:00:11.0: PCI bridge to [bus 02]
Sep 9 00:11:44.755061 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff]
Sep 9 00:11:44.755111 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff]
Sep 9 00:11:44.755161 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref]
Sep 9 00:11:44.755214 kernel: pci 0000:03:00.0: ROM [mem 0xfd500000-0xfd50ffff pref]: assigned
Sep 9 00:11:44.755277 kernel: pci 0000:00:15.0: PCI bridge to [bus 03]
Sep 9 00:11:44.755330 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff]
Sep 9 00:11:44.755380 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff]
Sep 9 00:11:44.755430 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref]
Sep 9 00:11:44.755483 kernel: pci 0000:00:15.1: PCI bridge to [bus 04]
Sep 9 00:11:44.755534 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff]
Sep 9 00:11:44.755584 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff]
Sep 9 00:11:44.755634 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref]
Sep 9 00:11:44.755687 kernel: pci 0000:00:15.2: PCI bridge to [bus 05]
Sep 9 00:11:44.755738 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff]
Sep 9 00:11:44.755789 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff]
Sep 9 00:11:44.755840 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref]
Sep 9 00:11:44.755939 kernel: pci 0000:00:15.3: PCI bridge to [bus 06]
Sep 9 00:11:44.755992 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff]
Sep 9 00:11:44.756043 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref]
Sep 9 00:11:44.756094 kernel: pci 0000:00:15.4: PCI bridge to [bus 07]
Sep 9 00:11:44.756144 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff]
Sep 9 00:11:44.756194 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref]
Sep 9 00:11:44.756248 kernel: pci 0000:00:15.5: PCI bridge to [bus 08]
Sep 9 00:11:44.756298 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff]
Sep 9 00:11:44.756348 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref]
Sep 9 00:11:44.756400 kernel: pci 0000:00:15.6: PCI bridge to [bus 09]
Sep 9 00:11:44.756450 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff]
Sep 9 00:11:44.756500 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref]
Sep 9 00:11:44.756551 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a]
Sep 9 00:11:44.756603 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff]
Sep 9 00:11:44.756653 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref]
Sep 9 00:11:44.756708 kernel: pci 0000:0b:00.0: ROM [mem 0xfd400000-0xfd40ffff pref]: assigned
Sep 9 00:11:44.756760 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b]
Sep 9 00:11:44.756811 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff]
Sep 9 00:11:44.756882 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff]
Sep 9 00:11:44.756935 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref]
Sep 9 00:11:44.756988 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c]
Sep 9 00:11:44.757041 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff]
Sep 9 00:11:44.757091 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff]
Sep 9 00:11:44.757141 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref]
Sep 9 00:11:44.757192 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d]
Sep 9 00:11:44.757242 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff]
Sep 9 00:11:44.757292 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff]
Sep 9 00:11:44.757341 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref]
Sep 9 00:11:44.757393 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e]
Sep 9 00:11:44.757443 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff]
Sep 9 00:11:44.757493 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref]
Sep 9 00:11:44.757547 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f]
Sep 9 00:11:44.757596 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff]
Sep 9 00:11:44.757646 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref]
Sep 9 00:11:44.757697 kernel: pci 0000:00:16.5: PCI bridge to [bus 10]
Sep 9 00:11:44.757748 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff]
Sep 9 00:11:44.757799 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref]
Sep 9 00:11:44.757850 kernel: pci 0000:00:16.6: PCI bridge to [bus 11]
Sep 9 00:11:44.757923 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff]
Sep 9 00:11:44.757974 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref]
Sep 9 00:11:44.758025 kernel: pci 0000:00:16.7: PCI bridge to [bus 12]
Sep 9 00:11:44.758075 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff]
Sep 9 00:11:44.758125 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref]
Sep 9 00:11:44.758177 kernel: pci 0000:00:17.0: PCI bridge to [bus 13]
Sep 9 00:11:44.758227 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff]
Sep 9 00:11:44.758288 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff]
Sep 9 00:11:44.758340 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref]
Sep 9 00:11:44.758393 kernel: pci 0000:00:17.1: PCI bridge to [bus 14]
Sep 9 00:11:44.758914 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff]
Sep 9 00:11:44.758972 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff]
Sep 9 00:11:44.759025 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref]
Sep 9 00:11:44.759079 kernel: pci 0000:00:17.2: PCI bridge to [bus 15]
Sep 9 00:11:44.759129 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff]
Sep 9 00:11:44.759180 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff]
Sep 9 00:11:44.759232 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref]
Sep 9 00:11:44.759286 kernel: pci 0000:00:17.3: PCI bridge to [bus 16]
Sep 9 00:11:44.759336 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff]
Sep 9 00:11:44.759389 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref]
Sep 9 00:11:44.759440 kernel: pci 0000:00:17.4: PCI bridge to [bus 17]
Sep 9 00:11:44.759489 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff]
Sep 9 00:11:44.759539 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref]
Sep 9 00:11:44.759590 kernel: pci 0000:00:17.5: PCI bridge to [bus 18]
Sep 9 00:11:44.759640 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff]
Sep 9 00:11:44.759692 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref]
Sep 9 00:11:44.759744 kernel: pci 0000:00:17.6: PCI bridge to [bus 19]
Sep 9 00:11:44.759793 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff]
Sep 9 00:11:44.759843 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref]
Sep 9 00:11:44.759917 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a]
Sep 9 00:11:44.759970 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff]
Sep 9 00:11:44.760021 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref]
Sep 9 00:11:44.760076 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b]
Sep 9 00:11:44.760127 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff]
Sep 9 00:11:44.760177 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff]
Sep 9 00:11:44.760232 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref]
Sep 9 00:11:44.760283 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c]
Sep 9 00:11:44.760333 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff]
Sep 9 00:11:44.760396 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff]
Sep 9 00:11:44.760466 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref]
Sep 9 00:11:44.760519 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d]
Sep 9 00:11:44.760572 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff]
Sep 9 00:11:44.760622 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref]
Sep 9 00:11:44.760674 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e]
Sep 9 00:11:44.760725 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff]
Sep 9 00:11:44.760774 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref]
Sep 9 00:11:44.760825 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f]
Sep 9 00:11:44.760897 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff]
Sep 9 00:11:44.760948 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref]
Sep 9 00:11:44.761003 kernel: pci 0000:00:18.5: PCI bridge to [bus 20]
Sep 9 00:11:44.761052 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff]
Sep 9 00:11:44.761102 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref]
Sep 9 00:11:44.761154 kernel: pci 0000:00:18.6: PCI bridge to [bus 21]
Sep 9 00:11:44.761205 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff]
Sep 9 00:11:44.761255 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref]
Sep 9 00:11:44.761313 kernel: pci 0000:00:18.7: PCI bridge to [bus 22]
Sep 9 00:11:44.761363 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff]
Sep 9 00:11:44.761413 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref]
Sep 9 00:11:44.761463 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window]
Sep 9 00:11:44.761531 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window]
Sep 9 00:11:44.761578 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window]
Sep 9 00:11:44.761622 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window]
Sep 9 00:11:44.761667 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window]
Sep 9 00:11:44.761719 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff]
Sep 9 00:11:44.761766 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff]
Sep 9 00:11:44.761811 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref]
Sep 9 00:11:44.761865 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window]
Sep 9 00:11:44.761925 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window]
Sep 9 00:11:44.761976 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window]
Sep 9 00:11:44.762023 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window]
Sep 9 00:11:44.762070 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window]
Sep 9 00:11:44.762122 kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff]
Sep 9 00:11:44.762169 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff]
Sep 9 00:11:44.762214 kernel: pci_bus 0000:03: resource 2 [mem 0xc0000000-0xc01fffff 64bit pref]
Sep 9 00:11:44.762266 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff]
Sep 9 00:11:44.762312 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff]
Sep 9 00:11:44.762358 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref]
Sep 9 00:11:44.762410 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff]
Sep 9 00:11:44.762456 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff]
Sep 9 00:11:44.762501 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref]
Sep 9 00:11:44.762550 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff]
Sep 9 00:11:44.762596 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref]
Sep 9 00:11:44.762645 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff]
Sep 9 00:11:44.762692 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref]
Sep 9 00:11:44.762745 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff]
Sep 9 00:11:44.762791 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref]
Sep 9 00:11:44.762842 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff]
Sep 9 00:11:44.762917 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref]
Sep 9 00:11:44.762969 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff]
Sep 9 00:11:44.763014 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref]
Sep 9 00:11:44.763068 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff]
Sep 9 00:11:44.763114 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff]
Sep 9 00:11:44.763159 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref]
Sep 9 00:11:44.763210 kernel: pci_bus 0000:0c: resource 0 [io 0x9000-0x9fff]
Sep 9 00:11:44.763256 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff]
Sep 9 00:11:44.763304 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 64bit pref]
Sep 9 00:11:44.763354 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff]
Sep 9 00:11:44.763401 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff]
Sep 9 00:11:44.763446 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref]
Sep 9 00:11:44.763497 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff]
Sep 9 00:11:44.763543 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref]
Sep 9 00:11:44.763592 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff]
Sep 9 00:11:44.763641 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref]
Sep 9 00:11:44.763693 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff]
Sep 9 00:11:44.763739 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref]
Sep 9 00:11:44.763792 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff]
Sep 9 00:11:44.763838 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref]
Sep 9 00:11:44.764375 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff]
Sep 9 00:11:44.764432 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref]
Sep 9 00:11:44.764486 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff]
Sep 9 00:11:44.764534 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff]
Sep 9 00:11:44.764580 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref]
Sep 9 00:11:44.764631 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff]
Sep 9 00:11:44.764677 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff]
Sep 9 00:11:44.764722 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref]
Sep 9 00:11:44.764777 kernel: pci_bus 0000:15: resource 0 [io 0xe000-0xefff]
Sep 9 00:11:44.764823 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff]
Sep 9 00:11:44.765444 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref]
Sep 9 00:11:44.765505 kernel: pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff]
Sep 9 00:11:44.765555 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref]
Sep 9 00:11:44.765607 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff]
Sep 9 00:11:44.765657 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref]
Sep 9 00:11:44.765708 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff]
Sep 9 00:11:44.765755 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref]
Sep 9 00:11:44.765808 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff]
Sep 9 00:11:44.765989 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref]
Sep 9 00:11:44.766050 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff]
Sep 9 00:11:44.766101 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref]
Sep 9 00:11:44.766152 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff]
Sep 9 00:11:44.766199 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff]
Sep 9 00:11:44.766245 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref]
Sep 9 00:11:44.766296 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff]
Sep 9 00:11:44.766343 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff]
Sep 9 00:11:44.766388 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref]
Sep 9 00:11:44.766442 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff]
Sep 9 00:11:44.766488 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref]
Sep 9 00:11:44.766539 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff]
Sep 9 00:11:44.766586 kernel: pci_bus 0000:1e: resource 2 [mem 0xe6d00000-0xe6dfffff 64bit pref]
Sep 9 00:11:44.766636 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff]
Sep 9 00:11:44.766682 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref]
Sep 9 00:11:44.766736 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff]
Sep 9 00:11:44.766782 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref]
Sep 9 00:11:44.766833 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff]
Sep 9 00:11:44.766897 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref]
Sep 9 00:11:44.766950 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff]
Sep 9 00:11:44.766997 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref]
Sep 9 00:11:44.767056 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Sep 9 00:11:44.767068 kernel: PCI: CLS 32 bytes, default 64
Sep 9 00:11:44.767074 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer
Sep 9 00:11:44.767081 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns
Sep 9 00:11:44.767087 kernel: clocksource: Switched to clocksource tsc
Sep 9 00:11:44.767093 kernel: Initialise system trusted keyrings
Sep 9 00:11:44.767099 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0
Sep 9 00:11:44.767105 kernel: Key type asymmetric registered
Sep 9 00:11:44.767111 kernel: Asymmetric key parser 'x509' registered
Sep 9 00:11:44.767119 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Sep 9 00:11:44.767125 kernel: io scheduler mq-deadline registered
Sep 9 00:11:44.767131 kernel: io scheduler kyber registered
Sep 9 00:11:44.767137 kernel: io scheduler bfq registered
Sep 9 00:11:44.767189 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24
Sep 9 00:11:44.767242 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Sep 9 00:11:44.767295 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25
Sep 9 00:11:44.767346 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Sep 9 00:11:44.767401 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26
Sep 9 00:11:44.767452 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Sep 9 00:11:44.767504 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27
Sep 9 00:11:44.767556 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Sep 9 00:11:44.767609 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28
Sep 9 00:11:44.767659 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Sep 9 00:11:44.767712 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29
Sep 9 00:11:44.767765 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Sep 9 00:11:44.767817 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30
Sep 9 00:11:44.767886 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Sep 9 00:11:44.767941 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31
Sep 9 00:11:44.767991 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Sep 9 00:11:44.768042 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32
Sep 9 00:11:44.768093 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Sep 9 00:11:44.768148 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33
Sep 9 00:11:44.768198 kernel: pcieport 0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Sep 9 00:11:44.768249 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34
Sep 9 00:11:44.768310 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Sep 9 00:11:44.768363 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35
Sep 9 00:11:44.768424 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Sep 9 00:11:44.768479 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36
Sep 9 00:11:44.768530 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Sep 9 00:11:44.768584 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37
Sep 9 00:11:44.768635 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Sep 9 00:11:44.768687 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38
Sep 9 00:11:44.768744 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Sep 9 00:11:44.768796 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39
Sep 9 00:11:44.768849 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Sep 9 00:11:44.769122 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40
Sep 9 00:11:44.769182 kernel: pcieport 0000:00:17.0: pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Sep 9 00:11:44.769236 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41
Sep 9 00:11:44.769287 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Sep 9 00:11:44.769346 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42
Sep 9 00:11:44.769398 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Sep 9 00:11:44.769451 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43
Sep 9 00:11:44.769504 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Sep 9 00:11:44.769556 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44
Sep 9 00:11:44.769610 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Sep 9 00:11:44.769663 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45
Sep 9 00:11:44.769714 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Sep 9 00:11:44.769767 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46
Sep 9 00:11:44.769818 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Sep 9 00:11:44.769891 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47
Sep 9 00:11:44.769945 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Sep 9 00:11:44.770001 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48
Sep 9 00:11:44.770052 kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Sep 9 00:11:44.770104 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49
Sep 9 00:11:44.770155 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Sep 9 00:11:44.770206 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50
Sep 9 00:11:44.770258 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Sep 9 00:11:44.770311 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51
Sep 9 00:11:44.770361 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Sep 9 00:11:44.770415 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52
Sep 9 00:11:44.770467 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Sep 9 00:11:44.770519 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53
Sep 9 00:11:44.770570 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Sep 9 00:11:44.770623 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54
Sep 9 00:11:44.770674 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Sep 9 00:11:44.770725 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55
Sep 9 00:11:44.770779 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Sep 9 00:11:44.770789 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Sep 9 00:11:44.770796 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 9 00:11:44.770802 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Sep 9 00:11:44.770809 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12
Sep 9 00:11:44.770816 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Sep 9 00:11:44.770822 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Sep 9 00:11:44.771238 kernel: rtc_cmos 00:01: registered as rtc0
Sep 9 00:11:44.771298 kernel: rtc_cmos 00:01: setting system clock to 2025-09-09T00:11:44 UTC (1757376704)
Sep 9 00:11:44.771308 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Sep 9 00:11:44.771354 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram
Sep 9 00:11:44.771363 kernel: intel_pstate: CPU model not supported
Sep 9 00:11:44.771369 kernel: NET: Registered PF_INET6 protocol family
Sep 9 00:11:44.771376 kernel: Segment Routing with IPv6
Sep 9 00:11:44.771382 kernel: In-situ OAM (IOAM) with IPv6
Sep 9 00:11:44.771389 kernel: NET: Registered PF_PACKET protocol family
Sep 9 00:11:44.771397 kernel: Key type dns_resolver registered
Sep 9 00:11:44.771403 kernel: IPI shorthand broadcast: enabled
Sep 9 00:11:44.771410 kernel: sched_clock: Marking stable (2637003935, 170369420)->(2820994455, -13621100)
Sep 9 00:11:44.771416 kernel: registered taskstats version 1
Sep 9 00:11:44.771423 kernel: Loading compiled-in X.509 certificates
Sep 9 00:11:44.771429 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.45-flatcar: 08d0986253b18b7fd74c2cc5404da4ba92260e75'
Sep 9 00:11:44.771435 kernel: Demotion targets for Node 0: null
Sep 9 00:11:44.771442 kernel: Key type .fscrypt registered
Sep 9 00:11:44.771448 kernel: Key type fscrypt-provisioning registered
Sep 9 00:11:44.771455 kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 9 00:11:44.771462 kernel: ima: Allocated hash algorithm: sha1
Sep 9 00:11:44.771468 kernel: ima: No architecture policies found
Sep 9 00:11:44.771474 kernel: clk: Disabling unused clocks
Sep 9 00:11:44.771482 kernel: Warning: unable to open an initial console.
Sep 9 00:11:44.771489 kernel: Freeing unused kernel image (initmem) memory: 53832K
Sep 9 00:11:44.771495 kernel: Write protecting the kernel read-only data: 24576k
Sep 9 00:11:44.771501 kernel: Freeing unused kernel image (rodata/data gap) memory: 284K
Sep 9 00:11:44.771509 kernel: Run /init as init process
Sep 9 00:11:44.771515 kernel: with arguments:
Sep 9 00:11:44.771521 kernel: /init
Sep 9 00:11:44.771527 kernel: with environment:
Sep 9 00:11:44.771533 kernel: HOME=/
Sep 9 00:11:44.771540 kernel: TERM=linux
Sep 9 00:11:44.771546 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Sep 9 00:11:44.771553 systemd[1]: Successfully made /usr/ read-only.
Sep 9 00:11:44.771562 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 9 00:11:44.771570 systemd[1]: Detected virtualization vmware.
Sep 9 00:11:44.771577 systemd[1]: Detected architecture x86-64.
Sep 9 00:11:44.771583 systemd[1]: Running in initrd.
Sep 9 00:11:44.771589 systemd[1]: No hostname configured, using default hostname.
Sep 9 00:11:44.771596 systemd[1]: Hostname set to .
Sep 9 00:11:44.771602 systemd[1]: Initializing machine ID from random generator.
Sep 9 00:11:44.771609 systemd[1]: Queued start job for default target initrd.target.
Sep 9 00:11:44.771616 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 00:11:44.771623 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 00:11:44.771631 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 9 00:11:44.771637 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 9 00:11:44.771644 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 9 00:11:44.771651 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 9 00:11:44.771658 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 9 00:11:44.771666 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 9 00:11:44.771672 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 00:11:44.771679 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 9 00:11:44.771685 systemd[1]: Reached target paths.target - Path Units.
Sep 9 00:11:44.771692 systemd[1]: Reached target slices.target - Slice Units.
Sep 9 00:11:44.771698 systemd[1]: Reached target swap.target - Swaps.
Sep 9 00:11:44.771705 systemd[1]: Reached target timers.target - Timer Units.
Sep 9 00:11:44.771712 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 9 00:11:44.771718 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 9 00:11:44.771726 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 9 00:11:44.771732 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Sep 9 00:11:44.771739 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 00:11:44.771745 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 9 00:11:44.771752 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 00:11:44.771759 systemd[1]: Reached target sockets.target - Socket Units.
Sep 9 00:11:44.771765 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 9 00:11:44.771772 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 9 00:11:44.771780 systemd[1]: Finished network-cleanup.service - Network Cleanup. Sep 9 00:11:44.771787 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 9 00:11:44.771793 systemd[1]: Starting systemd-fsck-usr.service... Sep 9 00:11:44.771800 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 9 00:11:44.771806 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 9 00:11:44.771813 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 00:11:44.771831 systemd-journald[243]: Collecting audit messages is disabled. Sep 9 00:11:44.771850 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 9 00:11:44.771870 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 9 00:11:44.771885 systemd[1]: Finished systemd-fsck-usr.service. Sep 9 00:11:44.771892 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 9 00:11:44.771899 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 9 00:11:44.771905 kernel: Bridge firewalling registered Sep 9 00:11:44.771912 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 9 00:11:44.771919 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 9 00:11:44.771925 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 00:11:44.771932 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... 
Sep 9 00:11:44.771940 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 9 00:11:44.771948 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 9 00:11:44.771955 systemd-journald[243]: Journal started Sep 9 00:11:44.771971 systemd-journald[243]: Runtime Journal (/run/log/journal/9d1e95dfd41f4ce9846736a7f6fa8639) is 4.8M, max 38.9M, 34M free. Sep 9 00:11:44.712619 systemd-modules-load[244]: Inserted module 'overlay' Sep 9 00:11:44.773406 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 9 00:11:44.738989 systemd-modules-load[244]: Inserted module 'br_netfilter' Sep 9 00:11:44.775874 systemd[1]: Started systemd-journald.service - Journal Service. Sep 9 00:11:44.776799 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 9 00:11:44.777839 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 9 00:11:44.779643 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 9 00:11:44.781022 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 9 00:11:44.796337 systemd-tmpfiles[279]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 9 00:11:44.798483 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 9 00:11:44.799839 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
Sep 9 00:11:44.803054 dracut-cmdline[278]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=c495f73c03808403ea4f55eb54c843aae6678d256d64068b1371f8afce28979a Sep 9 00:11:44.830160 systemd-resolved[291]: Positive Trust Anchors: Sep 9 00:11:44.830360 systemd-resolved[291]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 9 00:11:44.830383 systemd-resolved[291]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 9 00:11:44.832221 systemd-resolved[291]: Defaulting to hostname 'linux'. Sep 9 00:11:44.833368 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 9 00:11:44.833514 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 9 00:11:44.858874 kernel: SCSI subsystem initialized Sep 9 00:11:44.875874 kernel: Loading iSCSI transport class v2.0-870. 
Sep 9 00:11:44.883870 kernel: iscsi: registered transport (tcp) Sep 9 00:11:44.906879 kernel: iscsi: registered transport (qla4xxx) Sep 9 00:11:44.906922 kernel: QLogic iSCSI HBA Driver Sep 9 00:11:44.918332 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 9 00:11:44.937288 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 9 00:11:44.938403 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 9 00:11:44.961260 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 9 00:11:44.962234 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 9 00:11:44.998873 kernel: raid6: avx2x4 gen() 47151 MB/s Sep 9 00:11:45.015868 kernel: raid6: avx2x2 gen() 53323 MB/s Sep 9 00:11:45.033107 kernel: raid6: avx2x1 gen() 44575 MB/s Sep 9 00:11:45.033151 kernel: raid6: using algorithm avx2x2 gen() 53323 MB/s Sep 9 00:11:45.051083 kernel: raid6: .... xor() 31812 MB/s, rmw enabled Sep 9 00:11:45.051137 kernel: raid6: using avx2x2 recovery algorithm Sep 9 00:11:45.064870 kernel: xor: automatically using best checksumming function avx Sep 9 00:11:45.170879 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 9 00:11:45.174618 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 9 00:11:45.175745 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 9 00:11:45.193016 systemd-udevd[492]: Using default interface naming scheme 'v255'. Sep 9 00:11:45.196343 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 9 00:11:45.197531 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 9 00:11:45.215399 dracut-pre-trigger[498]: rd.md=0: removing MD RAID activation Sep 9 00:11:45.228415 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. 
Sep 9 00:11:45.229334 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 9 00:11:45.309285 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 9 00:11:45.310933 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 9 00:11:45.372874 kernel: VMware PVSCSI driver - version 1.0.7.0-k Sep 9 00:11:45.376875 kernel: vmw_pvscsi: using 64bit dma Sep 9 00:11:45.377886 kernel: vmw_pvscsi: max_id: 16 Sep 9 00:11:45.377898 kernel: vmw_pvscsi: setting ring_pages to 8 Sep 9 00:11:45.384037 kernel: vmw_pvscsi: enabling reqCallThreshold Sep 9 00:11:45.384060 kernel: vmw_pvscsi: driver-based request coalescing enabled Sep 9 00:11:45.384072 kernel: vmw_pvscsi: using MSI-X Sep 9 00:11:45.385554 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254 Sep 9 00:11:45.385908 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0 Sep 9 00:11:45.387887 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6 Sep 9 00:11:45.391295 kernel: VMware vmxnet3 virtual NIC driver - version 1.9.0.0-k-NAPI Sep 9 00:11:45.391312 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2 Sep 9 00:11:45.393329 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps Sep 9 00:11:45.408920 (udev-worker)[551]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte. Sep 9 00:11:45.413881 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0 Sep 9 00:11:45.420378 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 9 00:11:45.420454 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 00:11:45.421117 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 00:11:45.423067 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Sep 9 00:11:45.427869 kernel: cryptd: max_cpu_qlen set to 1000 Sep 9 00:11:45.431867 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB) Sep 9 00:11:45.433881 kernel: sd 0:0:0:0: [sda] Write Protect is off Sep 9 00:11:45.433975 kernel: libata version 3.00 loaded. Sep 9 00:11:45.433988 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00 Sep 9 00:11:45.435864 kernel: sd 0:0:0:0: [sda] Cache data unavailable Sep 9 00:11:45.435945 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through Sep 9 00:11:45.438884 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input2 Sep 9 00:11:45.446788 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 00:11:45.451881 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 9 00:11:45.451902 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Sep 9 00:11:45.457886 kernel: AES CTR mode by8 optimization enabled Sep 9 00:11:45.457908 kernel: ata_piix 0000:00:07.1: version 2.13 Sep 9 00:11:45.459188 kernel: scsi host1: ata_piix Sep 9 00:11:45.459269 kernel: scsi host2: ata_piix Sep 9 00:11:45.460536 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14 lpm-pol 0 Sep 9 00:11:45.460557 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15 lpm-pol 0 Sep 9 00:11:45.503088 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT. Sep 9 00:11:45.508419 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM. Sep 9 00:11:45.513882 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Sep 9 00:11:45.518244 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A. Sep 9 00:11:45.518388 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A. Sep 9 00:11:45.519135 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... 
Sep 9 00:11:45.564879 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 9 00:11:45.636726 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33 Sep 9 00:11:45.641882 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5 Sep 9 00:11:45.682867 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray Sep 9 00:11:45.682990 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Sep 9 00:11:45.697939 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Sep 9 00:11:46.035989 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 9 00:11:46.036311 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 9 00:11:46.036441 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 9 00:11:46.036642 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 9 00:11:46.037302 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 9 00:11:46.052026 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 9 00:11:46.584029 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 9 00:11:46.584066 disk-uuid[635]: The operation has completed successfully. Sep 9 00:11:46.623731 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 9 00:11:46.623968 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 9 00:11:46.634262 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 9 00:11:46.644102 sh[676]: Success Sep 9 00:11:46.658953 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Sep 9 00:11:46.658976 kernel: device-mapper: uevent: version 1.0.3 Sep 9 00:11:46.660360 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 9 00:11:46.666875 kernel: device-mapper: verity: sha256 using shash "sha256-avx2" Sep 9 00:11:46.713557 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 9 00:11:46.715914 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 9 00:11:46.724394 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 9 00:11:46.738483 kernel: BTRFS: device fsid c483a4f4-f0a7-42f4-ac8d-111955dab3a7 devid 1 transid 41 /dev/mapper/usr (254:0) scanned by mount (688) Sep 9 00:11:46.738504 kernel: BTRFS info (device dm-0): first mount of filesystem c483a4f4-f0a7-42f4-ac8d-111955dab3a7 Sep 9 00:11:46.738513 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 9 00:11:46.747209 kernel: BTRFS info (device dm-0): enabling ssd optimizations Sep 9 00:11:46.747224 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 9 00:11:46.747232 kernel: BTRFS info (device dm-0): enabling free space tree Sep 9 00:11:46.749836 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 9 00:11:46.750125 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 9 00:11:46.750694 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments... Sep 9 00:11:46.751920 systemd[1]: Starting ignition-setup.service - Ignition (setup)... 
Sep 9 00:11:46.774875 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (711) Sep 9 00:11:46.776904 kernel: BTRFS info (device sda6): first mount of filesystem 1ca5876a-e169-4e15-a56e-4292fa8c609f Sep 9 00:11:46.776927 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 9 00:11:46.786868 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 9 00:11:46.786884 kernel: BTRFS info (device sda6): enabling free space tree Sep 9 00:11:46.793866 kernel: BTRFS info (device sda6): last unmount of filesystem 1ca5876a-e169-4e15-a56e-4292fa8c609f Sep 9 00:11:46.794698 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 9 00:11:46.797032 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 9 00:11:46.824749 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Sep 9 00:11:46.826687 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
Sep 9 00:11:46.915712 ignition[730]: Ignition 2.21.0 Sep 9 00:11:46.915883 ignition[730]: Stage: fetch-offline Sep 9 00:11:46.915912 ignition[730]: no configs at "/usr/lib/ignition/base.d" Sep 9 00:11:46.915918 ignition[730]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Sep 9 00:11:46.915971 ignition[730]: parsed url from cmdline: "" Sep 9 00:11:46.915973 ignition[730]: no config URL provided Sep 9 00:11:46.915976 ignition[730]: reading system config file "/usr/lib/ignition/user.ign" Sep 9 00:11:46.915980 ignition[730]: no config at "/usr/lib/ignition/user.ign" Sep 9 00:11:46.916352 ignition[730]: config successfully fetched Sep 9 00:11:46.916370 ignition[730]: parsing config with SHA512: 0e8cd9e63fa24a1efa366e8f770639fe446adbdc488979a0fceaf76ceb62b1f1cfabde6b75ce849a4acfa7fef795ae9e4f20d3712dc14113c68a79f8507c20e7 Sep 9 00:11:46.921009 unknown[730]: fetched base config from "system" Sep 9 00:11:46.921014 unknown[730]: fetched user config from "vmware" Sep 9 00:11:46.921504 ignition[730]: fetch-offline: fetch-offline passed Sep 9 00:11:46.921761 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 9 00:11:46.922909 ignition[730]: Ignition finished successfully Sep 9 00:11:46.923179 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 9 00:11:46.923723 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 9 00:11:46.944600 systemd-networkd[866]: lo: Link UP Sep 9 00:11:46.944608 systemd-networkd[866]: lo: Gained carrier Sep 9 00:11:46.945551 systemd-networkd[866]: Enumeration completed Sep 9 00:11:46.945688 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 9 00:11:46.945905 systemd-networkd[866]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network. Sep 9 00:11:46.946167 systemd[1]: Reached target network.target - Network. 
Sep 9 00:11:46.949365 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Sep 9 00:11:46.949468 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Sep 9 00:11:46.946260 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Sep 9 00:11:46.946804 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 9 00:11:46.949254 systemd-networkd[866]: ens192: Link UP Sep 9 00:11:46.949256 systemd-networkd[866]: ens192: Gained carrier Sep 9 00:11:46.966599 ignition[870]: Ignition 2.21.0 Sep 9 00:11:46.966828 ignition[870]: Stage: kargs Sep 9 00:11:46.967034 ignition[870]: no configs at "/usr/lib/ignition/base.d" Sep 9 00:11:46.967040 ignition[870]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Sep 9 00:11:46.967494 ignition[870]: kargs: kargs passed Sep 9 00:11:46.967517 ignition[870]: Ignition finished successfully Sep 9 00:11:46.969318 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 9 00:11:46.970093 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Sep 9 00:11:46.982560 ignition[877]: Ignition 2.21.0 Sep 9 00:11:46.982569 ignition[877]: Stage: disks Sep 9 00:11:46.982651 ignition[877]: no configs at "/usr/lib/ignition/base.d" Sep 9 00:11:46.982657 ignition[877]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Sep 9 00:11:46.983721 ignition[877]: disks: disks passed Sep 9 00:11:46.984025 ignition[877]: Ignition finished successfully Sep 9 00:11:46.984796 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 9 00:11:46.985321 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 9 00:11:46.985442 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 9 00:11:46.985554 systemd[1]: Reached target local-fs.target - Local File Systems. 
Sep 9 00:11:46.985651 systemd[1]: Reached target sysinit.target - System Initialization. Sep 9 00:11:46.985747 systemd[1]: Reached target basic.target - Basic System. Sep 9 00:11:46.987001 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 9 00:11:47.012541 systemd-fsck[885]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Sep 9 00:11:47.014374 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 9 00:11:47.015199 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 9 00:11:47.126899 kernel: EXT4-fs (sda9): mounted filesystem 4b59fff7-9272-4156-91f8-37989d927dc6 r/w with ordered data mode. Quota mode: none. Sep 9 00:11:47.127412 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 9 00:11:47.127978 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 9 00:11:47.130954 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 9 00:11:47.132903 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 9 00:11:47.133288 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Sep 9 00:11:47.133459 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 9 00:11:47.133474 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 9 00:11:47.142314 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 9 00:11:47.143936 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Sep 9 00:11:47.148953 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (893) Sep 9 00:11:47.151755 kernel: BTRFS info (device sda6): first mount of filesystem 1ca5876a-e169-4e15-a56e-4292fa8c609f Sep 9 00:11:47.151784 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 9 00:11:47.158875 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 9 00:11:47.158946 kernel: BTRFS info (device sda6): enabling free space tree Sep 9 00:11:47.159813 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 9 00:11:47.191342 initrd-setup-root[917]: cut: /sysroot/etc/passwd: No such file or directory Sep 9 00:11:47.195332 initrd-setup-root[924]: cut: /sysroot/etc/group: No such file or directory Sep 9 00:11:47.200005 initrd-setup-root[931]: cut: /sysroot/etc/shadow: No such file or directory Sep 9 00:11:47.202193 initrd-setup-root[938]: cut: /sysroot/etc/gshadow: No such file or directory Sep 9 00:11:47.238063 systemd-resolved[291]: Detected conflict on linux IN A 139.178.70.104 Sep 9 00:11:47.238073 systemd-resolved[291]: Hostname conflict, changing published hostname from 'linux' to 'linux10'. Sep 9 00:11:47.264006 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 9 00:11:47.264771 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 9 00:11:47.265922 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 9 00:11:47.277875 kernel: BTRFS info (device sda6): last unmount of filesystem 1ca5876a-e169-4e15-a56e-4292fa8c609f Sep 9 00:11:47.293113 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Sep 9 00:11:47.294204 ignition[1006]: INFO : Ignition 2.21.0 Sep 9 00:11:47.294427 ignition[1006]: INFO : Stage: mount Sep 9 00:11:47.294649 ignition[1006]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 9 00:11:47.294790 ignition[1006]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Sep 9 00:11:47.295528 ignition[1006]: INFO : mount: mount passed Sep 9 00:11:47.295668 ignition[1006]: INFO : Ignition finished successfully Sep 9 00:11:47.296488 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 9 00:11:47.297158 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 9 00:11:47.736392 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 9 00:11:47.737341 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 9 00:11:47.753516 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1018) Sep 9 00:11:47.753553 kernel: BTRFS info (device sda6): first mount of filesystem 1ca5876a-e169-4e15-a56e-4292fa8c609f Sep 9 00:11:47.753567 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 9 00:11:47.758346 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 9 00:11:47.758367 kernel: BTRFS info (device sda6): enabling free space tree Sep 9 00:11:47.759605 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Sep 9 00:11:47.784368 ignition[1034]: INFO : Ignition 2.21.0 Sep 9 00:11:47.784368 ignition[1034]: INFO : Stage: files Sep 9 00:11:47.785625 ignition[1034]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 9 00:11:47.785625 ignition[1034]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Sep 9 00:11:47.785625 ignition[1034]: DEBUG : files: compiled without relabeling support, skipping Sep 9 00:11:47.786554 ignition[1034]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 9 00:11:47.786694 ignition[1034]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 9 00:11:47.788226 ignition[1034]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 9 00:11:47.788456 ignition[1034]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 9 00:11:47.788734 unknown[1034]: wrote ssh authorized keys file for user: core Sep 9 00:11:47.788983 ignition[1034]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 9 00:11:47.790597 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Sep 9 00:11:47.790597 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1 Sep 9 00:11:47.825720 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 9 00:11:48.074977 systemd-networkd[866]: ens192: Gained IPv6LL Sep 9 00:11:48.383237 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Sep 9 00:11:48.383237 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 9 00:11:48.383237 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(4): 
[finished] writing file "/sysroot/home/core/install.sh"
Sep 9 00:11:48.383237 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Sep 9 00:11:48.383237 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Sep 9 00:11:48.383237 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 9 00:11:48.383237 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 9 00:11:48.383237 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 9 00:11:48.383237 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 9 00:11:48.384994 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Sep 9 00:11:48.384994 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 9 00:11:48.384994 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Sep 9 00:11:48.387523 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Sep 9 00:11:48.387523 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Sep 9 00:11:48.387523 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1
Sep 9 00:11:48.867057 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Sep 9 00:11:49.308683 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Sep 9 00:11:49.308683 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Sep 9 00:11:49.309691 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Sep 9 00:11:49.309691 ignition[1034]: INFO : files: op(c): [started] processing unit "prepare-helm.service"
Sep 9 00:11:49.310284 ignition[1034]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 9 00:11:49.310770 ignition[1034]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 9 00:11:49.310770 ignition[1034]: INFO : files: op(c): [finished] processing unit "prepare-helm.service"
Sep 9 00:11:49.310770 ignition[1034]: INFO : files: op(e): [started] processing unit "coreos-metadata.service"
Sep 9 00:11:49.310770 ignition[1034]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Sep 9 00:11:49.310770 ignition[1034]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Sep 9 00:11:49.310770 ignition[1034]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service"
Sep 9 00:11:49.310770 ignition[1034]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service"
Sep 9 00:11:49.335692 ignition[1034]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service"
Sep 9 00:11:49.338736 ignition[1034]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Sep 9 00:11:49.338736 ignition[1034]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service"
Sep 9 00:11:49.338736 ignition[1034]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service"
Sep 9 00:11:49.338736 ignition[1034]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service"
Sep 9 00:11:49.338736 ignition[1034]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 9 00:11:49.338736 ignition[1034]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json"
Sep 9 00:11:49.338736 ignition[1034]: INFO : files: files passed
Sep 9 00:11:49.338736 ignition[1034]: INFO : Ignition finished successfully
Sep 9 00:11:49.340720 systemd[1]: Finished ignition-files.service - Ignition (files).
Sep 9 00:11:49.341889 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Sep 9 00:11:49.342942 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 9 00:11:49.355331 systemd[1]: ignition-quench.service: Deactivated successfully.
Sep 9 00:11:49.355404 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Sep 9 00:11:49.357840 initrd-setup-root-after-ignition[1066]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 9 00:11:49.358167 initrd-setup-root-after-ignition[1066]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Sep 9 00:11:49.358997 initrd-setup-root-after-ignition[1070]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 9 00:11:49.359986 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 9 00:11:49.360204 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Sep 9 00:11:49.360749 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Sep 9 00:11:49.396113 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 9 00:11:49.396192 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Sep 9 00:11:49.396467 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Sep 9 00:11:49.396578 systemd[1]: Reached target initrd.target - Initrd Default Target.
Sep 9 00:11:49.396780 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Sep 9 00:11:49.397250 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Sep 9 00:11:49.424454 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 9 00:11:49.425291 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Sep 9 00:11:49.435644 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Sep 9 00:11:49.435817 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 00:11:49.436049 systemd[1]: Stopped target timers.target - Timer Units.
Sep 9 00:11:49.436228 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 9 00:11:49.436307 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 9 00:11:49.436595 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Sep 9 00:11:49.436811 systemd[1]: Stopped target basic.target - Basic System.
Sep 9 00:11:49.437030 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Sep 9 00:11:49.437216 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 9 00:11:49.437416 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Sep 9 00:11:49.437631 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Sep 9 00:11:49.437834 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Sep 9 00:11:49.438081 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 9 00:11:49.438294 systemd[1]: Stopped target sysinit.target - System Initialization.
Sep 9 00:11:49.438542 systemd[1]: Stopped target local-fs.target - Local File Systems.
Sep 9 00:11:49.438763 systemd[1]: Stopped target swap.target - Swaps.
Sep 9 00:11:49.438970 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 9 00:11:49.439040 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Sep 9 00:11:49.439288 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Sep 9 00:11:49.439443 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 00:11:49.439617 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Sep 9 00:11:49.439664 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 00:11:49.439825 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 9 00:11:49.439909 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Sep 9 00:11:49.440167 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Sep 9 00:11:49.440232 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 9 00:11:49.440473 systemd[1]: Stopped target paths.target - Path Units.
Sep 9 00:11:49.440644 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 9 00:11:49.443882 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 00:11:49.444056 systemd[1]: Stopped target slices.target - Slice Units.
Sep 9 00:11:49.444279 systemd[1]: Stopped target sockets.target - Socket Units.
Sep 9 00:11:49.444452 systemd[1]: iscsid.socket: Deactivated successfully.
Sep 9 00:11:49.444513 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Sep 9 00:11:49.444663 systemd[1]: iscsiuio.socket: Deactivated successfully.
Sep 9 00:11:49.444714 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 9 00:11:49.444898 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Sep 9 00:11:49.444985 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 9 00:11:49.445234 systemd[1]: ignition-files.service: Deactivated successfully.
Sep 9 00:11:49.445307 systemd[1]: Stopped ignition-files.service - Ignition (files).
Sep 9 00:11:49.445945 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Sep 9 00:11:49.447337 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Sep 9 00:11:49.447440 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 9 00:11:49.447507 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 00:11:49.447664 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 9 00:11:49.447724 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 9 00:11:49.450766 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 9 00:11:49.458910 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Sep 9 00:11:49.466481 ignition[1091]: INFO : Ignition 2.21.0
Sep 9 00:11:49.466481 ignition[1091]: INFO : Stage: umount
Sep 9 00:11:49.466871 ignition[1091]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 9 00:11:49.466871 ignition[1091]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Sep 9 00:11:49.467169 ignition[1091]: INFO : umount: umount passed
Sep 9 00:11:49.467717 ignition[1091]: INFO : Ignition finished successfully
Sep 9 00:11:49.468171 systemd[1]: ignition-mount.service: Deactivated successfully.
Sep 9 00:11:49.468244 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Sep 9 00:11:49.468482 systemd[1]: Stopped target network.target - Network.
Sep 9 00:11:49.468583 systemd[1]: ignition-disks.service: Deactivated successfully.
Sep 9 00:11:49.468610 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Sep 9 00:11:49.468757 systemd[1]: ignition-kargs.service: Deactivated successfully.
Sep 9 00:11:49.468780 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Sep 9 00:11:49.468930 systemd[1]: ignition-setup.service: Deactivated successfully.
Sep 9 00:11:49.468952 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Sep 9 00:11:49.469102 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Sep 9 00:11:49.469122 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Sep 9 00:11:49.469469 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Sep 9 00:11:49.469612 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Sep 9 00:11:49.474992 systemd[1]: systemd-resolved.service: Deactivated successfully.
Sep 9 00:11:49.475059 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Sep 9 00:11:49.476539 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Sep 9 00:11:49.476694 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 9 00:11:49.476717 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 00:11:49.478122 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Sep 9 00:11:49.478373 systemd[1]: systemd-networkd.service: Deactivated successfully.
Sep 9 00:11:49.478439 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Sep 9 00:11:49.479239 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Sep 9 00:11:49.479438 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Sep 9 00:11:49.479925 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Sep 9 00:11:49.479950 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 00:11:49.480852 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Sep 9 00:11:49.482975 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Sep 9 00:11:49.483006 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 9 00:11:49.483229 systemd[1]: afterburn-network-kargs.service: Deactivated successfully.
Sep 9 00:11:49.483250 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments.
Sep 9 00:11:49.483442 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 9 00:11:49.483466 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Sep 9 00:11:49.483800 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 9 00:11:49.483821 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Sep 9 00:11:49.483981 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 00:11:49.485168 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 9 00:11:49.489582 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Sep 9 00:11:49.497167 systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 9 00:11:49.497410 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 00:11:49.497778 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 9 00:11:49.497813 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Sep 9 00:11:49.498441 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 9 00:11:49.498573 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 00:11:49.498808 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 9 00:11:49.498833 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Sep 9 00:11:49.499265 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 9 00:11:49.499290 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Sep 9 00:11:49.499671 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 9 00:11:49.499698 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 9 00:11:49.500541 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Sep 9 00:11:49.500794 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Sep 9 00:11:49.500972 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 00:11:49.501935 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 9 00:11:49.501961 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 00:11:49.502414 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Sep 9 00:11:49.502558 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 9 00:11:49.502935 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 9 00:11:49.503065 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 00:11:49.503324 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 9 00:11:49.503346 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 00:11:49.504118 systemd[1]: network-cleanup.service: Deactivated successfully.
Sep 9 00:11:49.504984 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Sep 9 00:11:49.507923 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 9 00:11:49.507976 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Sep 9 00:11:50.009306 systemd[1]: sysroot-boot.service: Deactivated successfully.
Sep 9 00:11:50.009379 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Sep 9 00:11:50.009760 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Sep 9 00:11:50.009894 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Sep 9 00:11:50.009926 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Sep 9 00:11:50.010504 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Sep 9 00:11:50.028718 systemd[1]: Switching root.
Sep 9 00:11:50.080827 systemd-journald[243]: Journal stopped
Sep 9 00:11:51.637670 systemd-journald[243]: Received SIGTERM from PID 1 (systemd).
Sep 9 00:11:51.637705 kernel: SELinux: policy capability network_peer_controls=1
Sep 9 00:11:51.637714 kernel: SELinux: policy capability open_perms=1
Sep 9 00:11:51.637720 kernel: SELinux: policy capability extended_socket_class=1
Sep 9 00:11:51.637726 kernel: SELinux: policy capability always_check_network=0
Sep 9 00:11:51.637733 kernel: SELinux: policy capability cgroup_seclabel=1
Sep 9 00:11:51.637739 kernel: SELinux: policy capability nnp_nosuid_transition=1
Sep 9 00:11:51.637745 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Sep 9 00:11:51.637751 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Sep 9 00:11:51.637757 kernel: SELinux: policy capability userspace_initial_context=0
Sep 9 00:11:51.637763 systemd[1]: Successfully loaded SELinux policy in 32.371ms.
Sep 9 00:11:51.637770 kernel: audit: type=1403 audit(1757376710.927:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 9 00:11:51.637778 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 7.492ms.
Sep 9 00:11:51.637786 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 9 00:11:51.637793 systemd[1]: Detected virtualization vmware.
Sep 9 00:11:51.637800 systemd[1]: Detected architecture x86-64.
Sep 9 00:11:51.637807 systemd[1]: Detected first boot.
Sep 9 00:11:51.637816 systemd[1]: Initializing machine ID from random generator.
Sep 9 00:11:51.637822 zram_generator::config[1134]: No configuration found.
Sep 9 00:11:51.637942 kernel: vmw_vmci 0000:00:07.7: Using capabilities 0xc
Sep 9 00:11:51.637953 kernel: Guest personality initialized and is active
Sep 9 00:11:51.637960 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
Sep 9 00:11:51.637966 kernel: Initialized host personality
Sep 9 00:11:51.637975 kernel: NET: Registered PF_VSOCK protocol family
Sep 9 00:11:51.637981 systemd[1]: Populated /etc with preset unit settings.
Sep 9 00:11:51.637994 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Sep 9 00:11:51.638007 systemd[1]: COREOS_CUSTOM_PUBLIC_IPV4=$(ip addr show ens192 | grep -v "inet 10." | grep -Po "inet \K[\d.]+")" > ${OUTPUT}"
Sep 9 00:11:51.638014 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Sep 9 00:11:51.638021 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 9 00:11:51.638027 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Sep 9 00:11:51.638036 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 9 00:11:51.638044 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Sep 9 00:11:51.638051 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Sep 9 00:11:51.638061 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Sep 9 00:11:51.638069 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Sep 9 00:11:51.638076 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Sep 9 00:11:51.638083 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Sep 9 00:11:51.638092 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Sep 9 00:11:51.638099 systemd[1]: Created slice user.slice - User and Session Slice.
Sep 9 00:11:51.638106 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 00:11:51.638115 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 00:11:51.638123 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 9 00:11:51.638130 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 9 00:11:51.638137 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 9 00:11:51.638144 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 9 00:11:51.638152 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Sep 9 00:11:51.638160 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 00:11:51.638167 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 9 00:11:51.638174 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Sep 9 00:11:51.638181 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Sep 9 00:11:51.638187 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Sep 9 00:11:51.638194 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 9 00:11:51.638202 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 00:11:51.638210 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 9 00:11:51.638217 systemd[1]: Reached target slices.target - Slice Units.
Sep 9 00:11:51.638224 systemd[1]: Reached target swap.target - Swaps.
Sep 9 00:11:51.638232 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 9 00:11:51.638240 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 9 00:11:51.638248 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Sep 9 00:11:51.638255 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 00:11:51.638263 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 9 00:11:51.638270 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 00:11:51.638276 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 9 00:11:51.638283 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 9 00:11:51.638291 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 9 00:11:51.638298 systemd[1]: Mounting media.mount - External Media Directory...
Sep 9 00:11:51.638306 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 00:11:51.638314 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 9 00:11:51.638321 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 9 00:11:51.638328 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 9 00:11:51.638335 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 9 00:11:51.638342 systemd[1]: Reached target machines.target - Containers.
Sep 9 00:11:51.638349 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 9 00:11:51.638356 systemd[1]: Starting ignition-delete-config.service - Ignition (delete config)...
Sep 9 00:11:51.638365 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 9 00:11:51.638372 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 9 00:11:51.638379 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 00:11:51.638387 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 9 00:11:51.638394 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 00:11:51.638401 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 9 00:11:51.638408 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 00:11:51.638415 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 9 00:11:51.638424 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 9 00:11:51.638431 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Sep 9 00:11:51.638438 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Sep 9 00:11:51.638445 systemd[1]: Stopped systemd-fsck-usr.service.
Sep 9 00:11:51.638452 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 00:11:51.638460 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 9 00:11:51.638467 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 9 00:11:51.638474 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 9 00:11:51.638483 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 9 00:11:51.638490 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Sep 9 00:11:51.638497 kernel: fuse: init (API version 7.41)
Sep 9 00:11:51.638503 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 9 00:11:51.638510 systemd[1]: verity-setup.service: Deactivated successfully.
Sep 9 00:11:51.638518 systemd[1]: Stopped verity-setup.service.
Sep 9 00:11:51.638525 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 00:11:51.638532 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 9 00:11:51.638540 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 9 00:11:51.638548 systemd[1]: Mounted media.mount - External Media Directory.
Sep 9 00:11:51.638555 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 9 00:11:51.638563 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 9 00:11:51.638570 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 9 00:11:51.638577 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 9 00:11:51.638584 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 00:11:51.638591 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 9 00:11:51.638598 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 9 00:11:51.638606 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 00:11:51.638614 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 00:11:51.638621 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 00:11:51.638628 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 00:11:51.638635 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 9 00:11:51.638642 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 9 00:11:51.638649 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 9 00:11:51.638657 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 00:11:51.638664 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 9 00:11:51.638672 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Sep 9 00:11:51.638680 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 9 00:11:51.638689 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 9 00:11:51.638699 kernel: loop: module loaded
Sep 9 00:11:51.638707 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 9 00:11:51.638715 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 9 00:11:51.638722 kernel: ACPI: bus type drm_connector registered
Sep 9 00:11:51.638731 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 9 00:11:51.638738 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Sep 9 00:11:51.638745 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 9 00:11:51.638770 systemd-journald[1238]: Collecting audit messages is disabled.
Sep 9 00:11:51.638789 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 00:11:51.638798 systemd-journald[1238]: Journal started
Sep 9 00:11:51.638814 systemd-journald[1238]: Runtime Journal (/run/log/journal/1c6428ffa48d4e12a6434ee46c13410f) is 4.8M, max 38.9M, 34M free.
Sep 9 00:11:51.402190 systemd[1]: Queued start job for default target multi-user.target.
Sep 9 00:11:51.415153 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Sep 9 00:11:51.415408 systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 9 00:11:51.642343 jq[1204]: true
Sep 9 00:11:51.642902 jq[1250]: true
Sep 9 00:11:51.643904 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 9 00:11:51.645875 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 9 00:11:51.647873 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 9 00:11:51.651140 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 9 00:11:51.652951 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 9 00:11:51.662758 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 9 00:11:51.662800 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 9 00:11:51.661583 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 9 00:11:51.661724 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 9 00:11:51.661963 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 00:11:51.662073 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 00:11:51.662722 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 9 00:11:51.663282 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 9 00:11:51.675978 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 9 00:11:51.676129 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 9 00:11:51.697588 systemd-journald[1238]: Time spent on flushing to /var/log/journal/1c6428ffa48d4e12a6434ee46c13410f is 31.810ms for 1763 entries.
Sep 9 00:11:51.697588 systemd-journald[1238]: System Journal (/var/log/journal/1c6428ffa48d4e12a6434ee46c13410f) is 8M, max 584.8M, 576.8M free.
Sep 9 00:11:51.758557 systemd-journald[1238]: Received client request to flush runtime journal.
Sep 9 00:11:51.758596 kernel: loop0: detected capacity change from 0 to 2960
Sep 9 00:11:51.697682 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 9 00:11:51.722035 ignition[1251]: Ignition 2.21.0
Sep 9 00:11:51.698894 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 9 00:11:51.722211 ignition[1251]: deleting config from guestinfo properties
Sep 9 00:11:51.699507 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 9 00:11:51.750994 ignition[1251]: Successfully deleted config
Sep 9 00:11:51.704635 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Sep 9 00:11:51.752964 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 00:11:51.753364 systemd[1]: Finished ignition-delete-config.service - Ignition (delete config).
Sep 9 00:11:51.762178 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 9 00:11:51.769111 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Sep 9 00:11:51.770048 systemd-tmpfiles[1259]: ACLs are not supported, ignoring.
Sep 9 00:11:51.770059 systemd-tmpfiles[1259]: ACLs are not supported, ignoring.
Sep 9 00:11:51.772969 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 9 00:11:51.776938 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 9 00:11:51.792871 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 9 00:11:51.810906 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 9 00:11:51.812934 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 9 00:11:51.818647 kernel: loop1: detected capacity change from 0 to 113872
Sep 9 00:11:51.830430 systemd-tmpfiles[1305]: ACLs are not supported, ignoring.
Sep 9 00:11:51.830442 systemd-tmpfiles[1305]: ACLs are not supported, ignoring.
Sep 9 00:11:51.834673 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 00:11:51.851873 kernel: loop2: detected capacity change from 0 to 224512
Sep 9 00:11:51.928872 kernel: loop3: detected capacity change from 0 to 146240
Sep 9 00:11:51.972882 kernel: loop4: detected capacity change from 0 to 2960
Sep 9 00:11:52.011879 kernel: loop5: detected capacity change from 0 to 113872
Sep 9 00:11:52.043902 kernel: loop6: detected capacity change from 0 to 224512
Sep 9 00:11:52.124877 kernel: loop7: detected capacity change from 0 to 146240
Sep 9 00:11:52.164028 (sd-merge)[1312]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-vmware'.
Sep 9 00:11:52.164318 (sd-merge)[1312]: Merged extensions into '/usr'.
Sep 9 00:11:52.174957 systemd[1]: Reload requested from client PID 1258 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 9 00:11:52.174967 systemd[1]: Reloading...
Sep 9 00:11:52.212896 zram_generator::config[1337]: No configuration found.
Sep 9 00:11:52.287796 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 9 00:11:52.297757 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Sep 9 00:11:52.369922 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 9 00:11:52.370113 systemd[1]: Reloading finished in 194 ms.
Sep 9 00:11:52.384439 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 9 00:11:52.384993 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 9 00:11:52.394101 systemd[1]: Starting ensure-sysext.service... Sep 9 00:11:52.395243 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 9 00:11:52.398386 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 9 00:11:52.412516 systemd-tmpfiles[1395]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Sep 9 00:11:52.412537 systemd-tmpfiles[1395]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Sep 9 00:11:52.412703 systemd-tmpfiles[1395]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 9 00:11:52.412880 systemd-tmpfiles[1395]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 9 00:11:52.413403 systemd-tmpfiles[1395]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 9 00:11:52.413577 systemd-tmpfiles[1395]: ACLs are not supported, ignoring. Sep 9 00:11:52.413611 systemd-tmpfiles[1395]: ACLs are not supported, ignoring. Sep 9 00:11:52.419142 systemd-tmpfiles[1395]: Detected autofs mount point /boot during canonicalization of boot. Sep 9 00:11:52.419150 systemd-tmpfiles[1395]: Skipping /boot Sep 9 00:11:52.420057 systemd[1]: Reload requested from client PID 1394 ('systemctl') (unit ensure-sysext.service)... Sep 9 00:11:52.420068 systemd[1]: Reloading... Sep 9 00:11:52.440130 systemd-tmpfiles[1395]: Detected autofs mount point /boot during canonicalization of boot. Sep 9 00:11:52.440138 systemd-tmpfiles[1395]: Skipping /boot Sep 9 00:11:52.469203 systemd-udevd[1396]: Using default interface naming scheme 'v255'. 
Sep 9 00:11:52.477877 zram_generator::config[1420]: No configuration found. Sep 9 00:11:52.613950 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 9 00:11:52.623681 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Sep 9 00:11:52.630321 ldconfig[1254]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 9 00:11:52.684924 systemd[1]: Reloading finished in 264 ms. Sep 9 00:11:52.690722 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 9 00:11:52.691247 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 9 00:11:52.691958 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 9 00:11:52.703293 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Sep 9 00:11:52.707210 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 9 00:11:52.710022 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 9 00:11:52.712505 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 9 00:11:52.717206 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 9 00:11:52.723125 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 9 00:11:52.725937 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 9 00:11:52.730398 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). 
Sep 9 00:11:52.737665 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 9 00:11:52.739116 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 9 00:11:52.741077 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 9 00:11:52.741283 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 9 00:11:52.741350 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 9 00:11:52.741413 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 9 00:11:52.742683 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 9 00:11:52.747604 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 9 00:11:52.753744 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 9 00:11:52.754903 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 9 00:11:52.755309 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 9 00:11:52.755440 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 9 00:11:52.759902 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 9 00:11:52.767767 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 9 00:11:52.771587 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 9 00:11:52.776423 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 9 00:11:52.780042 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... 
Sep 9 00:11:52.780423 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 9 00:11:52.780887 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 9 00:11:52.784888 kernel: mousedev: PS/2 mouse device common for all mice Sep 9 00:11:52.786682 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 9 00:11:52.787047 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 9 00:11:52.788875 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 9 00:11:52.789535 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 9 00:11:52.790120 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 9 00:11:52.797502 systemd[1]: Finished ensure-sysext.service. Sep 9 00:11:52.799902 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 9 00:11:52.800108 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 9 00:11:52.805072 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Sep 9 00:11:52.806987 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Sep 9 00:11:52.807929 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 9 00:11:52.808133 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 9 00:11:52.808674 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 9 00:11:52.811087 kernel: ACPI: button: Power Button [PWRF] Sep 9 00:11:52.814711 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. 
Sep 9 00:11:52.817645 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 9 00:11:52.818075 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 9 00:11:52.820312 augenrules[1561]: No rules Sep 9 00:11:52.821669 systemd[1]: audit-rules.service: Deactivated successfully. Sep 9 00:11:52.822192 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 9 00:11:52.838063 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 9 00:11:52.840575 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 9 00:11:52.874332 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 9 00:11:52.878179 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Sep 9 00:11:52.880974 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 9 00:11:52.932474 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 9 00:11:52.971711 systemd-networkd[1517]: lo: Link UP Sep 9 00:11:52.971717 systemd-networkd[1517]: lo: Gained carrier Sep 9 00:11:52.979965 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Sep 9 00:11:52.980124 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Sep 9 00:11:52.972582 systemd-networkd[1517]: Enumeration completed Sep 9 00:11:52.972647 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 9 00:11:52.973945 systemd-networkd[1517]: ens192: Configuring with /etc/systemd/network/00-vmware.network. 
Sep 9 00:11:52.976333 systemd-networkd[1517]: ens192: Link UP Sep 9 00:11:52.976434 systemd-networkd[1517]: ens192: Gained carrier Sep 9 00:11:52.977171 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Sep 9 00:11:52.978462 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 9 00:11:52.979844 systemd-resolved[1518]: Positive Trust Anchors: Sep 9 00:11:52.979850 systemd-resolved[1518]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 9 00:11:52.980932 systemd-resolved[1518]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 9 00:11:52.987743 kernel: piix4_smbus 0000:00:07.3: SMBus Host Controller not enabled! Sep 9 00:11:52.986983 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Sep 9 00:11:52.987220 systemd[1]: Reached target time-set.target - System Time Set. Sep 9 00:11:52.989049 systemd-resolved[1518]: Defaulting to hostname 'linux'. Sep 9 00:11:52.990232 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 9 00:11:52.990408 systemd[1]: Reached target network.target - Network. Sep 9 00:11:52.990505 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 9 00:11:52.990628 systemd[1]: Reached target sysinit.target - System Initialization. Sep 9 00:11:52.990778 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. 
Sep 9 00:11:52.990920 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 9 00:11:52.991037 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Sep 9 00:11:52.991249 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 9 00:11:52.991409 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 9 00:11:52.991526 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 9 00:11:52.991641 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 9 00:11:52.991658 systemd[1]: Reached target paths.target - Path Units. Sep 9 00:11:52.991751 systemd[1]: Reached target timers.target - Timer Units. Sep 9 00:11:53.001382 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 9 00:11:53.002599 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 9 00:11:53.006133 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Sep 9 00:11:53.007031 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Sep 9 00:11:53.007311 systemd[1]: Reached target ssh-access.target - SSH Access Available. Sep 9 00:11:53.013686 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 9 00:11:53.014302 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 9 00:11:53.015790 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Sep 9 00:11:53.016210 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 9 00:11:53.017372 systemd[1]: Reached target sockets.target - Socket Units. Sep 9 00:11:53.018193 systemd[1]: Reached target basic.target - Basic System. 
Sep 9 00:11:53.018532 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 9 00:11:53.018553 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 9 00:11:53.019950 systemd[1]: Starting containerd.service - containerd container runtime... Sep 9 00:11:53.021829 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 9 00:11:53.023255 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 9 00:11:53.025497 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 9 00:11:53.026890 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 9 00:11:53.027110 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 9 00:11:53.030068 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Sep 9 00:11:53.034304 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 9 00:11:53.035946 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 9 00:11:53.039522 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 9 00:11:53.042000 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 9 00:13:25.450183 systemd-resolved[1518]: Clock change detected. Flushing caches. Sep 9 00:13:25.450740 systemd-timesyncd[1557]: Contacted time server 166.88.142.52:123 (0.flatcar.pool.ntp.org). Sep 9 00:13:25.450776 systemd-timesyncd[1557]: Initial clock synchronization to Tue 2025-09-09 00:13:25.449661 UTC. Sep 9 00:13:25.454158 jq[1591]: false Sep 9 00:13:25.455483 systemd[1]: Starting systemd-logind.service - User Login Management... 
Sep 9 00:13:25.456178 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Sep 9 00:13:25.456695 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 9 00:13:25.459184 systemd[1]: Starting update-engine.service - Update Engine... Sep 9 00:13:25.463742 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 9 00:13:25.467683 extend-filesystems[1592]: Found /dev/sda6 Sep 9 00:13:25.469455 systemd[1]: Starting vgauthd.service - VGAuth Service for open-vm-tools... Sep 9 00:13:25.473417 google_oslogin_nss_cache[1593]: oslogin_cache_refresh[1593]: Refreshing passwd entry cache Sep 9 00:13:25.473404 oslogin_cache_refresh[1593]: Refreshing passwd entry cache Sep 9 00:13:25.478173 extend-filesystems[1592]: Found /dev/sda9 Sep 9 00:13:25.474797 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 9 00:13:25.478391 extend-filesystems[1592]: Checking size of /dev/sda9 Sep 9 00:13:25.475105 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 9 00:13:25.475259 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 9 00:13:25.480404 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 9 00:13:25.480735 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 9 00:13:25.487718 google_oslogin_nss_cache[1593]: oslogin_cache_refresh[1593]: Failure getting users, quitting Sep 9 00:13:25.487718 google_oslogin_nss_cache[1593]: oslogin_cache_refresh[1593]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. 
Sep 9 00:13:25.487718 google_oslogin_nss_cache[1593]: oslogin_cache_refresh[1593]: Refreshing group entry cache Sep 9 00:13:25.487826 jq[1608]: true Sep 9 00:13:25.484255 oslogin_cache_refresh[1593]: Failure getting users, quitting Sep 9 00:13:25.484270 oslogin_cache_refresh[1593]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 9 00:13:25.484307 oslogin_cache_refresh[1593]: Refreshing group entry cache Sep 9 00:13:25.493759 google_oslogin_nss_cache[1593]: oslogin_cache_refresh[1593]: Failure getting groups, quitting Sep 9 00:13:25.493759 google_oslogin_nss_cache[1593]: oslogin_cache_refresh[1593]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 9 00:13:25.490761 oslogin_cache_refresh[1593]: Failure getting groups, quitting Sep 9 00:13:25.490769 oslogin_cache_refresh[1593]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 9 00:13:25.498484 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Sep 9 00:13:25.499321 extend-filesystems[1592]: Old size kept for /dev/sda9 Sep 9 00:13:25.500069 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Sep 9 00:13:25.500424 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 9 00:13:25.500562 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 9 00:13:25.512529 update_engine[1607]: I20250909 00:13:25.512475 1607 main.cc:92] Flatcar Update Engine starting Sep 9 00:13:25.521058 jq[1628]: true Sep 9 00:13:25.522138 tar[1618]: linux-amd64/LICENSE Sep 9 00:13:25.522138 tar[1618]: linux-amd64/helm Sep 9 00:13:25.526868 dbus-daemon[1589]: [system] SELinux support is enabled Sep 9 00:13:25.527170 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 9 00:13:25.529739 systemd[1]: motdgen.service: Deactivated successfully. 
Sep 9 00:13:25.530376 (ntainerd)[1638]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 9 00:13:25.531283 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 9 00:13:25.531610 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 9 00:13:25.531625 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 9 00:13:25.531774 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 9 00:13:25.531785 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 9 00:13:25.537678 systemd[1]: Started update-engine.service - Update Engine. Sep 9 00:13:25.544873 update_engine[1607]: I20250909 00:13:25.544399 1607 update_check_scheduler.cc:74] Next update check in 4m29s Sep 9 00:13:25.554478 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 9 00:13:25.592551 bash[1666]: Updated "/home/core/.ssh/authorized_keys" Sep 9 00:13:25.594478 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 9 00:13:25.594981 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Sep 9 00:13:25.633188 (udev-worker)[1469]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte. Sep 9 00:13:25.703201 systemd[1]: Started vgauthd.service - VGAuth Service for open-vm-tools. Sep 9 00:13:25.707248 systemd[1]: Starting vmtoolsd.service - Service for virtual machines hosted on VMware... Sep 9 00:13:25.718750 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Sep 9 00:13:25.781060 containerd[1638]: time="2025-09-09T00:13:25Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 9 00:13:25.782689 containerd[1638]: time="2025-09-09T00:13:25.782668106Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4 Sep 9 00:13:25.783246 locksmithd[1646]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 9 00:13:25.822690 containerd[1638]: time="2025-09-09T00:13:25.822655596Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="8.573µs" Sep 9 00:13:25.822690 containerd[1638]: time="2025-09-09T00:13:25.822682455Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 9 00:13:25.822690 containerd[1638]: time="2025-09-09T00:13:25.822696497Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 9 00:13:25.823140 containerd[1638]: time="2025-09-09T00:13:25.822795077Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 9 00:13:25.823140 containerd[1638]: time="2025-09-09T00:13:25.822807440Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 9 00:13:25.823140 containerd[1638]: time="2025-09-09T00:13:25.822823179Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 9 00:13:25.823140 containerd[1638]: time="2025-09-09T00:13:25.822860112Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 9 00:13:25.823140 containerd[1638]: time="2025-09-09T00:13:25.822867685Z" level=info 
msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 9 00:13:25.823140 containerd[1638]: time="2025-09-09T00:13:25.822987057Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 9 00:13:25.823140 containerd[1638]: time="2025-09-09T00:13:25.822996713Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 9 00:13:25.823140 containerd[1638]: time="2025-09-09T00:13:25.823003384Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 9 00:13:25.823140 containerd[1638]: time="2025-09-09T00:13:25.823007905Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 9 00:13:25.823140 containerd[1638]: time="2025-09-09T00:13:25.823058717Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 9 00:13:25.823298 containerd[1638]: time="2025-09-09T00:13:25.823190632Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 9 00:13:25.823298 containerd[1638]: time="2025-09-09T00:13:25.823208588Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 9 00:13:25.823298 containerd[1638]: time="2025-09-09T00:13:25.823215485Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 9 00:13:25.823298 containerd[1638]: time="2025-09-09T00:13:25.823232227Z" level=info msg="loading plugin" 
id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 9 00:13:25.823376 containerd[1638]: time="2025-09-09T00:13:25.823363132Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 9 00:13:25.823413 containerd[1638]: time="2025-09-09T00:13:25.823401167Z" level=info msg="metadata content store policy set" policy=shared Sep 9 00:13:25.837666 systemd[1]: Started vmtoolsd.service - Service for virtual machines hosted on VMware. Sep 9 00:13:25.838179 containerd[1638]: time="2025-09-09T00:13:25.837975723Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 9 00:13:25.838179 containerd[1638]: time="2025-09-09T00:13:25.838012388Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 9 00:13:25.838179 containerd[1638]: time="2025-09-09T00:13:25.838024039Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 9 00:13:25.838179 containerd[1638]: time="2025-09-09T00:13:25.838167845Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 9 00:13:25.838179 containerd[1638]: time="2025-09-09T00:13:25.838178753Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 9 00:13:25.838265 containerd[1638]: time="2025-09-09T00:13:25.838185500Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 9 00:13:25.838265 containerd[1638]: time="2025-09-09T00:13:25.838195692Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 9 00:13:25.838265 containerd[1638]: time="2025-09-09T00:13:25.838203317Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 9 00:13:25.838265 containerd[1638]: 
time="2025-09-09T00:13:25.838209552Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 9 00:13:25.838265 containerd[1638]: time="2025-09-09T00:13:25.838214986Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 9 00:13:25.838265 containerd[1638]: time="2025-09-09T00:13:25.838220159Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 9 00:13:25.838265 containerd[1638]: time="2025-09-09T00:13:25.838227746Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 9 00:13:25.838356 containerd[1638]: time="2025-09-09T00:13:25.838320553Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 9 00:13:25.838356 containerd[1638]: time="2025-09-09T00:13:25.838336537Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 9 00:13:25.838356 containerd[1638]: time="2025-09-09T00:13:25.838346237Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 9 00:13:25.838356 containerd[1638]: time="2025-09-09T00:13:25.838352351Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 9 00:13:25.838408 containerd[1638]: time="2025-09-09T00:13:25.838357920Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 9 00:13:25.838408 containerd[1638]: time="2025-09-09T00:13:25.838363435Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 9 00:13:25.838408 containerd[1638]: time="2025-09-09T00:13:25.838369672Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 9 00:13:25.838408 containerd[1638]: time="2025-09-09T00:13:25.838375288Z" level=info msg="loading plugin" 
id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 9 00:13:25.838408 containerd[1638]: time="2025-09-09T00:13:25.838381670Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 9 00:13:25.838408 containerd[1638]: time="2025-09-09T00:13:25.838387437Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 9 00:13:25.838408 containerd[1638]: time="2025-09-09T00:13:25.838397176Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 9 00:13:25.838504 containerd[1638]: time="2025-09-09T00:13:25.838436194Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 9 00:13:25.838504 containerd[1638]: time="2025-09-09T00:13:25.838444266Z" level=info msg="Start snapshots syncer" Sep 9 00:13:25.838504 containerd[1638]: time="2025-09-09T00:13:25.838470422Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 9 00:13:25.839140 containerd[1638]: time="2025-09-09T00:13:25.838659226Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 9 00:13:25.839140 containerd[1638]: time="2025-09-09T00:13:25.838714074Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 9 00:13:25.839237 containerd[1638]: time="2025-09-09T00:13:25.838759568Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 9 00:13:25.839237 containerd[1638]: time="2025-09-09T00:13:25.838815130Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 9 00:13:25.839237 containerd[1638]: time="2025-09-09T00:13:25.838831740Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 9 00:13:25.839237 containerd[1638]: time="2025-09-09T00:13:25.838838398Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 9 00:13:25.839237 containerd[1638]: time="2025-09-09T00:13:25.838845456Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 9 00:13:25.839237 containerd[1638]: time="2025-09-09T00:13:25.838852248Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 9 00:13:25.839237 containerd[1638]: time="2025-09-09T00:13:25.838857821Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 9 00:13:25.839237 containerd[1638]: time="2025-09-09T00:13:25.838863956Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 9 00:13:25.839237 containerd[1638]: time="2025-09-09T00:13:25.838877174Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 9 00:13:25.839237 containerd[1638]: time="2025-09-09T00:13:25.838883984Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 9 00:13:25.839237 containerd[1638]: time="2025-09-09T00:13:25.838890686Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 9 00:13:25.839237 containerd[1638]: time="2025-09-09T00:13:25.838905626Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 9 00:13:25.839237 containerd[1638]: time="2025-09-09T00:13:25.838913909Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 9 00:13:25.839237 containerd[1638]: time="2025-09-09T00:13:25.838919041Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 9 00:13:25.839427 containerd[1638]: time="2025-09-09T00:13:25.838924299Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 9 00:13:25.839427 containerd[1638]: time="2025-09-09T00:13:25.838929193Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 9 00:13:25.839427 containerd[1638]: time="2025-09-09T00:13:25.838934334Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 9 00:13:25.839427 containerd[1638]: time="2025-09-09T00:13:25.838939641Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 9 00:13:25.839427 containerd[1638]: time="2025-09-09T00:13:25.838948841Z" level=info msg="runtime interface created" Sep 9 00:13:25.839427 containerd[1638]: time="2025-09-09T00:13:25.838951793Z" level=info msg="created NRI interface" Sep 9 00:13:25.839427 containerd[1638]: time="2025-09-09T00:13:25.838956943Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 9 00:13:25.839427 containerd[1638]: time="2025-09-09T00:13:25.838963101Z" level=info msg="Connect containerd service" Sep 9 00:13:25.839427 containerd[1638]: time="2025-09-09T00:13:25.838982039Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 9 00:13:25.843652 containerd[1638]: 
time="2025-09-09T00:13:25.843604182Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 9 00:13:25.847882 unknown[1674]: Pref_Init: Using '/etc/vmware-tools/vgauth.conf' as preferences filepath Sep 9 00:13:25.858506 unknown[1674]: Core dump limit set to -1 Sep 9 00:13:25.918166 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 00:13:25.980005 systemd-logind[1603]: Watching system buttons on /dev/input/event2 (Power Button) Sep 9 00:13:25.980024 systemd-logind[1603]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Sep 9 00:13:25.982206 systemd-logind[1603]: New seat seat0. Sep 9 00:13:25.984660 systemd[1]: Started systemd-logind.service - User Login Management. Sep 9 00:13:26.058096 containerd[1638]: time="2025-09-09T00:13:26.058014735Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 9 00:13:26.058096 containerd[1638]: time="2025-09-09T00:13:26.058089099Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Sep 9 00:13:26.062746 containerd[1638]: time="2025-09-09T00:13:26.062724643Z" level=info msg="Start subscribing containerd event" Sep 9 00:13:26.062800 containerd[1638]: time="2025-09-09T00:13:26.062758694Z" level=info msg="Start recovering state" Sep 9 00:13:26.063065 containerd[1638]: time="2025-09-09T00:13:26.063053930Z" level=info msg="Start event monitor" Sep 9 00:13:26.063094 containerd[1638]: time="2025-09-09T00:13:26.063068620Z" level=info msg="Start cni network conf syncer for default" Sep 9 00:13:26.063094 containerd[1638]: time="2025-09-09T00:13:26.063074383Z" level=info msg="Start streaming server" Sep 9 00:13:26.063141 containerd[1638]: time="2025-09-09T00:13:26.063095948Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 9 00:13:26.063141 containerd[1638]: time="2025-09-09T00:13:26.063100942Z" level=info msg="runtime interface starting up..." Sep 9 00:13:26.063141 containerd[1638]: time="2025-09-09T00:13:26.063104519Z" level=info msg="starting plugins..." Sep 9 00:13:26.063185 containerd[1638]: time="2025-09-09T00:13:26.063150720Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 9 00:13:26.064229 containerd[1638]: time="2025-09-09T00:13:26.064013650Z" level=info msg="containerd successfully booted in 0.283205s" Sep 9 00:13:26.064090 systemd[1]: Started containerd.service - containerd container runtime. Sep 9 00:13:26.075689 sshd_keygen[1640]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 9 00:13:26.092970 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 9 00:13:26.096432 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 9 00:13:26.111909 systemd[1]: issuegen.service: Deactivated successfully. Sep 9 00:13:26.113582 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 9 00:13:26.116307 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... 
Sep 9 00:13:26.140159 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 9 00:13:26.142568 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 9 00:13:26.144910 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Sep 9 00:13:26.145129 systemd[1]: Reached target getty.target - Login Prompts. Sep 9 00:13:26.267983 tar[1618]: linux-amd64/README.md Sep 9 00:13:26.280103 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 9 00:13:26.623253 systemd-networkd[1517]: ens192: Gained IPv6LL Sep 9 00:13:26.624467 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 9 00:13:26.625169 systemd[1]: Reached target network-online.target - Network is Online. Sep 9 00:13:26.626254 systemd[1]: Starting coreos-metadata.service - VMware metadata agent... Sep 9 00:13:26.628832 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 00:13:26.632394 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 9 00:13:26.647607 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 9 00:13:26.667517 systemd[1]: coreos-metadata.service: Deactivated successfully. Sep 9 00:13:26.667669 systemd[1]: Finished coreos-metadata.service - VMware metadata agent. Sep 9 00:13:26.668367 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 9 00:13:27.433411 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 00:13:27.433762 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 9 00:13:27.434585 systemd[1]: Startup finished in 2.687s (kernel) + 6.338s (initrd) + 4.133s (userspace) = 13.159s. 
Sep 9 00:13:27.440549 (kubelet)[1808]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 00:13:27.474175 login[1744]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 9 00:13:27.474292 login[1743]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 9 00:13:27.479701 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 9 00:13:27.480753 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 9 00:13:27.487582 systemd-logind[1603]: New session 2 of user core. Sep 9 00:13:27.492921 systemd-logind[1603]: New session 1 of user core. Sep 9 00:13:27.497870 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 9 00:13:27.499366 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 9 00:13:27.507449 (systemd)[1815]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 9 00:13:27.509159 systemd-logind[1603]: New session c1 of user core. Sep 9 00:13:27.599083 systemd[1815]: Queued start job for default target default.target. Sep 9 00:13:27.603916 systemd[1815]: Created slice app.slice - User Application Slice. Sep 9 00:13:27.603936 systemd[1815]: Reached target paths.target - Paths. Sep 9 00:13:27.603963 systemd[1815]: Reached target timers.target - Timers. Sep 9 00:13:27.607169 systemd[1815]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 9 00:13:27.611630 systemd[1815]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 9 00:13:27.611663 systemd[1815]: Reached target sockets.target - Sockets. Sep 9 00:13:27.611692 systemd[1815]: Reached target basic.target - Basic System. Sep 9 00:13:27.611713 systemd[1815]: Reached target default.target - Main User Target. Sep 9 00:13:27.611730 systemd[1815]: Startup finished in 97ms. 
Sep 9 00:13:27.612209 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 9 00:13:27.619313 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 9 00:13:27.620724 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 9 00:13:27.969465 kubelet[1808]: E0909 00:13:27.969432 1808 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 00:13:27.970972 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 00:13:27.971060 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 00:13:27.971291 systemd[1]: kubelet.service: Consumed 598ms CPU time, 262.8M memory peak. Sep 9 00:13:38.221649 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 9 00:13:38.223021 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 00:13:38.574674 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 00:13:38.576995 (kubelet)[1859]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 00:13:38.631552 kubelet[1859]: E0909 00:13:38.631520 1859 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 00:13:38.634211 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 00:13:38.634316 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
Sep 9 00:13:38.634730 systemd[1]: kubelet.service: Consumed 112ms CPU time, 110.7M memory peak. Sep 9 00:13:48.884854 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 9 00:13:48.886297 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 00:13:49.132376 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 00:13:49.134826 (kubelet)[1874]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 00:13:49.179328 kubelet[1874]: E0909 00:13:49.179266 1874 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 00:13:49.180751 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 00:13:49.180888 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 00:13:49.181247 systemd[1]: kubelet.service: Consumed 103ms CPU time, 110.8M memory peak. Sep 9 00:13:56.078055 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 9 00:13:56.079758 systemd[1]: Started sshd@0-139.178.70.104:22-139.178.68.195:60312.service - OpenSSH per-connection server daemon (139.178.68.195:60312). Sep 9 00:13:56.133281 sshd[1882]: Accepted publickey for core from 139.178.68.195 port 60312 ssh2: RSA SHA256:VfV4DbcB1YJ5ML+Hb+wSNrAGdGs+bVUt3FrVVQ/IlNk Sep 9 00:13:56.134086 sshd-session[1882]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 00:13:56.136611 systemd-logind[1603]: New session 3 of user core. Sep 9 00:13:56.147438 systemd[1]: Started session-3.scope - Session 3 of User core. 
Sep 9 00:13:56.200904 systemd[1]: Started sshd@1-139.178.70.104:22-139.178.68.195:60314.service - OpenSSH per-connection server daemon (139.178.68.195:60314). Sep 9 00:13:56.242385 sshd[1887]: Accepted publickey for core from 139.178.68.195 port 60314 ssh2: RSA SHA256:VfV4DbcB1YJ5ML+Hb+wSNrAGdGs+bVUt3FrVVQ/IlNk Sep 9 00:13:56.243403 sshd-session[1887]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 00:13:56.246111 systemd-logind[1603]: New session 4 of user core. Sep 9 00:13:56.254461 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 9 00:13:56.303105 sshd[1889]: Connection closed by 139.178.68.195 port 60314 Sep 9 00:13:56.303805 sshd-session[1887]: pam_unix(sshd:session): session closed for user core Sep 9 00:13:56.308962 systemd[1]: sshd@1-139.178.70.104:22-139.178.68.195:60314.service: Deactivated successfully. Sep 9 00:13:56.309808 systemd[1]: session-4.scope: Deactivated successfully. Sep 9 00:13:56.310403 systemd-logind[1603]: Session 4 logged out. Waiting for processes to exit. Sep 9 00:13:56.311771 systemd-logind[1603]: Removed session 4. Sep 9 00:13:56.312509 systemd[1]: Started sshd@2-139.178.70.104:22-139.178.68.195:60320.service - OpenSSH per-connection server daemon (139.178.68.195:60320). Sep 9 00:13:56.356827 sshd[1895]: Accepted publickey for core from 139.178.68.195 port 60320 ssh2: RSA SHA256:VfV4DbcB1YJ5ML+Hb+wSNrAGdGs+bVUt3FrVVQ/IlNk Sep 9 00:13:56.357858 sshd-session[1895]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 00:13:56.361255 systemd-logind[1603]: New session 5 of user core. Sep 9 00:13:56.373400 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 9 00:13:56.419586 sshd[1897]: Connection closed by 139.178.68.195 port 60320 Sep 9 00:13:56.419244 sshd-session[1895]: pam_unix(sshd:session): session closed for user core Sep 9 00:13:56.428254 systemd[1]: sshd@2-139.178.70.104:22-139.178.68.195:60320.service: Deactivated successfully. 
Sep 9 00:13:56.429095 systemd[1]: session-5.scope: Deactivated successfully. Sep 9 00:13:56.429542 systemd-logind[1603]: Session 5 logged out. Waiting for processes to exit. Sep 9 00:13:56.430829 systemd[1]: Started sshd@3-139.178.70.104:22-139.178.68.195:60332.service - OpenSSH per-connection server daemon (139.178.68.195:60332). Sep 9 00:13:56.432310 systemd-logind[1603]: Removed session 5. Sep 9 00:13:56.467795 sshd[1903]: Accepted publickey for core from 139.178.68.195 port 60332 ssh2: RSA SHA256:VfV4DbcB1YJ5ML+Hb+wSNrAGdGs+bVUt3FrVVQ/IlNk Sep 9 00:13:56.468568 sshd-session[1903]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 00:13:56.471985 systemd-logind[1603]: New session 6 of user core. Sep 9 00:13:56.477233 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 9 00:13:56.524930 sshd[1905]: Connection closed by 139.178.68.195 port 60332 Sep 9 00:13:56.525222 sshd-session[1903]: pam_unix(sshd:session): session closed for user core Sep 9 00:13:56.535825 systemd[1]: sshd@3-139.178.70.104:22-139.178.68.195:60332.service: Deactivated successfully. Sep 9 00:13:56.536880 systemd[1]: session-6.scope: Deactivated successfully. Sep 9 00:13:56.537629 systemd-logind[1603]: Session 6 logged out. Waiting for processes to exit. Sep 9 00:13:56.539304 systemd[1]: Started sshd@4-139.178.70.104:22-139.178.68.195:60346.service - OpenSSH per-connection server daemon (139.178.68.195:60346). Sep 9 00:13:56.540142 systemd-logind[1603]: Removed session 6. Sep 9 00:13:56.577372 sshd[1911]: Accepted publickey for core from 139.178.68.195 port 60346 ssh2: RSA SHA256:VfV4DbcB1YJ5ML+Hb+wSNrAGdGs+bVUt3FrVVQ/IlNk Sep 9 00:13:56.578182 sshd-session[1911]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 00:13:56.580880 systemd-logind[1603]: New session 7 of user core. Sep 9 00:13:56.591237 systemd[1]: Started session-7.scope - Session 7 of User core. 
Sep 9 00:13:56.652239 sudo[1914]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 9 00:13:56.652402 sudo[1914]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 00:13:56.664556 sudo[1914]: pam_unix(sudo:session): session closed for user root Sep 9 00:13:56.665358 sshd[1913]: Connection closed by 139.178.68.195 port 60346 Sep 9 00:13:56.665716 sshd-session[1911]: pam_unix(sshd:session): session closed for user core Sep 9 00:13:56.674261 systemd[1]: sshd@4-139.178.70.104:22-139.178.68.195:60346.service: Deactivated successfully. Sep 9 00:13:56.675100 systemd[1]: session-7.scope: Deactivated successfully. Sep 9 00:13:56.675919 systemd-logind[1603]: Session 7 logged out. Waiting for processes to exit. Sep 9 00:13:56.676962 systemd[1]: Started sshd@5-139.178.70.104:22-139.178.68.195:60350.service - OpenSSH per-connection server daemon (139.178.68.195:60350). Sep 9 00:13:56.678747 systemd-logind[1603]: Removed session 7. Sep 9 00:13:56.720178 sshd[1920]: Accepted publickey for core from 139.178.68.195 port 60350 ssh2: RSA SHA256:VfV4DbcB1YJ5ML+Hb+wSNrAGdGs+bVUt3FrVVQ/IlNk Sep 9 00:13:56.720979 sshd-session[1920]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 00:13:56.724029 systemd-logind[1603]: New session 8 of user core. Sep 9 00:13:56.730217 systemd[1]: Started session-8.scope - Session 8 of User core. 
Sep 9 00:13:56.778999 sudo[1924]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 9 00:13:56.779167 sudo[1924]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 00:13:56.781486 sudo[1924]: pam_unix(sudo:session): session closed for user root Sep 9 00:13:56.784517 sudo[1923]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 9 00:13:56.784852 sudo[1923]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 00:13:56.791009 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 9 00:13:56.817467 augenrules[1946]: No rules Sep 9 00:13:56.818078 systemd[1]: audit-rules.service: Deactivated successfully. Sep 9 00:13:56.818264 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 9 00:13:56.818993 sudo[1923]: pam_unix(sudo:session): session closed for user root Sep 9 00:13:56.820468 sshd[1922]: Connection closed by 139.178.68.195 port 60350 Sep 9 00:13:56.820220 sshd-session[1920]: pam_unix(sshd:session): session closed for user core Sep 9 00:13:56.830550 systemd[1]: sshd@5-139.178.70.104:22-139.178.68.195:60350.service: Deactivated successfully. Sep 9 00:13:56.831948 systemd[1]: session-8.scope: Deactivated successfully. Sep 9 00:13:56.833032 systemd-logind[1603]: Session 8 logged out. Waiting for processes to exit. Sep 9 00:13:56.834750 systemd[1]: Started sshd@6-139.178.70.104:22-139.178.68.195:60356.service - OpenSSH per-connection server daemon (139.178.68.195:60356). Sep 9 00:13:56.835649 systemd-logind[1603]: Removed session 8. 
Sep 9 00:13:56.871465 sshd[1955]: Accepted publickey for core from 139.178.68.195 port 60356 ssh2: RSA SHA256:VfV4DbcB1YJ5ML+Hb+wSNrAGdGs+bVUt3FrVVQ/IlNk Sep 9 00:13:56.872455 sshd-session[1955]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 00:13:56.875587 systemd-logind[1603]: New session 9 of user core. Sep 9 00:13:56.886243 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 9 00:13:56.933727 sudo[1958]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 9 00:13:56.933879 sudo[1958]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 00:13:57.238249 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 9 00:13:57.246525 (dockerd)[1976]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 9 00:13:57.468581 dockerd[1976]: time="2025-09-09T00:13:57.468548092Z" level=info msg="Starting up" Sep 9 00:13:57.469576 dockerd[1976]: time="2025-09-09T00:13:57.469560903Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 9 00:13:57.483497 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport551623852-merged.mount: Deactivated successfully. Sep 9 00:13:57.501006 dockerd[1976]: time="2025-09-09T00:13:57.500767617Z" level=info msg="Loading containers: start." Sep 9 00:13:57.507135 kernel: Initializing XFRM netlink socket Sep 9 00:13:57.667584 systemd-networkd[1517]: docker0: Link UP Sep 9 00:13:57.668585 dockerd[1976]: time="2025-09-09T00:13:57.668563606Z" level=info msg="Loading containers: done." Sep 9 00:13:57.676732 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck1393701769-merged.mount: Deactivated successfully. 
Sep 9 00:13:57.677712 dockerd[1976]: time="2025-09-09T00:13:57.677689606Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 9 00:13:57.677762 dockerd[1976]: time="2025-09-09T00:13:57.677750879Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1 Sep 9 00:13:57.677817 dockerd[1976]: time="2025-09-09T00:13:57.677805939Z" level=info msg="Initializing buildkit" Sep 9 00:13:57.687217 dockerd[1976]: time="2025-09-09T00:13:57.687192537Z" level=info msg="Completed buildkit initialization" Sep 9 00:13:57.691721 dockerd[1976]: time="2025-09-09T00:13:57.691699822Z" level=info msg="Daemon has completed initialization" Sep 9 00:13:57.692136 dockerd[1976]: time="2025-09-09T00:13:57.691790381Z" level=info msg="API listen on /run/docker.sock" Sep 9 00:13:57.691814 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 9 00:13:58.415130 containerd[1638]: time="2025-09-09T00:13:58.415082080Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.8\"" Sep 9 00:13:59.070125 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount843636987.mount: Deactivated successfully. Sep 9 00:13:59.431427 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Sep 9 00:13:59.435308 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 00:13:59.765454 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 9 00:13:59.768001 (kubelet)[2235]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 00:13:59.792521 kubelet[2235]: E0909 00:13:59.792497 2235 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 00:13:59.793635 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 00:13:59.793719 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 00:13:59.794058 systemd[1]: kubelet.service: Consumed 94ms CPU time, 108.3M memory peak. Sep 9 00:14:00.342537 containerd[1638]: time="2025-09-09T00:14:00.342495599Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 00:14:00.346300 containerd[1638]: time="2025-09-09T00:14:00.346270178Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.8: active requests=0, bytes read=28800687" Sep 9 00:14:00.354456 containerd[1638]: time="2025-09-09T00:14:00.354426693Z" level=info msg="ImageCreate event name:\"sha256:0d4edaa48e2f940c934e0f7cfd5209fc85e65ab5e842b980f41263d1764661f1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 00:14:00.398468 containerd[1638]: time="2025-09-09T00:14:00.398428363Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:6e1a2f9b24f69ee77d0c0edaf32b31fdbb5e1a613f4476272197e6e1e239050b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 00:14:00.399485 containerd[1638]: time="2025-09-09T00:14:00.399347304Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.8\" with image id 
\"sha256:0d4edaa48e2f940c934e0f7cfd5209fc85e65ab5e842b980f41263d1764661f1\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.8\", repo digest \"registry.k8s.io/kube-apiserver@sha256:6e1a2f9b24f69ee77d0c0edaf32b31fdbb5e1a613f4476272197e6e1e239050b\", size \"28797487\" in 1.984220917s" Sep 9 00:14:00.399485 containerd[1638]: time="2025-09-09T00:14:00.399374785Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.8\" returns image reference \"sha256:0d4edaa48e2f940c934e0f7cfd5209fc85e65ab5e842b980f41263d1764661f1\"" Sep 9 00:14:00.400182 containerd[1638]: time="2025-09-09T00:14:00.400051272Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.8\"" Sep 9 00:14:02.205463 containerd[1638]: time="2025-09-09T00:14:02.205112685Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 00:14:02.306994 containerd[1638]: time="2025-09-09T00:14:02.306962087Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.8: active requests=0, bytes read=24784128" Sep 9 00:14:02.601210 containerd[1638]: time="2025-09-09T00:14:02.600901122Z" level=info msg="ImageCreate event name:\"sha256:b248d0b0c74ad8230e0bae0cbed477560e8a1e8c7ef5f29b7e75c1f273c8a091\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 00:14:02.613639 containerd[1638]: time="2025-09-09T00:14:02.613615466Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:8788ccd28ceed9e2e5f8fc31375ef5771df8ea6e518b362c9a06f3cc709cd6c7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 00:14:02.614314 containerd[1638]: time="2025-09-09T00:14:02.614291740Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.8\" with image id \"sha256:b248d0b0c74ad8230e0bae0cbed477560e8a1e8c7ef5f29b7e75c1f273c8a091\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.8\", repo digest 
\"registry.k8s.io/kube-controller-manager@sha256:8788ccd28ceed9e2e5f8fc31375ef5771df8ea6e518b362c9a06f3cc709cd6c7\", size \"26387322\" in 2.214127132s" Sep 9 00:14:02.614357 containerd[1638]: time="2025-09-09T00:14:02.614313657Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.8\" returns image reference \"sha256:b248d0b0c74ad8230e0bae0cbed477560e8a1e8c7ef5f29b7e75c1f273c8a091\"" Sep 9 00:14:02.614607 containerd[1638]: time="2025-09-09T00:14:02.614591954Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.8\"" Sep 9 00:14:03.983967 containerd[1638]: time="2025-09-09T00:14:03.983499106Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 00:14:03.984516 containerd[1638]: time="2025-09-09T00:14:03.984492478Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.8: active requests=0, bytes read=19175036" Sep 9 00:14:03.984781 containerd[1638]: time="2025-09-09T00:14:03.984769814Z" level=info msg="ImageCreate event name:\"sha256:2ac266f06c9a5a3d0d20ae482dbccb54d3be454d5ca49f48b528bdf5bae3e908\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 00:14:03.986438 containerd[1638]: time="2025-09-09T00:14:03.986426174Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:43c58bcbd1c7812dd19f8bfa5ae11093ebefd28699453ce86fc710869e155cd4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 00:14:03.986806 containerd[1638]: time="2025-09-09T00:14:03.986728324Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.8\" with image id \"sha256:2ac266f06c9a5a3d0d20ae482dbccb54d3be454d5ca49f48b528bdf5bae3e908\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.8\", repo digest \"registry.k8s.io/kube-scheduler@sha256:43c58bcbd1c7812dd19f8bfa5ae11093ebefd28699453ce86fc710869e155cd4\", size \"20778248\" in 1.371686333s" Sep 9 00:14:03.987188 containerd[1638]: 
time="2025-09-09T00:14:03.987178651Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.8\" returns image reference \"sha256:2ac266f06c9a5a3d0d20ae482dbccb54d3be454d5ca49f48b528bdf5bae3e908\"" Sep 9 00:14:03.987490 containerd[1638]: time="2025-09-09T00:14:03.987476864Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.8\"" Sep 9 00:14:05.386892 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4068830217.mount: Deactivated successfully. Sep 9 00:14:05.806367 containerd[1638]: time="2025-09-09T00:14:05.806289575Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 00:14:05.811779 containerd[1638]: time="2025-09-09T00:14:05.811745888Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.8: active requests=0, bytes read=30897170" Sep 9 00:14:05.820035 containerd[1638]: time="2025-09-09T00:14:05.820007126Z" level=info msg="ImageCreate event name:\"sha256:d7b94972d43c5d6ce8088a8bcd08614a5ecf2bf04166232c688adcd0b8ed4b12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 00:14:05.824625 containerd[1638]: time="2025-09-09T00:14:05.824604953Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:adc1335b480ddd833aac3b0bd20f68ff0f3c3cf7a0bd337933b006d9f5cec40a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 00:14:05.824888 containerd[1638]: time="2025-09-09T00:14:05.824804956Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.8\" with image id \"sha256:d7b94972d43c5d6ce8088a8bcd08614a5ecf2bf04166232c688adcd0b8ed4b12\", repo tag \"registry.k8s.io/kube-proxy:v1.32.8\", repo digest \"registry.k8s.io/kube-proxy@sha256:adc1335b480ddd833aac3b0bd20f68ff0f3c3cf7a0bd337933b006d9f5cec40a\", size \"30896189\" in 1.837262393s" Sep 9 00:14:05.824888 containerd[1638]: time="2025-09-09T00:14:05.824821831Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.8\" 
returns image reference \"sha256:d7b94972d43c5d6ce8088a8bcd08614a5ecf2bf04166232c688adcd0b8ed4b12\"" Sep 9 00:14:05.825091 containerd[1638]: time="2025-09-09T00:14:05.825071636Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Sep 9 00:14:06.428332 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2867332433.mount: Deactivated successfully. Sep 9 00:14:07.366967 containerd[1638]: time="2025-09-09T00:14:07.366304613Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 00:14:07.371337 containerd[1638]: time="2025-09-09T00:14:07.371321337Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241" Sep 9 00:14:07.379729 containerd[1638]: time="2025-09-09T00:14:07.379698476Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 00:14:07.385862 containerd[1638]: time="2025-09-09T00:14:07.385841835Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 00:14:07.386787 containerd[1638]: time="2025-09-09T00:14:07.386751771Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.56161s" Sep 9 00:14:07.386832 containerd[1638]: time="2025-09-09T00:14:07.386787958Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference 
\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Sep 9 00:14:07.387199 containerd[1638]: time="2025-09-09T00:14:07.387181199Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 9 00:14:07.975206 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2838054444.mount: Deactivated successfully. Sep 9 00:14:07.987689 containerd[1638]: time="2025-09-09T00:14:07.987660371Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 9 00:14:07.988055 containerd[1638]: time="2025-09-09T00:14:07.988034199Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Sep 9 00:14:07.989065 containerd[1638]: time="2025-09-09T00:14:07.988314716Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 9 00:14:07.989438 containerd[1638]: time="2025-09-09T00:14:07.989421608Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 9 00:14:07.990017 containerd[1638]: time="2025-09-09T00:14:07.989998325Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 602.797369ms" Sep 9 00:14:07.990077 containerd[1638]: time="2025-09-09T00:14:07.990065704Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image 
reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Sep 9 00:14:07.990382 containerd[1638]: time="2025-09-09T00:14:07.990365041Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Sep 9 00:14:08.552044 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2626321267.mount: Deactivated successfully. Sep 9 00:14:10.021081 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Sep 9 00:14:10.022232 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 00:14:10.211429 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 00:14:10.214298 (kubelet)[2375]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 00:14:10.407370 kubelet[2375]: E0909 00:14:10.407290 2375 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 00:14:10.408825 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 00:14:10.408913 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 00:14:10.409303 systemd[1]: kubelet.service: Consumed 104ms CPU time, 110.3M memory peak. Sep 9 00:14:11.289148 update_engine[1607]: I20250909 00:14:11.288921 1607 update_attempter.cc:509] Updating boot flags... 
Sep 9 00:14:13.667195 containerd[1638]: time="2025-09-09T00:14:13.667008162Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 00:14:13.692792 containerd[1638]: time="2025-09-09T00:14:13.692765687Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57682056" Sep 9 00:14:13.717226 containerd[1638]: time="2025-09-09T00:14:13.717186420Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 00:14:13.746047 containerd[1638]: time="2025-09-09T00:14:13.745987954Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 00:14:13.754527 containerd[1638]: time="2025-09-09T00:14:13.746861832Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 5.756479138s" Sep 9 00:14:13.754527 containerd[1638]: time="2025-09-09T00:14:13.746889956Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" Sep 9 00:14:15.718535 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 00:14:15.718835 systemd[1]: kubelet.service: Consumed 104ms CPU time, 110.3M memory peak. Sep 9 00:14:15.720392 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 00:14:15.738475 systemd[1]: Reload requested from client PID 2435 ('systemctl') (unit session-9.scope)... 
Sep 9 00:14:15.738485 systemd[1]: Reloading... Sep 9 00:14:15.808138 zram_generator::config[2482]: No configuration found. Sep 9 00:14:15.868745 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 9 00:14:15.880433 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Sep 9 00:14:15.951652 systemd[1]: Reloading finished in 212 ms. Sep 9 00:14:15.977398 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Sep 9 00:14:15.977474 systemd[1]: kubelet.service: Failed with result 'signal'. Sep 9 00:14:15.977752 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 00:14:15.979216 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 00:14:16.543708 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 00:14:16.549295 (kubelet)[2546]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 9 00:14:16.617079 kubelet[2546]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 00:14:16.617604 kubelet[2546]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 9 00:14:16.617641 kubelet[2546]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 9 00:14:16.617726 kubelet[2546]: I0909 00:14:16.617710 2546 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 9 00:14:16.885939 kubelet[2546]: I0909 00:14:16.885695 2546 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Sep 9 00:14:16.885939 kubelet[2546]: I0909 00:14:16.885718 2546 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 9 00:14:16.885939 kubelet[2546]: I0909 00:14:16.885879 2546 server.go:954] "Client rotation is on, will bootstrap in background" Sep 9 00:14:16.925379 kubelet[2546]: E0909 00:14:16.925345 2546 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://139.178.70.104:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.70.104:6443: connect: connection refused" logger="UnhandledError" Sep 9 00:14:16.928827 kubelet[2546]: I0909 00:14:16.928716 2546 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 9 00:14:16.937753 kubelet[2546]: I0909 00:14:16.937741 2546 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 9 00:14:16.942589 kubelet[2546]: I0909 00:14:16.942541 2546 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 9 00:14:16.944590 kubelet[2546]: I0909 00:14:16.944381 2546 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 9 00:14:16.944590 kubelet[2546]: I0909 00:14:16.944417 2546 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 9 00:14:16.946085 kubelet[2546]: I0909 00:14:16.946075 2546 topology_manager.go:138] "Creating topology manager with none policy" Sep 
9 00:14:16.946162 kubelet[2546]: I0909 00:14:16.946155 2546 container_manager_linux.go:304] "Creating device plugin manager" Sep 9 00:14:16.947271 kubelet[2546]: I0909 00:14:16.947263 2546 state_mem.go:36] "Initialized new in-memory state store" Sep 9 00:14:16.952231 kubelet[2546]: I0909 00:14:16.952214 2546 kubelet.go:446] "Attempting to sync node with API server" Sep 9 00:14:16.952327 kubelet[2546]: I0909 00:14:16.952319 2546 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 9 00:14:16.954386 kubelet[2546]: I0909 00:14:16.954375 2546 kubelet.go:352] "Adding apiserver pod source" Sep 9 00:14:16.954492 kubelet[2546]: I0909 00:14:16.954442 2546 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 9 00:14:16.960459 kubelet[2546]: W0909 00:14:16.960408 2546 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.104:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Sep 9 00:14:16.960856 kubelet[2546]: E0909 00:14:16.960446 2546 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.104:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.104:6443: connect: connection refused" logger="UnhandledError" Sep 9 00:14:16.960895 kubelet[2546]: I0909 00:14:16.960878 2546 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" Sep 9 00:14:16.964480 kubelet[2546]: I0909 00:14:16.964461 2546 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 9 00:14:16.964540 kubelet[2546]: W0909 00:14:16.964514 2546 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not 
exist. Recreating. Sep 9 00:14:16.976795 kubelet[2546]: I0909 00:14:16.976777 2546 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 9 00:14:16.976900 kubelet[2546]: I0909 00:14:16.976895 2546 server.go:1287] "Started kubelet" Sep 9 00:14:16.977798 kubelet[2546]: W0909 00:14:16.977761 2546 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.104:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Sep 9 00:14:17.014210 kubelet[2546]: E0909 00:14:17.014184 2546 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.104:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.104:6443: connect: connection refused" logger="UnhandledError" Sep 9 00:14:17.014210 kubelet[2546]: I0909 00:14:17.012577 2546 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 9 00:14:17.017660 kubelet[2546]: I0909 00:14:17.017406 2546 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 9 00:14:17.017660 kubelet[2546]: I0909 00:14:17.017598 2546 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 9 00:14:17.020452 kubelet[2546]: I0909 00:14:16.978001 2546 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Sep 9 00:14:17.021163 kubelet[2546]: I0909 00:14:17.021148 2546 server.go:479] "Adding debug handlers to kubelet server" Sep 9 00:14:17.021685 kubelet[2546]: I0909 00:14:17.012638 2546 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 9 00:14:17.021740 kubelet[2546]: I0909 00:14:17.021729 2546 volume_manager.go:297] 
"Starting Kubelet Volume Manager" Sep 9 00:14:17.021877 kubelet[2546]: E0909 00:14:17.021863 2546 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 9 00:14:17.025084 kubelet[2546]: I0909 00:14:17.025067 2546 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 9 00:14:17.025150 kubelet[2546]: I0909 00:14:17.025139 2546 reconciler.go:26] "Reconciler: start to sync state" Sep 9 00:14:17.030820 kubelet[2546]: E0909 00:14:17.030724 2546 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.104:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.104:6443: connect: connection refused" interval="200ms" Sep 9 00:14:17.045877 kubelet[2546]: E0909 00:14:17.030924 2546 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.104:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.104:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.186374f0c09975cf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-09-09 00:14:16.976881103 +0000 UTC m=+0.425508427,LastTimestamp:2025-09-09 00:14:16.976881103 +0000 UTC m=+0.425508427,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Sep 9 00:14:17.046699 kubelet[2546]: W0909 00:14:17.045973 2546 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.104:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Sep 9 
00:14:17.046699 kubelet[2546]: E0909 00:14:17.046008 2546 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.104:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.104:6443: connect: connection refused" logger="UnhandledError" Sep 9 00:14:17.046699 kubelet[2546]: I0909 00:14:17.046236 2546 factory.go:221] Registration of the systemd container factory successfully Sep 9 00:14:17.046699 kubelet[2546]: I0909 00:14:17.046285 2546 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 9 00:14:17.049875 kubelet[2546]: I0909 00:14:17.049856 2546 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 9 00:14:17.050644 kubelet[2546]: I0909 00:14:17.050630 2546 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 9 00:14:17.050677 kubelet[2546]: I0909 00:14:17.050659 2546 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 9 00:14:17.050677 kubelet[2546]: I0909 00:14:17.050673 2546 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Sep 9 00:14:17.050677 kubelet[2546]: I0909 00:14:17.050677 2546 kubelet.go:2382] "Starting kubelet main sync loop" Sep 9 00:14:17.050804 kubelet[2546]: E0909 00:14:17.050701 2546 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 9 00:14:17.060876 kubelet[2546]: I0909 00:14:17.060855 2546 factory.go:221] Registration of the containerd container factory successfully Sep 9 00:14:17.076005 kubelet[2546]: W0909 00:14:17.075921 2546 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.104:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Sep 9 00:14:17.076005 kubelet[2546]: E0909 00:14:17.075965 2546 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.104:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.104:6443: connect: connection refused" logger="UnhandledError" Sep 9 00:14:17.098597 kubelet[2546]: I0909 00:14:17.098554 2546 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 9 00:14:17.098597 kubelet[2546]: I0909 00:14:17.098566 2546 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 9 00:14:17.098597 kubelet[2546]: I0909 00:14:17.098576 2546 state_mem.go:36] "Initialized new in-memory state store" Sep 9 00:14:17.122937 kubelet[2546]: E0909 00:14:17.122912 2546 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 9 00:14:17.147160 kubelet[2546]: I0909 00:14:17.146502 2546 policy_none.go:49] "None policy: Start" Sep 9 00:14:17.147160 kubelet[2546]: I0909 00:14:17.146531 2546 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 9 00:14:17.147160 kubelet[2546]: 
I0909 00:14:17.146542 2546 state_mem.go:35] "Initializing new in-memory state store" Sep 9 00:14:17.151210 kubelet[2546]: E0909 00:14:17.151194 2546 kubelet.go:2406] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Sep 9 00:14:17.170668 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 9 00:14:17.178205 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 9 00:14:17.180596 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 9 00:14:17.322455 kubelet[2546]: I0909 00:14:17.185558 2546 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 9 00:14:17.322455 kubelet[2546]: I0909 00:14:17.185667 2546 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 9 00:14:17.322455 kubelet[2546]: I0909 00:14:17.185673 2546 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 9 00:14:17.322455 kubelet[2546]: I0909 00:14:17.185990 2546 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 9 00:14:17.322455 kubelet[2546]: E0909 00:14:17.186650 2546 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Sep 9 00:14:17.322455 kubelet[2546]: E0909 00:14:17.186668 2546 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Sep 9 00:14:17.322455 kubelet[2546]: E0909 00:14:17.231640 2546 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.104:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.104:6443: connect: connection refused" interval="400ms" Sep 9 00:14:17.322455 kubelet[2546]: I0909 00:14:17.286667 2546 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 9 00:14:17.322455 kubelet[2546]: E0909 00:14:17.286888 2546 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.104:6443/api/v1/nodes\": dial tcp 139.178.70.104:6443: connect: connection refused" node="localhost" Sep 9 00:14:17.370811 systemd[1]: Created slice kubepods-burstable-pod1b664c63a6b959c5a3525fce1b6e62fd.slice - libcontainer container kubepods-burstable-pod1b664c63a6b959c5a3525fce1b6e62fd.slice. Sep 9 00:14:17.387226 kubelet[2546]: E0909 00:14:17.387206 2546 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 9 00:14:17.391112 systemd[1]: Created slice kubepods-burstable-poda88c9297c136b0f15880bf567e89a977.slice - libcontainer container kubepods-burstable-poda88c9297c136b0f15880bf567e89a977.slice. Sep 9 00:14:17.402909 kubelet[2546]: E0909 00:14:17.402405 2546 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 9 00:14:17.406143 systemd[1]: Created slice kubepods-burstable-poda9176403b596d0b29ae8ad12d635226d.slice - libcontainer container kubepods-burstable-poda9176403b596d0b29ae8ad12d635226d.slice. 
Sep 9 00:14:17.408059 kubelet[2546]: E0909 00:14:17.408037 2546 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 9 00:14:17.427423 kubelet[2546]: I0909 00:14:17.427394 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 00:14:17.427423 kubelet[2546]: I0909 00:14:17.427420 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 00:14:17.427423 kubelet[2546]: I0909 00:14:17.427435 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 00:14:17.427565 kubelet[2546]: I0909 00:14:17.427446 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 00:14:17.427565 kubelet[2546]: I0909 00:14:17.427455 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 00:14:17.427565 kubelet[2546]: I0909 00:14:17.427465 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a9176403b596d0b29ae8ad12d635226d-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"a9176403b596d0b29ae8ad12d635226d\") " pod="kube-system/kube-scheduler-localhost" Sep 9 00:14:17.427565 kubelet[2546]: I0909 00:14:17.427472 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1b664c63a6b959c5a3525fce1b6e62fd-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"1b664c63a6b959c5a3525fce1b6e62fd\") " pod="kube-system/kube-apiserver-localhost" Sep 9 00:14:17.427565 kubelet[2546]: I0909 00:14:17.427480 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1b664c63a6b959c5a3525fce1b6e62fd-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"1b664c63a6b959c5a3525fce1b6e62fd\") " pod="kube-system/kube-apiserver-localhost" Sep 9 00:14:17.427657 kubelet[2546]: I0909 00:14:17.427490 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1b664c63a6b959c5a3525fce1b6e62fd-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"1b664c63a6b959c5a3525fce1b6e62fd\") " pod="kube-system/kube-apiserver-localhost" Sep 9 00:14:17.488732 kubelet[2546]: I0909 00:14:17.488708 2546 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 9 00:14:17.496215 kubelet[2546]: 
E0909 00:14:17.488959 2546 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.104:6443/api/v1/nodes\": dial tcp 139.178.70.104:6443: connect: connection refused" node="localhost" Sep 9 00:14:17.632424 kubelet[2546]: E0909 00:14:17.632390 2546 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.104:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.104:6443: connect: connection refused" interval="800ms" Sep 9 00:14:17.688041 containerd[1638]: time="2025-09-09T00:14:17.687953714Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:1b664c63a6b959c5a3525fce1b6e62fd,Namespace:kube-system,Attempt:0,}" Sep 9 00:14:17.703757 containerd[1638]: time="2025-09-09T00:14:17.703729596Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:a88c9297c136b0f15880bf567e89a977,Namespace:kube-system,Attempt:0,}" Sep 9 00:14:17.717557 containerd[1638]: time="2025-09-09T00:14:17.717330104Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:a9176403b596d0b29ae8ad12d635226d,Namespace:kube-system,Attempt:0,}" Sep 9 00:14:17.781609 containerd[1638]: time="2025-09-09T00:14:17.781559819Z" level=info msg="connecting to shim 0cf51971a222a16020dd9e3066cd0f926bcc63a5f090e66e337f65aa4393bc28" address="unix:///run/containerd/s/c712b8d946b1375b006e1bcfb77093c933f20b3dda7c98ddf98431fbd78b886b" namespace=k8s.io protocol=ttrpc version=3 Sep 9 00:14:17.783004 containerd[1638]: time="2025-09-09T00:14:17.782649415Z" level=info msg="connecting to shim 862b3fbadd0f25b4d22ca991756e4570b59a2929570bb9ecbf8b28d1e84360a6" address="unix:///run/containerd/s/40303e7bb44c60f7db4224536cc418a580884f004bfe58abaf0932e070211252" namespace=k8s.io protocol=ttrpc version=3 Sep 9 00:14:17.784855 containerd[1638]: time="2025-09-09T00:14:17.784833112Z" level=info 
msg="connecting to shim 7fb35fd4e114589274c1fd9155fd34e937642ff0275d4e9801c960529f91d873" address="unix:///run/containerd/s/18e03fea5bce8354722b112c137d0689ce2c80ab967288a872093e234238f90a" namespace=k8s.io protocol=ttrpc version=3 Sep 9 00:14:17.891409 kubelet[2546]: I0909 00:14:17.891395 2546 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 9 00:14:17.891753 kubelet[2546]: E0909 00:14:17.891740 2546 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.104:6443/api/v1/nodes\": dial tcp 139.178.70.104:6443: connect: connection refused" node="localhost" Sep 9 00:14:17.910215 systemd[1]: Started cri-containerd-0cf51971a222a16020dd9e3066cd0f926bcc63a5f090e66e337f65aa4393bc28.scope - libcontainer container 0cf51971a222a16020dd9e3066cd0f926bcc63a5f090e66e337f65aa4393bc28. Sep 9 00:14:17.911802 systemd[1]: Started cri-containerd-7fb35fd4e114589274c1fd9155fd34e937642ff0275d4e9801c960529f91d873.scope - libcontainer container 7fb35fd4e114589274c1fd9155fd34e937642ff0275d4e9801c960529f91d873. Sep 9 00:14:17.913453 systemd[1]: Started cri-containerd-862b3fbadd0f25b4d22ca991756e4570b59a2929570bb9ecbf8b28d1e84360a6.scope - libcontainer container 862b3fbadd0f25b4d22ca991756e4570b59a2929570bb9ecbf8b28d1e84360a6. 
Sep 9 00:14:17.935981 kubelet[2546]: W0909 00:14:17.935677 2546 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.104:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Sep 9 00:14:17.935981 kubelet[2546]: E0909 00:14:17.935726 2546 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.104:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.104:6443: connect: connection refused" logger="UnhandledError" Sep 9 00:14:17.969289 containerd[1638]: time="2025-09-09T00:14:17.968977487Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:1b664c63a6b959c5a3525fce1b6e62fd,Namespace:kube-system,Attempt:0,} returns sandbox id \"862b3fbadd0f25b4d22ca991756e4570b59a2929570bb9ecbf8b28d1e84360a6\"" Sep 9 00:14:17.973395 containerd[1638]: time="2025-09-09T00:14:17.973345254Z" level=info msg="CreateContainer within sandbox \"862b3fbadd0f25b4d22ca991756e4570b59a2929570bb9ecbf8b28d1e84360a6\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 9 00:14:17.980677 containerd[1638]: time="2025-09-09T00:14:17.980615927Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:a88c9297c136b0f15880bf567e89a977,Namespace:kube-system,Attempt:0,} returns sandbox id \"0cf51971a222a16020dd9e3066cd0f926bcc63a5f090e66e337f65aa4393bc28\"" Sep 9 00:14:17.982456 containerd[1638]: time="2025-09-09T00:14:17.982431871Z" level=info msg="CreateContainer within sandbox \"0cf51971a222a16020dd9e3066cd0f926bcc63a5f090e66e337f65aa4393bc28\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 9 00:14:17.982588 containerd[1638]: time="2025-09-09T00:14:17.982542697Z" level=info 
msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:a9176403b596d0b29ae8ad12d635226d,Namespace:kube-system,Attempt:0,} returns sandbox id \"7fb35fd4e114589274c1fd9155fd34e937642ff0275d4e9801c960529f91d873\"" Sep 9 00:14:17.984201 containerd[1638]: time="2025-09-09T00:14:17.984144279Z" level=info msg="Container fee514e41b4d07c2456d8db805e08f41b7c6283c1e7d2b9beb2b14b0c83d5cdb: CDI devices from CRI Config.CDIDevices: []" Sep 9 00:14:17.985281 containerd[1638]: time="2025-09-09T00:14:17.984864938Z" level=info msg="CreateContainer within sandbox \"7fb35fd4e114589274c1fd9155fd34e937642ff0275d4e9801c960529f91d873\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 9 00:14:17.988876 containerd[1638]: time="2025-09-09T00:14:17.988838742Z" level=info msg="Container 475a873e9be39b43366ce4b446641f71174b1abab30494b2c77522fbeee77043: CDI devices from CRI Config.CDIDevices: []" Sep 9 00:14:17.990067 containerd[1638]: time="2025-09-09T00:14:17.990054236Z" level=info msg="CreateContainer within sandbox \"862b3fbadd0f25b4d22ca991756e4570b59a2929570bb9ecbf8b28d1e84360a6\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"fee514e41b4d07c2456d8db805e08f41b7c6283c1e7d2b9beb2b14b0c83d5cdb\"" Sep 9 00:14:17.990608 containerd[1638]: time="2025-09-09T00:14:17.990593419Z" level=info msg="StartContainer for \"fee514e41b4d07c2456d8db805e08f41b7c6283c1e7d2b9beb2b14b0c83d5cdb\"" Sep 9 00:14:17.991283 containerd[1638]: time="2025-09-09T00:14:17.991266965Z" level=info msg="connecting to shim fee514e41b4d07c2456d8db805e08f41b7c6283c1e7d2b9beb2b14b0c83d5cdb" address="unix:///run/containerd/s/40303e7bb44c60f7db4224536cc418a580884f004bfe58abaf0932e070211252" protocol=ttrpc version=3 Sep 9 00:14:17.996853 containerd[1638]: time="2025-09-09T00:14:17.996788176Z" level=info msg="CreateContainer within sandbox \"0cf51971a222a16020dd9e3066cd0f926bcc63a5f090e66e337f65aa4393bc28\" for 
&ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"475a873e9be39b43366ce4b446641f71174b1abab30494b2c77522fbeee77043\"" Sep 9 00:14:17.997232 containerd[1638]: time="2025-09-09T00:14:17.997200169Z" level=info msg="StartContainer for \"475a873e9be39b43366ce4b446641f71174b1abab30494b2c77522fbeee77043\"" Sep 9 00:14:17.997865 containerd[1638]: time="2025-09-09T00:14:17.997535774Z" level=info msg="Container e80c2c108194dd5339a97e9760ad8666cd356127e9201089af682e2e3b305fef: CDI devices from CRI Config.CDIDevices: []" Sep 9 00:14:17.997865 containerd[1638]: time="2025-09-09T00:14:17.997821751Z" level=info msg="connecting to shim 475a873e9be39b43366ce4b446641f71174b1abab30494b2c77522fbeee77043" address="unix:///run/containerd/s/c712b8d946b1375b006e1bcfb77093c933f20b3dda7c98ddf98431fbd78b886b" protocol=ttrpc version=3 Sep 9 00:14:18.001105 containerd[1638]: time="2025-09-09T00:14:18.001087473Z" level=info msg="CreateContainer within sandbox \"7fb35fd4e114589274c1fd9155fd34e937642ff0275d4e9801c960529f91d873\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"e80c2c108194dd5339a97e9760ad8666cd356127e9201089af682e2e3b305fef\"" Sep 9 00:14:18.001987 containerd[1638]: time="2025-09-09T00:14:18.001972207Z" level=info msg="StartContainer for \"e80c2c108194dd5339a97e9760ad8666cd356127e9201089af682e2e3b305fef\"" Sep 9 00:14:18.003718 containerd[1638]: time="2025-09-09T00:14:18.003698466Z" level=info msg="connecting to shim e80c2c108194dd5339a97e9760ad8666cd356127e9201089af682e2e3b305fef" address="unix:///run/containerd/s/18e03fea5bce8354722b112c137d0689ce2c80ab967288a872093e234238f90a" protocol=ttrpc version=3 Sep 9 00:14:18.009226 systemd[1]: Started cri-containerd-fee514e41b4d07c2456d8db805e08f41b7c6283c1e7d2b9beb2b14b0c83d5cdb.scope - libcontainer container fee514e41b4d07c2456d8db805e08f41b7c6283c1e7d2b9beb2b14b0c83d5cdb. 
Sep 9 00:14:18.016224 systemd[1]: Started cri-containerd-475a873e9be39b43366ce4b446641f71174b1abab30494b2c77522fbeee77043.scope - libcontainer container 475a873e9be39b43366ce4b446641f71174b1abab30494b2c77522fbeee77043. Sep 9 00:14:18.021075 systemd[1]: Started cri-containerd-e80c2c108194dd5339a97e9760ad8666cd356127e9201089af682e2e3b305fef.scope - libcontainer container e80c2c108194dd5339a97e9760ad8666cd356127e9201089af682e2e3b305fef. Sep 9 00:14:18.080013 kubelet[2546]: W0909 00:14:18.079880 2546 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.104:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Sep 9 00:14:18.080013 kubelet[2546]: E0909 00:14:18.079958 2546 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.104:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.104:6443: connect: connection refused" logger="UnhandledError" Sep 9 00:14:18.086422 containerd[1638]: time="2025-09-09T00:14:18.086396718Z" level=info msg="StartContainer for \"fee514e41b4d07c2456d8db805e08f41b7c6283c1e7d2b9beb2b14b0c83d5cdb\" returns successfully" Sep 9 00:14:18.094640 containerd[1638]: time="2025-09-09T00:14:18.094578352Z" level=info msg="StartContainer for \"e80c2c108194dd5339a97e9760ad8666cd356127e9201089af682e2e3b305fef\" returns successfully" Sep 9 00:14:18.095123 containerd[1638]: time="2025-09-09T00:14:18.095017125Z" level=info msg="StartContainer for \"475a873e9be39b43366ce4b446641f71174b1abab30494b2c77522fbeee77043\" returns successfully" Sep 9 00:14:18.110241 kubelet[2546]: E0909 00:14:18.110223 2546 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 9 00:14:18.113309 
kubelet[2546]: E0909 00:14:18.113237 2546 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 9 00:14:18.115307 kubelet[2546]: E0909 00:14:18.115298 2546 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 9 00:14:18.286920 kubelet[2546]: W0909 00:14:18.286864 2546 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.104:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Sep 9 00:14:18.286920 kubelet[2546]: E0909 00:14:18.286905 2546 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.104:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.104:6443: connect: connection refused" logger="UnhandledError" Sep 9 00:14:18.359128 kubelet[2546]: W0909 00:14:18.359085 2546 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.104:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Sep 9 00:14:18.359260 kubelet[2546]: E0909 00:14:18.359242 2546 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.104:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.104:6443: connect: connection refused" logger="UnhandledError" Sep 9 00:14:18.435185 kubelet[2546]: E0909 00:14:18.433111 2546 controller.go:145] "Failed to ensure lease exists, will retry" 
err="Get \"https://139.178.70.104:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.104:6443: connect: connection refused" interval="1.6s" Sep 9 00:14:18.693263 kubelet[2546]: I0909 00:14:18.693191 2546 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 9 00:14:18.693777 kubelet[2546]: E0909 00:14:18.693763 2546 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.104:6443/api/v1/nodes\": dial tcp 139.178.70.104:6443: connect: connection refused" node="localhost" Sep 9 00:14:19.014815 kubelet[2546]: E0909 00:14:19.014739 2546 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://139.178.70.104:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.70.104:6443: connect: connection refused" logger="UnhandledError" Sep 9 00:14:19.117261 kubelet[2546]: E0909 00:14:19.117241 2546 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 9 00:14:19.117676 kubelet[2546]: E0909 00:14:19.117669 2546 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 9 00:14:20.037857 kubelet[2546]: E0909 00:14:20.037823 2546 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Sep 9 00:14:20.213797 kubelet[2546]: E0909 00:14:20.213702 2546 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 9 00:14:20.297932 kubelet[2546]: I0909 00:14:20.296681 2546 kubelet_node_status.go:75] "Attempting 
to register node" node="localhost" Sep 9 00:14:20.339255 kubelet[2546]: I0909 00:14:20.339235 2546 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Sep 9 00:14:20.339363 kubelet[2546]: E0909 00:14:20.339356 2546 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found" Sep 9 00:14:20.350368 kubelet[2546]: E0909 00:14:20.350351 2546 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 9 00:14:20.451101 kubelet[2546]: E0909 00:14:20.451075 2546 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 9 00:14:20.551999 kubelet[2546]: E0909 00:14:20.551923 2546 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 9 00:14:20.652782 kubelet[2546]: E0909 00:14:20.652761 2546 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 9 00:14:20.753424 kubelet[2546]: E0909 00:14:20.753398 2546 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 9 00:14:20.854207 kubelet[2546]: E0909 00:14:20.854113 2546 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 9 00:14:20.912660 kubelet[2546]: E0909 00:14:20.912631 2546 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 9 00:14:20.954397 kubelet[2546]: E0909 00:14:20.954350 2546 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 9 00:14:21.055452 kubelet[2546]: E0909 00:14:21.055424 2546 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 9 00:14:21.222515 
kubelet[2546]: I0909 00:14:21.222432 2546 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 9 00:14:21.229446 kubelet[2546]: I0909 00:14:21.229426 2546 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 9 00:14:21.233526 kubelet[2546]: I0909 00:14:21.233504 2546 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 9 00:14:21.658522 systemd[1]: Reload requested from client PID 2813 ('systemctl') (unit session-9.scope)... Sep 9 00:14:21.658697 systemd[1]: Reloading... Sep 9 00:14:21.716142 zram_generator::config[2863]: No configuration found. Sep 9 00:14:21.777618 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 9 00:14:21.786255 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Sep 9 00:14:21.862810 systemd[1]: Reloading finished in 203 ms. Sep 9 00:14:21.881876 kubelet[2546]: I0909 00:14:21.881856 2546 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 9 00:14:21.882138 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 00:14:21.897034 systemd[1]: kubelet.service: Deactivated successfully. Sep 9 00:14:21.897238 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 00:14:21.897277 systemd[1]: kubelet.service: Consumed 516ms CPU time, 126.6M memory peak. Sep 9 00:14:21.898520 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 00:14:22.671921 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 9 00:14:22.677324 (kubelet)[2924]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 9 00:14:22.758859 kubelet[2924]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 00:14:22.758859 kubelet[2924]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 9 00:14:22.758859 kubelet[2924]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 00:14:22.758859 kubelet[2924]: I0909 00:14:22.758782 2924 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 9 00:14:22.764912 kubelet[2924]: I0909 00:14:22.764483 2924 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Sep 9 00:14:22.764912 kubelet[2924]: I0909 00:14:22.764500 2924 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 9 00:14:22.764912 kubelet[2924]: I0909 00:14:22.764675 2924 server.go:954] "Client rotation is on, will bootstrap in background" Sep 9 00:14:22.765464 kubelet[2924]: I0909 00:14:22.765451 2924 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Sep 9 00:14:22.767373 kubelet[2924]: I0909 00:14:22.766992 2924 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 9 00:14:22.769598 kubelet[2924]: I0909 00:14:22.769579 2924 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 9 00:14:22.772078 kubelet[2924]: I0909 00:14:22.771953 2924 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Sep 9 00:14:22.776017 kubelet[2924]: I0909 00:14:22.775691 2924 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 9 00:14:22.776017 kubelet[2924]: I0909 00:14:22.775727 2924 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPol
icyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 9 00:14:22.776017 kubelet[2924]: I0909 00:14:22.775885 2924 topology_manager.go:138] "Creating topology manager with none policy" Sep 9 00:14:22.776017 kubelet[2924]: I0909 00:14:22.775893 2924 container_manager_linux.go:304] "Creating device plugin manager" Sep 9 00:14:22.776192 kubelet[2924]: I0909 00:14:22.775932 2924 state_mem.go:36] "Initialized new in-memory state store" Sep 9 00:14:22.776192 kubelet[2924]: I0909 00:14:22.776070 2924 kubelet.go:446] "Attempting to sync node with API server" Sep 9 00:14:22.776192 kubelet[2924]: I0909 00:14:22.776083 2924 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 9 00:14:22.776192 kubelet[2924]: I0909 00:14:22.776096 2924 kubelet.go:352] "Adding apiserver pod source" Sep 9 00:14:22.776192 kubelet[2924]: I0909 00:14:22.776102 2924 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 9 00:14:22.777872 kubelet[2924]: I0909 00:14:22.777776 2924 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" Sep 9 00:14:22.780038 kubelet[2924]: I0909 00:14:22.780027 2924 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 9 00:14:22.780330 kubelet[2924]: I0909 00:14:22.780323 2924 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 9 00:14:22.780385 kubelet[2924]: I0909 00:14:22.780381 2924 server.go:1287] "Started kubelet" Sep 9 00:14:22.784340 kubelet[2924]: I0909 00:14:22.783770 2924 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 9 00:14:22.800845 kubelet[2924]: I0909 00:14:22.800821 2924 server.go:169] 
"Starting to listen" address="0.0.0.0" port=10250 Sep 9 00:14:22.803065 kubelet[2924]: I0909 00:14:22.802665 2924 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 9 00:14:22.803065 kubelet[2924]: I0909 00:14:22.802838 2924 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 9 00:14:22.803065 kubelet[2924]: I0909 00:14:22.802958 2924 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 9 00:14:22.803497 kubelet[2924]: I0909 00:14:22.803364 2924 server.go:479] "Adding debug handlers to kubelet server" Sep 9 00:14:22.805529 kubelet[2924]: I0909 00:14:22.805432 2924 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 9 00:14:22.805529 kubelet[2924]: I0909 00:14:22.805487 2924 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 9 00:14:22.805625 kubelet[2924]: I0909 00:14:22.805539 2924 reconciler.go:26] "Reconciler: start to sync state" Sep 9 00:14:22.807815 kubelet[2924]: E0909 00:14:22.807305 2924 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 9 00:14:22.809104 kubelet[2924]: I0909 00:14:22.809084 2924 factory.go:221] Registration of the containerd container factory successfully Sep 9 00:14:22.809104 kubelet[2924]: I0909 00:14:22.809096 2924 factory.go:221] Registration of the systemd container factory successfully Sep 9 00:14:22.809228 kubelet[2924]: I0909 00:14:22.809173 2924 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 9 00:14:22.813184 kubelet[2924]: I0909 00:14:22.812906 2924 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Sep 9 00:14:22.813648 kubelet[2924]: I0909 00:14:22.813629 2924 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 9 00:14:22.813648 kubelet[2924]: I0909 00:14:22.813648 2924 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 9 00:14:22.813705 kubelet[2924]: I0909 00:14:22.813661 2924 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 9 00:14:22.813705 kubelet[2924]: I0909 00:14:22.813668 2924 kubelet.go:2382] "Starting kubelet main sync loop" Sep 9 00:14:22.813743 kubelet[2924]: E0909 00:14:22.813697 2924 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 9 00:14:22.852831 kubelet[2924]: I0909 00:14:22.852787 2924 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 9 00:14:22.852831 kubelet[2924]: I0909 00:14:22.852799 2924 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 9 00:14:22.852831 kubelet[2924]: I0909 00:14:22.852811 2924 state_mem.go:36] "Initialized new in-memory state store" Sep 9 00:14:22.853046 kubelet[2924]: I0909 00:14:22.853038 2924 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 9 00:14:22.853096 kubelet[2924]: I0909 00:14:22.853083 2924 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 9 00:14:22.853160 kubelet[2924]: I0909 00:14:22.853155 2924 policy_none.go:49] "None policy: Start" Sep 9 00:14:22.853230 kubelet[2924]: I0909 00:14:22.853224 2924 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 9 00:14:22.853262 kubelet[2924]: I0909 00:14:22.853258 2924 state_mem.go:35] "Initializing new in-memory state store" Sep 9 00:14:22.853350 kubelet[2924]: I0909 00:14:22.853345 2924 state_mem.go:75] "Updated machine memory state" Sep 9 00:14:22.857254 kubelet[2924]: I0909 00:14:22.857240 2924 manager.go:519] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 9 00:14:22.857424 kubelet[2924]: I0909 00:14:22.857417 2924 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 9 00:14:22.857486 kubelet[2924]: I0909 00:14:22.857468 2924 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 9 00:14:22.858214 kubelet[2924]: I0909 00:14:22.858206 2924 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 9 00:14:22.863436 kubelet[2924]: E0909 00:14:22.863064 2924 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 9 00:14:22.914701 kubelet[2924]: I0909 00:14:22.914407 2924 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 9 00:14:22.917220 kubelet[2924]: I0909 00:14:22.916648 2924 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 9 00:14:22.917828 kubelet[2924]: I0909 00:14:22.916937 2924 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 9 00:14:22.918957 kubelet[2924]: E0909 00:14:22.918937 2924 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost" Sep 9 00:14:22.920455 kubelet[2924]: E0909 00:14:22.920407 2924 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Sep 9 00:14:22.920969 kubelet[2924]: E0909 00:14:22.920895 2924 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Sep 9 00:14:22.962595 kubelet[2924]: I0909 00:14:22.962071 2924 kubelet_node_status.go:75] "Attempting to register node" 
node="localhost" Sep 9 00:14:23.048407 kubelet[2924]: I0909 00:14:23.048387 2924 kubelet_node_status.go:124] "Node was previously registered" node="localhost" Sep 9 00:14:23.048624 kubelet[2924]: I0909 00:14:23.048542 2924 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Sep 9 00:14:23.107395 kubelet[2924]: I0909 00:14:23.107360 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a9176403b596d0b29ae8ad12d635226d-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"a9176403b596d0b29ae8ad12d635226d\") " pod="kube-system/kube-scheduler-localhost" Sep 9 00:14:23.107935 kubelet[2924]: I0909 00:14:23.107856 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1b664c63a6b959c5a3525fce1b6e62fd-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"1b664c63a6b959c5a3525fce1b6e62fd\") " pod="kube-system/kube-apiserver-localhost" Sep 9 00:14:23.107935 kubelet[2924]: I0909 00:14:23.107875 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1b664c63a6b959c5a3525fce1b6e62fd-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"1b664c63a6b959c5a3525fce1b6e62fd\") " pod="kube-system/kube-apiserver-localhost" Sep 9 00:14:23.107935 kubelet[2924]: I0909 00:14:23.107887 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 00:14:23.108142 kubelet[2924]: I0909 00:14:23.107997 2924 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 00:14:23.108142 kubelet[2924]: I0909 00:14:23.108013 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1b664c63a6b959c5a3525fce1b6e62fd-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"1b664c63a6b959c5a3525fce1b6e62fd\") " pod="kube-system/kube-apiserver-localhost" Sep 9 00:14:23.108142 kubelet[2924]: I0909 00:14:23.108023 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 00:14:23.108142 kubelet[2924]: I0909 00:14:23.108031 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 00:14:23.108142 kubelet[2924]: I0909 00:14:23.108040 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 00:14:23.784233 kubelet[2924]: I0909 00:14:23.784204 2924 
apiserver.go:52] "Watching apiserver" Sep 9 00:14:23.806001 kubelet[2924]: I0909 00:14:23.805969 2924 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 9 00:14:23.841654 kubelet[2924]: I0909 00:14:23.841434 2924 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 9 00:14:23.841654 kubelet[2924]: I0909 00:14:23.841580 2924 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 9 00:14:23.845763 kubelet[2924]: E0909 00:14:23.845744 2924 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost" Sep 9 00:14:23.847490 kubelet[2924]: E0909 00:14:23.847477 2924 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Sep 9 00:14:23.862051 kubelet[2924]: I0909 00:14:23.861954 2924 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=2.8619389699999997 podStartE2EDuration="2.86193897s" podCreationTimestamp="2025-09-09 00:14:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 00:14:23.856879676 +0000 UTC m=+1.135050434" watchObservedRunningTime="2025-09-09 00:14:23.86193897 +0000 UTC m=+1.140109728" Sep 9 00:14:23.869093 kubelet[2924]: I0909 00:14:23.868797 2924 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=2.868762613 podStartE2EDuration="2.868762613s" podCreationTimestamp="2025-09-09 00:14:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 00:14:23.862391928 +0000 
UTC m=+1.140562677" watchObservedRunningTime="2025-09-09 00:14:23.868762613 +0000 UTC m=+1.146933360" Sep 9 00:14:23.869355 kubelet[2924]: I0909 00:14:23.869258 2924 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=2.86924782 podStartE2EDuration="2.86924782s" podCreationTimestamp="2025-09-09 00:14:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 00:14:23.867975869 +0000 UTC m=+1.146146626" watchObservedRunningTime="2025-09-09 00:14:23.86924782 +0000 UTC m=+1.147418565" Sep 9 00:14:27.650096 kubelet[2924]: I0909 00:14:27.650072 2924 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 9 00:14:27.650509 kubelet[2924]: I0909 00:14:27.650370 2924 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 9 00:14:27.650534 containerd[1638]: time="2025-09-09T00:14:27.650287527Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 9 00:14:28.257268 systemd[1]: Created slice kubepods-besteffort-pod9ef812e8_2c6c_4bad_a18b_3601bce75fff.slice - libcontainer container kubepods-besteffort-pod9ef812e8_2c6c_4bad_a18b_3601bce75fff.slice. 
Sep 9 00:14:28.341195 kubelet[2924]: I0909 00:14:28.341050 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/9ef812e8-2c6c-4bad-a18b-3601bce75fff-kube-proxy\") pod \"kube-proxy-489vk\" (UID: \"9ef812e8-2c6c-4bad-a18b-3601bce75fff\") " pod="kube-system/kube-proxy-489vk"
Sep 9 00:14:28.341195 kubelet[2924]: I0909 00:14:28.341078 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9ef812e8-2c6c-4bad-a18b-3601bce75fff-lib-modules\") pod \"kube-proxy-489vk\" (UID: \"9ef812e8-2c6c-4bad-a18b-3601bce75fff\") " pod="kube-system/kube-proxy-489vk"
Sep 9 00:14:28.341195 kubelet[2924]: I0909 00:14:28.341090 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/9ef812e8-2c6c-4bad-a18b-3601bce75fff-xtables-lock\") pod \"kube-proxy-489vk\" (UID: \"9ef812e8-2c6c-4bad-a18b-3601bce75fff\") " pod="kube-system/kube-proxy-489vk"
Sep 9 00:14:28.341427 kubelet[2924]: I0909 00:14:28.341101 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp5lp\" (UniqueName: \"kubernetes.io/projected/9ef812e8-2c6c-4bad-a18b-3601bce75fff-kube-api-access-pp5lp\") pod \"kube-proxy-489vk\" (UID: \"9ef812e8-2c6c-4bad-a18b-3601bce75fff\") " pod="kube-system/kube-proxy-489vk"
Sep 9 00:14:28.578293 containerd[1638]: time="2025-09-09T00:14:28.578265273Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-489vk,Uid:9ef812e8-2c6c-4bad-a18b-3601bce75fff,Namespace:kube-system,Attempt:0,}"
Sep 9 00:14:28.592417 containerd[1638]: time="2025-09-09T00:14:28.592353748Z" level=info msg="connecting to shim bb4b9cd7289df57bb2cec586cac5b591b73baadea58b18859a11096bb356e013" address="unix:///run/containerd/s/05e5e374390ded914da96dfd961daadb1d1a15e9b8599aeb51aae2a2e6e3a991" namespace=k8s.io protocol=ttrpc version=3
Sep 9 00:14:28.617251 systemd[1]: Started cri-containerd-bb4b9cd7289df57bb2cec586cac5b591b73baadea58b18859a11096bb356e013.scope - libcontainer container bb4b9cd7289df57bb2cec586cac5b591b73baadea58b18859a11096bb356e013.
Sep 9 00:14:28.637978 containerd[1638]: time="2025-09-09T00:14:28.637953408Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-489vk,Uid:9ef812e8-2c6c-4bad-a18b-3601bce75fff,Namespace:kube-system,Attempt:0,} returns sandbox id \"bb4b9cd7289df57bb2cec586cac5b591b73baadea58b18859a11096bb356e013\""
Sep 9 00:14:28.640124 containerd[1638]: time="2025-09-09T00:14:28.640105281Z" level=info msg="CreateContainer within sandbox \"bb4b9cd7289df57bb2cec586cac5b591b73baadea58b18859a11096bb356e013\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Sep 9 00:14:28.671025 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2357829934.mount: Deactivated successfully.
Sep 9 00:14:28.674711 containerd[1638]: time="2025-09-09T00:14:28.674681398Z" level=info msg="Container 91c461f4e8e5e0e56326df0b23c26e7c0a29c8200af65f37e43aec6286c7d975: CDI devices from CRI Config.CDIDevices: []" Sep 9 00:14:28.692993 containerd[1638]: time="2025-09-09T00:14:28.692949273Z" level=info msg="CreateContainer within sandbox \"bb4b9cd7289df57bb2cec586cac5b591b73baadea58b18859a11096bb356e013\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"91c461f4e8e5e0e56326df0b23c26e7c0a29c8200af65f37e43aec6286c7d975\"" Sep 9 00:14:28.694435 containerd[1638]: time="2025-09-09T00:14:28.694374806Z" level=info msg="StartContainer for \"91c461f4e8e5e0e56326df0b23c26e7c0a29c8200af65f37e43aec6286c7d975\"" Sep 9 00:14:28.695353 containerd[1638]: time="2025-09-09T00:14:28.695337537Z" level=info msg="connecting to shim 91c461f4e8e5e0e56326df0b23c26e7c0a29c8200af65f37e43aec6286c7d975" address="unix:///run/containerd/s/05e5e374390ded914da96dfd961daadb1d1a15e9b8599aeb51aae2a2e6e3a991" protocol=ttrpc version=3 Sep 9 00:14:28.708613 systemd[1]: Created slice kubepods-besteffort-podbc3ab54c_fd59_4707_9850_e79bfeff6c79.slice - libcontainer container kubepods-besteffort-podbc3ab54c_fd59_4707_9850_e79bfeff6c79.slice. Sep 9 00:14:28.725312 systemd[1]: Started cri-containerd-91c461f4e8e5e0e56326df0b23c26e7c0a29c8200af65f37e43aec6286c7d975.scope - libcontainer container 91c461f4e8e5e0e56326df0b23c26e7c0a29c8200af65f37e43aec6286c7d975. 
Sep 9 00:14:28.743495 kubelet[2924]: I0909 00:14:28.743442 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2v74\" (UniqueName: \"kubernetes.io/projected/bc3ab54c-fd59-4707-9850-e79bfeff6c79-kube-api-access-c2v74\") pod \"tigera-operator-755d956888-qncs2\" (UID: \"bc3ab54c-fd59-4707-9850-e79bfeff6c79\") " pod="tigera-operator/tigera-operator-755d956888-qncs2" Sep 9 00:14:28.743495 kubelet[2924]: I0909 00:14:28.743466 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/bc3ab54c-fd59-4707-9850-e79bfeff6c79-var-lib-calico\") pod \"tigera-operator-755d956888-qncs2\" (UID: \"bc3ab54c-fd59-4707-9850-e79bfeff6c79\") " pod="tigera-operator/tigera-operator-755d956888-qncs2" Sep 9 00:14:28.753287 containerd[1638]: time="2025-09-09T00:14:28.753265988Z" level=info msg="StartContainer for \"91c461f4e8e5e0e56326df0b23c26e7c0a29c8200af65f37e43aec6286c7d975\" returns successfully" Sep 9 00:14:29.013918 containerd[1638]: time="2025-09-09T00:14:29.013814091Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-qncs2,Uid:bc3ab54c-fd59-4707-9850-e79bfeff6c79,Namespace:tigera-operator,Attempt:0,}" Sep 9 00:14:29.058479 containerd[1638]: time="2025-09-09T00:14:29.058148155Z" level=info msg="connecting to shim 7b450a97498cd07a503c8c261e116cbb3a24858b312a68181ea16dfaacba1928" address="unix:///run/containerd/s/d5fa9bc9b49d0f9f89684a90cc48f3c1632981ac1f68a0d35e93cb358c8b0273" namespace=k8s.io protocol=ttrpc version=3 Sep 9 00:14:29.079335 systemd[1]: Started cri-containerd-7b450a97498cd07a503c8c261e116cbb3a24858b312a68181ea16dfaacba1928.scope - libcontainer container 7b450a97498cd07a503c8c261e116cbb3a24858b312a68181ea16dfaacba1928. 
Sep 9 00:14:29.122995 containerd[1638]: time="2025-09-09T00:14:29.122968618Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-qncs2,Uid:bc3ab54c-fd59-4707-9850-e79bfeff6c79,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"7b450a97498cd07a503c8c261e116cbb3a24858b312a68181ea16dfaacba1928\"" Sep 9 00:14:29.124766 containerd[1638]: time="2025-09-09T00:14:29.124749811Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 9 00:14:31.144450 kubelet[2924]: I0909 00:14:31.144182 2924 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-489vk" podStartSLOduration=3.144171107 podStartE2EDuration="3.144171107s" podCreationTimestamp="2025-09-09 00:14:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 00:14:28.860024424 +0000 UTC m=+6.138195181" watchObservedRunningTime="2025-09-09 00:14:31.144171107 +0000 UTC m=+8.422341859" Sep 9 00:14:32.196659 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3849827294.mount: Deactivated successfully. 
Sep 9 00:14:32.785919 containerd[1638]: time="2025-09-09T00:14:32.785848455Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 00:14:32.786459 containerd[1638]: time="2025-09-09T00:14:32.786230947Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609"
Sep 9 00:14:32.786733 containerd[1638]: time="2025-09-09T00:14:32.786719457Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 00:14:32.788074 containerd[1638]: time="2025-09-09T00:14:32.787806878Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 00:14:32.788311 containerd[1638]: time="2025-09-09T00:14:32.788298008Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 3.663530305s"
Sep 9 00:14:32.788364 containerd[1638]: time="2025-09-09T00:14:32.788355907Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\""
Sep 9 00:14:32.790262 containerd[1638]: time="2025-09-09T00:14:32.790235934Z" level=info msg="CreateContainer within sandbox \"7b450a97498cd07a503c8c261e116cbb3a24858b312a68181ea16dfaacba1928\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Sep 9 00:14:32.794947 containerd[1638]: time="2025-09-09T00:14:32.794655528Z" level=info msg="Container 89864a0111907c6199488991560c1db65aa60a1c28d720827444ff351872a542: CDI devices from CRI Config.CDIDevices: []"
Sep 9 00:14:32.799355 containerd[1638]: time="2025-09-09T00:14:32.799327911Z" level=info msg="CreateContainer within sandbox \"7b450a97498cd07a503c8c261e116cbb3a24858b312a68181ea16dfaacba1928\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"89864a0111907c6199488991560c1db65aa60a1c28d720827444ff351872a542\""
Sep 9 00:14:32.799832 containerd[1638]: time="2025-09-09T00:14:32.799811256Z" level=info msg="StartContainer for \"89864a0111907c6199488991560c1db65aa60a1c28d720827444ff351872a542\""
Sep 9 00:14:32.800500 containerd[1638]: time="2025-09-09T00:14:32.800458966Z" level=info msg="connecting to shim 89864a0111907c6199488991560c1db65aa60a1c28d720827444ff351872a542" address="unix:///run/containerd/s/d5fa9bc9b49d0f9f89684a90cc48f3c1632981ac1f68a0d35e93cb358c8b0273" protocol=ttrpc version=3
Sep 9 00:14:32.818261 systemd[1]: Started cri-containerd-89864a0111907c6199488991560c1db65aa60a1c28d720827444ff351872a542.scope - libcontainer container 89864a0111907c6199488991560c1db65aa60a1c28d720827444ff351872a542.
Sep 9 00:14:32.838572 containerd[1638]: time="2025-09-09T00:14:32.838543912Z" level=info msg="StartContainer for \"89864a0111907c6199488991560c1db65aa60a1c28d720827444ff351872a542\" returns successfully"
Sep 9 00:14:33.011843 kubelet[2924]: I0909 00:14:33.011738 2924 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-qncs2" podStartSLOduration=1.344700551 podStartE2EDuration="5.009789806s" podCreationTimestamp="2025-09-09 00:14:28 +0000 UTC" firstStartedPulling="2025-09-09 00:14:29.123805414 +0000 UTC m=+6.401976162" lastFinishedPulling="2025-09-09 00:14:32.788894671 +0000 UTC m=+10.067065417" observedRunningTime="2025-09-09 00:14:32.863245167 +0000 UTC m=+10.141415926" watchObservedRunningTime="2025-09-09 00:14:33.009789806 +0000 UTC m=+10.287960552"
Sep 9 00:14:38.227213 sudo[1958]: pam_unix(sudo:session): session closed for user root
Sep 9 00:14:38.235403 sshd-session[1955]: pam_unix(sshd:session): session closed for user core
Sep 9 00:14:38.235875 sshd[1957]: Connection closed by 139.178.68.195 port 60356
Sep 9 00:14:38.239311 systemd-logind[1603]: Session 9 logged out. Waiting for processes to exit.
Sep 9 00:14:38.240986 systemd[1]: sshd@6-139.178.70.104:22-139.178.68.195:60356.service: Deactivated successfully.
Sep 9 00:14:38.243987 systemd[1]: session-9.scope: Deactivated successfully.
Sep 9 00:14:38.244461 systemd[1]: session-9.scope: Consumed 2.774s CPU time, 152.2M memory peak.
Sep 9 00:14:38.247920 systemd-logind[1603]: Removed session 9.
Sep 9 00:14:40.742555 systemd[1]: Created slice kubepods-besteffort-pode7611204_1f8d_4f00_9794_1bd21461f56f.slice - libcontainer container kubepods-besteffort-pode7611204_1f8d_4f00_9794_1bd21461f56f.slice.
Sep 9 00:14:40.821750 kubelet[2924]: I0909 00:14:40.821639 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k4q6\" (UniqueName: \"kubernetes.io/projected/e7611204-1f8d-4f00-9794-1bd21461f56f-kube-api-access-9k4q6\") pod \"calico-typha-694c8bc46b-8nt59\" (UID: \"e7611204-1f8d-4f00-9794-1bd21461f56f\") " pod="calico-system/calico-typha-694c8bc46b-8nt59" Sep 9 00:14:40.821750 kubelet[2924]: I0909 00:14:40.821689 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7611204-1f8d-4f00-9794-1bd21461f56f-tigera-ca-bundle\") pod \"calico-typha-694c8bc46b-8nt59\" (UID: \"e7611204-1f8d-4f00-9794-1bd21461f56f\") " pod="calico-system/calico-typha-694c8bc46b-8nt59" Sep 9 00:14:40.821750 kubelet[2924]: I0909 00:14:40.821708 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/e7611204-1f8d-4f00-9794-1bd21461f56f-typha-certs\") pod \"calico-typha-694c8bc46b-8nt59\" (UID: \"e7611204-1f8d-4f00-9794-1bd21461f56f\") " pod="calico-system/calico-typha-694c8bc46b-8nt59" Sep 9 00:14:41.018669 systemd[1]: Created slice kubepods-besteffort-pode9c06b72_9487_4815_bfb9_7e4e01054afa.slice - libcontainer container kubepods-besteffort-pode9c06b72_9487_4815_bfb9_7e4e01054afa.slice. 
Sep 9 00:14:41.022906 kubelet[2924]: I0909 00:14:41.022876 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc5f8\" (UniqueName: \"kubernetes.io/projected/e9c06b72-9487-4815-bfb9-7e4e01054afa-kube-api-access-lc5f8\") pod \"calico-node-8nxsr\" (UID: \"e9c06b72-9487-4815-bfb9-7e4e01054afa\") " pod="calico-system/calico-node-8nxsr" Sep 9 00:14:41.022906 kubelet[2924]: I0909 00:14:41.022907 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9c06b72-9487-4815-bfb9-7e4e01054afa-tigera-ca-bundle\") pod \"calico-node-8nxsr\" (UID: \"e9c06b72-9487-4815-bfb9-7e4e01054afa\") " pod="calico-system/calico-node-8nxsr" Sep 9 00:14:41.023035 kubelet[2924]: I0909 00:14:41.022920 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/e9c06b72-9487-4815-bfb9-7e4e01054afa-cni-net-dir\") pod \"calico-node-8nxsr\" (UID: \"e9c06b72-9487-4815-bfb9-7e4e01054afa\") " pod="calico-system/calico-node-8nxsr" Sep 9 00:14:41.023035 kubelet[2924]: I0909 00:14:41.022931 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e9c06b72-9487-4815-bfb9-7e4e01054afa-lib-modules\") pod \"calico-node-8nxsr\" (UID: \"e9c06b72-9487-4815-bfb9-7e4e01054afa\") " pod="calico-system/calico-node-8nxsr" Sep 9 00:14:41.023035 kubelet[2924]: I0909 00:14:41.022941 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/e9c06b72-9487-4815-bfb9-7e4e01054afa-policysync\") pod \"calico-node-8nxsr\" (UID: \"e9c06b72-9487-4815-bfb9-7e4e01054afa\") " pod="calico-system/calico-node-8nxsr" Sep 9 00:14:41.023035 kubelet[2924]: I0909 00:14:41.022953 2924 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/e9c06b72-9487-4815-bfb9-7e4e01054afa-cni-log-dir\") pod \"calico-node-8nxsr\" (UID: \"e9c06b72-9487-4815-bfb9-7e4e01054afa\") " pod="calico-system/calico-node-8nxsr" Sep 9 00:14:41.023035 kubelet[2924]: I0909 00:14:41.022963 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/e9c06b72-9487-4815-bfb9-7e4e01054afa-flexvol-driver-host\") pod \"calico-node-8nxsr\" (UID: \"e9c06b72-9487-4815-bfb9-7e4e01054afa\") " pod="calico-system/calico-node-8nxsr" Sep 9 00:14:41.023172 kubelet[2924]: I0909 00:14:41.022972 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/e9c06b72-9487-4815-bfb9-7e4e01054afa-node-certs\") pod \"calico-node-8nxsr\" (UID: \"e9c06b72-9487-4815-bfb9-7e4e01054afa\") " pod="calico-system/calico-node-8nxsr" Sep 9 00:14:41.023172 kubelet[2924]: I0909 00:14:41.022983 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/e9c06b72-9487-4815-bfb9-7e4e01054afa-var-lib-calico\") pod \"calico-node-8nxsr\" (UID: \"e9c06b72-9487-4815-bfb9-7e4e01054afa\") " pod="calico-system/calico-node-8nxsr" Sep 9 00:14:41.023172 kubelet[2924]: I0909 00:14:41.022997 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/e9c06b72-9487-4815-bfb9-7e4e01054afa-cni-bin-dir\") pod \"calico-node-8nxsr\" (UID: \"e9c06b72-9487-4815-bfb9-7e4e01054afa\") " pod="calico-system/calico-node-8nxsr" Sep 9 00:14:41.023172 kubelet[2924]: I0909 00:14:41.023012 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/e9c06b72-9487-4815-bfb9-7e4e01054afa-xtables-lock\") pod \"calico-node-8nxsr\" (UID: \"e9c06b72-9487-4815-bfb9-7e4e01054afa\") " pod="calico-system/calico-node-8nxsr" Sep 9 00:14:41.023172 kubelet[2924]: I0909 00:14:41.023023 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/e9c06b72-9487-4815-bfb9-7e4e01054afa-var-run-calico\") pod \"calico-node-8nxsr\" (UID: \"e9c06b72-9487-4815-bfb9-7e4e01054afa\") " pod="calico-system/calico-node-8nxsr" Sep 9 00:14:41.049837 containerd[1638]: time="2025-09-09T00:14:41.049814213Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-694c8bc46b-8nt59,Uid:e7611204-1f8d-4f00-9794-1bd21461f56f,Namespace:calico-system,Attempt:0,}" Sep 9 00:14:41.107158 containerd[1638]: time="2025-09-09T00:14:41.107094103Z" level=info msg="connecting to shim 8af7abe76b3ac31431878367850b7588b7a50e2b17ba55755d9e9293c67d39b1" address="unix:///run/containerd/s/bb4d1e0e6c9332651593f0a72799983fc25d7acfd07fb9c955dcbae70ce948de" namespace=k8s.io protocol=ttrpc version=3 Sep 9 00:14:41.125795 kubelet[2924]: E0909 00:14:41.125769 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.125795 kubelet[2924]: W0909 00:14:41.125799 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.125895 kubelet[2924]: E0909 00:14:41.125816 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:41.128297 systemd[1]: Started cri-containerd-8af7abe76b3ac31431878367850b7588b7a50e2b17ba55755d9e9293c67d39b1.scope - libcontainer container 8af7abe76b3ac31431878367850b7588b7a50e2b17ba55755d9e9293c67d39b1. Sep 9 00:14:41.139592 kubelet[2924]: E0909 00:14:41.131867 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.139592 kubelet[2924]: W0909 00:14:41.132361 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.139592 kubelet[2924]: E0909 00:14:41.132390 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:41.139592 kubelet[2924]: E0909 00:14:41.135589 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.139592 kubelet[2924]: W0909 00:14:41.135596 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.139592 kubelet[2924]: E0909 00:14:41.135606 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:41.175686 containerd[1638]: time="2025-09-09T00:14:41.175619393Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-694c8bc46b-8nt59,Uid:e7611204-1f8d-4f00-9794-1bd21461f56f,Namespace:calico-system,Attempt:0,} returns sandbox id \"8af7abe76b3ac31431878367850b7588b7a50e2b17ba55755d9e9293c67d39b1\"" Sep 9 00:14:41.176867 containerd[1638]: time="2025-09-09T00:14:41.176853111Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 9 00:14:41.236129 kubelet[2924]: E0909 00:14:41.236087 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-j6h4t" podUID="2778c8f3-9140-419e-8f7e-95ecd474a55a" Sep 9 00:14:41.316147 kubelet[2924]: E0909 00:14:41.316105 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.316147 kubelet[2924]: W0909 00:14:41.316140 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.316286 kubelet[2924]: E0909 00:14:41.316160 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:41.316286 kubelet[2924]: E0909 00:14:41.316277 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.316286 kubelet[2924]: W0909 00:14:41.316282 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.316338 kubelet[2924]: E0909 00:14:41.316288 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:41.316385 kubelet[2924]: E0909 00:14:41.316372 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.316385 kubelet[2924]: W0909 00:14:41.316381 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.316453 kubelet[2924]: E0909 00:14:41.316387 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:41.316507 kubelet[2924]: E0909 00:14:41.316493 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.316507 kubelet[2924]: W0909 00:14:41.316502 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.316580 kubelet[2924]: E0909 00:14:41.316509 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:41.316624 kubelet[2924]: E0909 00:14:41.316605 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.316624 kubelet[2924]: W0909 00:14:41.316611 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.316624 kubelet[2924]: E0909 00:14:41.316616 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:41.316709 kubelet[2924]: E0909 00:14:41.316697 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.316733 kubelet[2924]: W0909 00:14:41.316709 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.316733 kubelet[2924]: E0909 00:14:41.316718 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:41.316804 kubelet[2924]: E0909 00:14:41.316792 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.316804 kubelet[2924]: W0909 00:14:41.316800 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.316878 kubelet[2924]: E0909 00:14:41.316807 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:41.316900 kubelet[2924]: E0909 00:14:41.316885 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.316900 kubelet[2924]: W0909 00:14:41.316891 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.316900 kubelet[2924]: E0909 00:14:41.316896 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:41.317000 kubelet[2924]: E0909 00:14:41.316981 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.317000 kubelet[2924]: W0909 00:14:41.316987 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.317000 kubelet[2924]: E0909 00:14:41.316992 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:41.317079 kubelet[2924]: E0909 00:14:41.317070 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.317079 kubelet[2924]: W0909 00:14:41.317077 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.317149 kubelet[2924]: E0909 00:14:41.317083 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:41.317180 kubelet[2924]: E0909 00:14:41.317174 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.317204 kubelet[2924]: W0909 00:14:41.317180 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.317204 kubelet[2924]: E0909 00:14:41.317186 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:41.317287 kubelet[2924]: E0909 00:14:41.317274 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.317287 kubelet[2924]: W0909 00:14:41.317283 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.317343 kubelet[2924]: E0909 00:14:41.317291 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:41.317406 kubelet[2924]: E0909 00:14:41.317383 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.317406 kubelet[2924]: W0909 00:14:41.317392 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.317406 kubelet[2924]: E0909 00:14:41.317397 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:41.317485 kubelet[2924]: E0909 00:14:41.317475 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.317485 kubelet[2924]: W0909 00:14:41.317479 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.317485 kubelet[2924]: E0909 00:14:41.317483 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:41.317567 kubelet[2924]: E0909 00:14:41.317561 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.317567 kubelet[2924]: W0909 00:14:41.317566 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.317627 kubelet[2924]: E0909 00:14:41.317572 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:41.317664 kubelet[2924]: E0909 00:14:41.317644 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.317664 kubelet[2924]: W0909 00:14:41.317649 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.317664 kubelet[2924]: E0909 00:14:41.317657 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:41.317751 kubelet[2924]: E0909 00:14:41.317738 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.317751 kubelet[2924]: W0909 00:14:41.317747 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.318229 kubelet[2924]: E0909 00:14:41.317752 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:41.318229 kubelet[2924]: E0909 00:14:41.317832 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.318229 kubelet[2924]: W0909 00:14:41.317836 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.318229 kubelet[2924]: E0909 00:14:41.317840 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:41.318229 kubelet[2924]: E0909 00:14:41.317908 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.318229 kubelet[2924]: W0909 00:14:41.317913 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.318229 kubelet[2924]: E0909 00:14:41.317918 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:41.318501 kubelet[2924]: E0909 00:14:41.318487 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.318501 kubelet[2924]: W0909 00:14:41.318496 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.318561 kubelet[2924]: E0909 00:14:41.318503 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:41.323581 containerd[1638]: time="2025-09-09T00:14:41.323544486Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-8nxsr,Uid:e9c06b72-9487-4815-bfb9-7e4e01054afa,Namespace:calico-system,Attempt:0,}" Sep 9 00:14:41.325365 kubelet[2924]: E0909 00:14:41.325198 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.325365 kubelet[2924]: W0909 00:14:41.325229 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.325365 kubelet[2924]: E0909 00:14:41.325252 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:41.325365 kubelet[2924]: I0909 00:14:41.325279 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgbdj\" (UniqueName: \"kubernetes.io/projected/2778c8f3-9140-419e-8f7e-95ecd474a55a-kube-api-access-hgbdj\") pod \"csi-node-driver-j6h4t\" (UID: \"2778c8f3-9140-419e-8f7e-95ecd474a55a\") " pod="calico-system/csi-node-driver-j6h4t" Sep 9 00:14:41.326694 kubelet[2924]: E0909 00:14:41.326638 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.326694 kubelet[2924]: W0909 00:14:41.326671 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.326837 kubelet[2924]: E0909 00:14:41.326710 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:41.326837 kubelet[2924]: I0909 00:14:41.326744 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2778c8f3-9140-419e-8f7e-95ecd474a55a-registration-dir\") pod \"csi-node-driver-j6h4t\" (UID: \"2778c8f3-9140-419e-8f7e-95ecd474a55a\") " pod="calico-system/csi-node-driver-j6h4t" Sep 9 00:14:41.326993 kubelet[2924]: E0909 00:14:41.326979 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.326993 kubelet[2924]: W0909 00:14:41.326989 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.327058 kubelet[2924]: E0909 00:14:41.327002 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:41.327058 kubelet[2924]: I0909 00:14:41.327019 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/2778c8f3-9140-419e-8f7e-95ecd474a55a-varrun\") pod \"csi-node-driver-j6h4t\" (UID: \"2778c8f3-9140-419e-8f7e-95ecd474a55a\") " pod="calico-system/csi-node-driver-j6h4t" Sep 9 00:14:41.327584 kubelet[2924]: E0909 00:14:41.327562 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.327631 kubelet[2924]: W0909 00:14:41.327601 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.327631 kubelet[2924]: E0909 00:14:41.327617 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:41.327690 kubelet[2924]: I0909 00:14:41.327636 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2778c8f3-9140-419e-8f7e-95ecd474a55a-kubelet-dir\") pod \"csi-node-driver-j6h4t\" (UID: \"2778c8f3-9140-419e-8f7e-95ecd474a55a\") " pod="calico-system/csi-node-driver-j6h4t" Sep 9 00:14:41.328270 kubelet[2924]: E0909 00:14:41.328251 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.328270 kubelet[2924]: W0909 00:14:41.328266 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.329359 kubelet[2924]: E0909 00:14:41.328376 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:41.329359 kubelet[2924]: I0909 00:14:41.328412 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2778c8f3-9140-419e-8f7e-95ecd474a55a-socket-dir\") pod \"csi-node-driver-j6h4t\" (UID: \"2778c8f3-9140-419e-8f7e-95ecd474a55a\") " pod="calico-system/csi-node-driver-j6h4t" Sep 9 00:14:41.331834 kubelet[2924]: E0909 00:14:41.330367 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.331834 kubelet[2924]: W0909 00:14:41.330411 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.331834 kubelet[2924]: E0909 00:14:41.330434 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:41.331834 kubelet[2924]: E0909 00:14:41.330544 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.331834 kubelet[2924]: W0909 00:14:41.330550 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.331834 kubelet[2924]: E0909 00:14:41.330557 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:41.331834 kubelet[2924]: E0909 00:14:41.330694 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.331834 kubelet[2924]: W0909 00:14:41.330701 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.331834 kubelet[2924]: E0909 00:14:41.330708 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:41.331834 kubelet[2924]: E0909 00:14:41.330826 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.332280 kubelet[2924]: W0909 00:14:41.330833 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.332280 kubelet[2924]: E0909 00:14:41.330845 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:41.332280 kubelet[2924]: E0909 00:14:41.330979 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.332280 kubelet[2924]: W0909 00:14:41.330989 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.332280 kubelet[2924]: E0909 00:14:41.331002 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:41.332280 kubelet[2924]: E0909 00:14:41.331110 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.332280 kubelet[2924]: W0909 00:14:41.331150 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.332280 kubelet[2924]: E0909 00:14:41.331161 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:41.332280 kubelet[2924]: E0909 00:14:41.331276 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.332280 kubelet[2924]: W0909 00:14:41.331301 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.332538 kubelet[2924]: E0909 00:14:41.331309 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:41.332538 kubelet[2924]: E0909 00:14:41.331454 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.332538 kubelet[2924]: W0909 00:14:41.331461 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.332538 kubelet[2924]: E0909 00:14:41.331469 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:41.332538 kubelet[2924]: E0909 00:14:41.331593 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.332538 kubelet[2924]: W0909 00:14:41.331599 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.332538 kubelet[2924]: E0909 00:14:41.331606 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:41.332538 kubelet[2924]: E0909 00:14:41.331695 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.332538 kubelet[2924]: W0909 00:14:41.331701 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.332538 kubelet[2924]: E0909 00:14:41.331709 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:41.381462 containerd[1638]: time="2025-09-09T00:14:41.380912349Z" level=info msg="connecting to shim 5900bada00b7e1b4e65700bfcbf620a4cf384cbb50a9e049fc8fe99f3bab0815" address="unix:///run/containerd/s/dd1eee1ab0f1cd232cf7c7dc576d74e286f8ae217287d60b4ef0d4b5ab096b45" namespace=k8s.io protocol=ttrpc version=3 Sep 9 00:14:41.428961 kubelet[2924]: E0909 00:14:41.428932 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.428961 kubelet[2924]: W0909 00:14:41.428950 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.428961 kubelet[2924]: E0909 00:14:41.428966 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:41.429172 kubelet[2924]: E0909 00:14:41.429158 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.429172 kubelet[2924]: W0909 00:14:41.429169 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.429236 kubelet[2924]: E0909 00:14:41.429191 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:41.429356 kubelet[2924]: E0909 00:14:41.429342 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.429356 kubelet[2924]: W0909 00:14:41.429350 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.429408 kubelet[2924]: E0909 00:14:41.429361 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:41.429522 kubelet[2924]: E0909 00:14:41.429509 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.429522 kubelet[2924]: W0909 00:14:41.429519 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.429564 kubelet[2924]: E0909 00:14:41.429533 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:41.429653 kubelet[2924]: E0909 00:14:41.429642 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.429653 kubelet[2924]: W0909 00:14:41.429650 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.429691 kubelet[2924]: E0909 00:14:41.429658 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:41.429756 kubelet[2924]: E0909 00:14:41.429745 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.429756 kubelet[2924]: W0909 00:14:41.429752 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.429795 kubelet[2924]: E0909 00:14:41.429764 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:41.429876 kubelet[2924]: E0909 00:14:41.429865 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.429876 kubelet[2924]: W0909 00:14:41.429873 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.429926 kubelet[2924]: E0909 00:14:41.429881 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:41.430027 kubelet[2924]: E0909 00:14:41.429983 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.430027 kubelet[2924]: W0909 00:14:41.429989 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.430027 kubelet[2924]: E0909 00:14:41.429998 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:41.430189 kubelet[2924]: E0909 00:14:41.430134 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.430189 kubelet[2924]: W0909 00:14:41.430139 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.430189 kubelet[2924]: E0909 00:14:41.430152 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:41.430281 kubelet[2924]: E0909 00:14:41.430269 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.430281 kubelet[2924]: W0909 00:14:41.430277 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.430328 kubelet[2924]: E0909 00:14:41.430287 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:41.430423 kubelet[2924]: E0909 00:14:41.430409 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.442953 kubelet[2924]: W0909 00:14:41.430438 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.442953 kubelet[2924]: E0909 00:14:41.430450 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:41.442953 kubelet[2924]: E0909 00:14:41.430564 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.442953 kubelet[2924]: W0909 00:14:41.430569 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.442953 kubelet[2924]: E0909 00:14:41.430574 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:41.442953 kubelet[2924]: E0909 00:14:41.430707 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.442953 kubelet[2924]: W0909 00:14:41.430712 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.442953 kubelet[2924]: E0909 00:14:41.430719 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:41.442953 kubelet[2924]: E0909 00:14:41.430799 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.442953 kubelet[2924]: W0909 00:14:41.430804 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.432508 systemd[1]: Started cri-containerd-5900bada00b7e1b4e65700bfcbf620a4cf384cbb50a9e049fc8fe99f3bab0815.scope - libcontainer container 5900bada00b7e1b4e65700bfcbf620a4cf384cbb50a9e049fc8fe99f3bab0815. Sep 9 00:14:41.443198 kubelet[2924]: E0909 00:14:41.430808 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:41.443198 kubelet[2924]: E0909 00:14:41.430909 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.443198 kubelet[2924]: W0909 00:14:41.430915 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.443198 kubelet[2924]: E0909 00:14:41.430998 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:41.443198 kubelet[2924]: E0909 00:14:41.431096 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.443198 kubelet[2924]: W0909 00:14:41.431102 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.443198 kubelet[2924]: E0909 00:14:41.431158 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:41.443198 kubelet[2924]: E0909 00:14:41.431283 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.443198 kubelet[2924]: W0909 00:14:41.431287 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.443198 kubelet[2924]: E0909 00:14:41.431311 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:41.443354 kubelet[2924]: E0909 00:14:41.431418 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.443354 kubelet[2924]: W0909 00:14:41.431425 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.443354 kubelet[2924]: E0909 00:14:41.431478 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:41.443354 kubelet[2924]: E0909 00:14:41.431646 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.443354 kubelet[2924]: W0909 00:14:41.431652 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.443354 kubelet[2924]: E0909 00:14:41.431708 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:41.443354 kubelet[2924]: E0909 00:14:41.431743 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.443354 kubelet[2924]: W0909 00:14:41.431747 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.443354 kubelet[2924]: E0909 00:14:41.431752 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:41.443354 kubelet[2924]: E0909 00:14:41.431851 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.443557 kubelet[2924]: W0909 00:14:41.431855 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.443557 kubelet[2924]: E0909 00:14:41.431884 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:41.443557 kubelet[2924]: E0909 00:14:41.432051 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:41.443557 kubelet[2924]: W0909 00:14:41.432056 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:41.443557 kubelet[2924]: E0909 00:14:41.432063 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 00:14:41.443557 kubelet[2924]: E0909 00:14:41.432364 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:41.443557 kubelet[2924]: W0909 00:14:41.432369 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:41.443557 kubelet[2924]: E0909 00:14:41.432379 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 00:14:41.443557 kubelet[2924]: E0909 00:14:41.432766 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:41.443557 kubelet[2924]: W0909 00:14:41.432771 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:41.443791 kubelet[2924]: E0909 00:14:41.432780 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 00:14:41.450450 kubelet[2924]: E0909 00:14:41.450237 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:41.450450 kubelet[2924]: W0909 00:14:41.450254 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:41.453198 kubelet[2924]: E0909 00:14:41.450568 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 00:14:41.477672 kubelet[2924]: E0909 00:14:41.477513 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:41.477672 kubelet[2924]: W0909 00:14:41.477527 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:41.477672 kubelet[2924]: E0909 00:14:41.477632 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 00:14:41.485021 containerd[1638]: time="2025-09-09T00:14:41.484928741Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-8nxsr,Uid:e9c06b72-9487-4815-bfb9-7e4e01054afa,Namespace:calico-system,Attempt:0,} returns sandbox id \"5900bada00b7e1b4e65700bfcbf620a4cf384cbb50a9e049fc8fe99f3bab0815\""
Sep 9 00:14:42.710229 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3307889262.mount: Deactivated successfully. 
Sep 9 00:14:42.820931 kubelet[2924]: E0909 00:14:42.820883 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-j6h4t" podUID="2778c8f3-9140-419e-8f7e-95ecd474a55a"
Sep 9 00:14:43.572066 containerd[1638]: time="2025-09-09T00:14:43.571669185Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 00:14:43.604012 containerd[1638]: time="2025-09-09T00:14:43.603982746Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389"
Sep 9 00:14:43.636764 containerd[1638]: time="2025-09-09T00:14:43.636721890Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 00:14:43.661949 containerd[1638]: time="2025-09-09T00:14:43.661905952Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 00:14:43.662535 containerd[1638]: time="2025-09-09T00:14:43.662274813Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 2.48532239s"
Sep 9 00:14:43.662535 containerd[1638]: time="2025-09-09T00:14:43.662308494Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference 
\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\""
Sep 9 00:14:43.663143 containerd[1638]: time="2025-09-09T00:14:43.663113593Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\""
Sep 9 00:14:43.679377 containerd[1638]: time="2025-09-09T00:14:43.679305195Z" level=info msg="CreateContainer within sandbox \"8af7abe76b3ac31431878367850b7588b7a50e2b17ba55755d9e9293c67d39b1\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Sep 9 00:14:43.733913 containerd[1638]: time="2025-09-09T00:14:43.733224206Z" level=info msg="Container 3bc099882aa8bd920cbf87799fb160db5aaab566405d2e09ce18d6b5c6bfce08: CDI devices from CRI Config.CDIDevices: []"
Sep 9 00:14:43.735827 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1284474901.mount: Deactivated successfully.
Sep 9 00:14:43.760957 containerd[1638]: time="2025-09-09T00:14:43.760749171Z" level=info msg="CreateContainer within sandbox \"8af7abe76b3ac31431878367850b7588b7a50e2b17ba55755d9e9293c67d39b1\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"3bc099882aa8bd920cbf87799fb160db5aaab566405d2e09ce18d6b5c6bfce08\""
Sep 9 00:14:43.761646 containerd[1638]: time="2025-09-09T00:14:43.761620427Z" level=info msg="StartContainer for \"3bc099882aa8bd920cbf87799fb160db5aaab566405d2e09ce18d6b5c6bfce08\""
Sep 9 00:14:43.762822 containerd[1638]: time="2025-09-09T00:14:43.762800445Z" level=info msg="connecting to shim 3bc099882aa8bd920cbf87799fb160db5aaab566405d2e09ce18d6b5c6bfce08" address="unix:///run/containerd/s/bb4d1e0e6c9332651593f0a72799983fc25d7acfd07fb9c955dcbae70ce948de" protocol=ttrpc version=3
Sep 9 00:14:43.781307 systemd[1]: Started cri-containerd-3bc099882aa8bd920cbf87799fb160db5aaab566405d2e09ce18d6b5c6bfce08.scope - libcontainer container 3bc099882aa8bd920cbf87799fb160db5aaab566405d2e09ce18d6b5c6bfce08. 
Sep 9 00:14:43.835297 containerd[1638]: time="2025-09-09T00:14:43.835195158Z" level=info msg="StartContainer for \"3bc099882aa8bd920cbf87799fb160db5aaab566405d2e09ce18d6b5c6bfce08\" returns successfully"
Sep 9 00:14:43.936684 kubelet[2924]: E0909 00:14:43.936522 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:43.936684 kubelet[2924]: W0909 00:14:43.936541 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:43.936684 kubelet[2924]: E0909 00:14:43.936558 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 00:14:43.938299 kubelet[2924]: E0909 00:14:43.938233 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:43.938299 kubelet[2924]: W0909 00:14:43.938245 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:43.938299 kubelet[2924]: E0909 00:14:43.938255 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 00:14:43.938590 kubelet[2924]: E0909 00:14:43.938523 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:43.938590 kubelet[2924]: W0909 00:14:43.938532 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:43.938590 kubelet[2924]: E0909 00:14:43.938542 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 00:14:43.939281 kubelet[2924]: E0909 00:14:43.939218 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:43.939281 kubelet[2924]: W0909 00:14:43.939228 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:43.939281 kubelet[2924]: E0909 00:14:43.939236 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 00:14:43.939498 kubelet[2924]: E0909 00:14:43.939463 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:43.939498 kubelet[2924]: W0909 00:14:43.939470 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:43.939498 kubelet[2924]: E0909 00:14:43.939476 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 00:14:43.939967 kubelet[2924]: E0909 00:14:43.939719 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:43.939967 kubelet[2924]: W0909 00:14:43.939726 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:43.939967 kubelet[2924]: E0909 00:14:43.939735 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 00:14:43.939967 kubelet[2924]: E0909 00:14:43.939857 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:43.939967 kubelet[2924]: W0909 00:14:43.939864 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:43.939967 kubelet[2924]: E0909 00:14:43.939873 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 00:14:43.941139 kubelet[2924]: E0909 00:14:43.940595 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:43.941139 kubelet[2924]: W0909 00:14:43.940605 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:43.941139 kubelet[2924]: E0909 00:14:43.940615 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 00:14:43.941398 kubelet[2924]: E0909 00:14:43.941353 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:43.941398 kubelet[2924]: W0909 00:14:43.941365 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:43.941398 kubelet[2924]: E0909 00:14:43.941374 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 00:14:43.941604 kubelet[2924]: E0909 00:14:43.941566 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:43.941604 kubelet[2924]: W0909 00:14:43.941573 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:43.941604 kubelet[2924]: E0909 00:14:43.941579 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 00:14:43.941784 kubelet[2924]: E0909 00:14:43.941776 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:43.941904 kubelet[2924]: W0909 00:14:43.941854 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:43.941904 kubelet[2924]: E0909 00:14:43.941873 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 00:14:43.942107 kubelet[2924]: E0909 00:14:43.942056 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:43.942107 kubelet[2924]: W0909 00:14:43.942066 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:43.942107 kubelet[2924]: E0909 00:14:43.942072 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 00:14:43.942867 kubelet[2924]: E0909 00:14:43.942820 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:43.942867 kubelet[2924]: W0909 00:14:43.942829 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:43.942867 kubelet[2924]: E0909 00:14:43.942838 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 00:14:43.943094 kubelet[2924]: E0909 00:14:43.943050 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:43.943094 kubelet[2924]: W0909 00:14:43.943057 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:43.943094 kubelet[2924]: E0909 00:14:43.943065 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 00:14:43.943317 kubelet[2924]: E0909 00:14:43.943266 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:43.943317 kubelet[2924]: W0909 00:14:43.943274 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:43.943317 kubelet[2924]: E0909 00:14:43.943280 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 00:14:43.945734 kubelet[2924]: E0909 00:14:43.945672 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:43.945734 kubelet[2924]: W0909 00:14:43.945686 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:43.945734 kubelet[2924]: E0909 00:14:43.945701 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 00:14:43.946408 kubelet[2924]: E0909 00:14:43.946390 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:43.946408 kubelet[2924]: W0909 00:14:43.946399 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:43.946566 kubelet[2924]: E0909 00:14:43.946515 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 00:14:43.946726 kubelet[2924]: E0909 00:14:43.946710 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:43.946726 kubelet[2924]: W0909 00:14:43.946717 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:43.946854 kubelet[2924]: E0909 00:14:43.946801 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 00:14:43.947414 kubelet[2924]: E0909 00:14:43.946986 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:43.947414 kubelet[2924]: W0909 00:14:43.946995 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:43.947578 kubelet[2924]: E0909 00:14:43.947501 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 00:14:43.947659 kubelet[2924]: E0909 00:14:43.947652 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:43.947720 kubelet[2924]: W0909 00:14:43.947695 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:43.947792 kubelet[2924]: E0909 00:14:43.947752 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 00:14:43.947848 kubelet[2924]: E0909 00:14:43.947843 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:43.947897 kubelet[2924]: W0909 00:14:43.947887 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:43.948031 kubelet[2924]: E0909 00:14:43.948007 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 00:14:43.948151 kubelet[2924]: E0909 00:14:43.948132 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:43.948151 kubelet[2924]: W0909 00:14:43.948140 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:43.948281 kubelet[2924]: E0909 00:14:43.948269 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 00:14:43.948351 kubelet[2924]: E0909 00:14:43.948331 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:43.948443 kubelet[2924]: W0909 00:14:43.948387 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:43.948443 kubelet[2924]: E0909 00:14:43.948398 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 00:14:43.948685 kubelet[2924]: E0909 00:14:43.948597 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:43.948685 kubelet[2924]: W0909 00:14:43.948673 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:43.948829 kubelet[2924]: E0909 00:14:43.948769 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 00:14:43.949203 kubelet[2924]: E0909 00:14:43.949095 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:43.949203 kubelet[2924]: W0909 00:14:43.949103 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:43.949822 kubelet[2924]: E0909 00:14:43.949110 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 00:14:43.949956 kubelet[2924]: E0909 00:14:43.949933 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:43.950012 kubelet[2924]: W0909 00:14:43.949994 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:43.950199 kubelet[2924]: E0909 00:14:43.950173 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 00:14:43.950303 kubelet[2924]: E0909 00:14:43.950284 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:43.950303 kubelet[2924]: W0909 00:14:43.950292 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:43.950439 kubelet[2924]: E0909 00:14:43.950384 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 00:14:43.950571 kubelet[2924]: E0909 00:14:43.950557 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:43.950571 kubelet[2924]: W0909 00:14:43.950564 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:43.950924 kubelet[2924]: E0909 00:14:43.950683 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 00:14:43.951019 kubelet[2924]: E0909 00:14:43.951002 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:43.951209 kubelet[2924]: W0909 00:14:43.951200 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:43.951256 kubelet[2924]: E0909 00:14:43.951248 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 00:14:43.951375 kubelet[2924]: E0909 00:14:43.951367 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:43.951417 kubelet[2924]: W0909 00:14:43.951411 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:43.951463 kubelet[2924]: E0909 00:14:43.951454 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 00:14:43.951730 kubelet[2924]: E0909 00:14:43.951641 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:43.951730 kubelet[2924]: W0909 00:14:43.951648 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:43.951730 kubelet[2924]: E0909 00:14:43.951654 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 00:14:43.952069 kubelet[2924]: E0909 00:14:43.952015 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:43.952069 kubelet[2924]: W0909 00:14:43.952036 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:43.952069 kubelet[2924]: E0909 00:14:43.952047 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 00:14:43.953633 kubelet[2924]: E0909 00:14:43.953586 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:43.953633 kubelet[2924]: W0909 00:14:43.953599 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:43.953633 kubelet[2924]: E0909 00:14:43.953611 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 00:14:44.815727 kubelet[2924]: E0909 00:14:44.815656 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-j6h4t" podUID="2778c8f3-9140-419e-8f7e-95ecd474a55a"
Sep 9 00:14:44.903198 kubelet[2924]: I0909 00:14:44.903168 2924 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 9 00:14:44.949788 kubelet[2924]: E0909 00:14:44.949764 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:44.950088 kubelet[2924]: W0909 00:14:44.949839 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:44.950088 kubelet[2924]: E0909 00:14:44.949854 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 00:14:44.950088 kubelet[2924]: E0909 00:14:44.949954 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:44.950088 kubelet[2924]: W0909 00:14:44.949959 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:44.950088 kubelet[2924]: E0909 00:14:44.949965 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 00:14:44.950088 kubelet[2924]: E0909 00:14:44.950050 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 00:14:44.950088 kubelet[2924]: W0909 00:14:44.950055 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 00:14:44.950088 kubelet[2924]: E0909 00:14:44.950060 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:44.950595 kubelet[2924]: E0909 00:14:44.950180 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:44.950595 kubelet[2924]: W0909 00:14:44.950184 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:44.950595 kubelet[2924]: E0909 00:14:44.950189 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:44.950595 kubelet[2924]: E0909 00:14:44.950269 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:44.950595 kubelet[2924]: W0909 00:14:44.950274 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:44.950595 kubelet[2924]: E0909 00:14:44.950278 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:44.951197 kubelet[2924]: E0909 00:14:44.951148 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:44.951197 kubelet[2924]: W0909 00:14:44.951158 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:44.951197 kubelet[2924]: E0909 00:14:44.951175 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:44.951575 kubelet[2924]: E0909 00:14:44.951559 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:44.951575 kubelet[2924]: W0909 00:14:44.951568 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:44.951575 kubelet[2924]: E0909 00:14:44.951575 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:44.951673 kubelet[2924]: E0909 00:14:44.951663 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:44.951673 kubelet[2924]: W0909 00:14:44.951670 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:44.951806 kubelet[2924]: E0909 00:14:44.951675 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:44.951806 kubelet[2924]: E0909 00:14:44.951766 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:44.951806 kubelet[2924]: W0909 00:14:44.951771 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:44.951806 kubelet[2924]: E0909 00:14:44.951775 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:44.951882 kubelet[2924]: E0909 00:14:44.951860 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:44.951882 kubelet[2924]: W0909 00:14:44.951865 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:44.951882 kubelet[2924]: E0909 00:14:44.951870 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:44.951986 kubelet[2924]: E0909 00:14:44.951940 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:44.951986 kubelet[2924]: W0909 00:14:44.951946 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:44.951986 kubelet[2924]: E0909 00:14:44.951951 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:44.952159 kubelet[2924]: E0909 00:14:44.952146 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:44.952159 kubelet[2924]: W0909 00:14:44.952155 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:44.952221 kubelet[2924]: E0909 00:14:44.952161 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:44.952842 kubelet[2924]: E0909 00:14:44.952829 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:44.952842 kubelet[2924]: W0909 00:14:44.952839 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:44.952924 kubelet[2924]: E0909 00:14:44.952846 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:44.953198 kubelet[2924]: E0909 00:14:44.952953 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:44.953198 kubelet[2924]: W0909 00:14:44.952960 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:44.953198 kubelet[2924]: E0909 00:14:44.952971 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:44.953198 kubelet[2924]: E0909 00:14:44.953147 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:44.953198 kubelet[2924]: W0909 00:14:44.953161 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:44.953198 kubelet[2924]: E0909 00:14:44.953180 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:44.953369 kubelet[2924]: E0909 00:14:44.953361 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:44.953369 kubelet[2924]: W0909 00:14:44.953367 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:44.953417 kubelet[2924]: E0909 00:14:44.953373 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:44.953586 kubelet[2924]: E0909 00:14:44.953492 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:44.953586 kubelet[2924]: W0909 00:14:44.953499 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:44.953586 kubelet[2924]: E0909 00:14:44.953506 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:44.953831 kubelet[2924]: E0909 00:14:44.953751 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:44.953831 kubelet[2924]: W0909 00:14:44.953760 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:44.953831 kubelet[2924]: E0909 00:14:44.953772 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:44.953913 kubelet[2924]: E0909 00:14:44.953907 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:44.954001 kubelet[2924]: W0909 00:14:44.953941 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:44.954001 kubelet[2924]: E0909 00:14:44.953953 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:44.954102 kubelet[2924]: E0909 00:14:44.954093 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:44.954216 kubelet[2924]: W0909 00:14:44.954143 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:44.954216 kubelet[2924]: E0909 00:14:44.954155 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:44.954306 kubelet[2924]: E0909 00:14:44.954299 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:44.954539 kubelet[2924]: W0909 00:14:44.954336 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:44.954539 kubelet[2924]: E0909 00:14:44.954352 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:44.954539 kubelet[2924]: E0909 00:14:44.954482 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:44.954539 kubelet[2924]: W0909 00:14:44.954487 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:44.954539 kubelet[2924]: E0909 00:14:44.954493 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:44.954658 kubelet[2924]: E0909 00:14:44.954588 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:44.954658 kubelet[2924]: W0909 00:14:44.954592 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:44.954658 kubelet[2924]: E0909 00:14:44.954604 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:44.954723 kubelet[2924]: E0909 00:14:44.954687 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:44.954723 kubelet[2924]: W0909 00:14:44.954692 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:44.954723 kubelet[2924]: E0909 00:14:44.954705 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:44.954792 kubelet[2924]: E0909 00:14:44.954780 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:44.954792 kubelet[2924]: W0909 00:14:44.954789 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:44.954834 kubelet[2924]: E0909 00:14:44.954803 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:44.954908 kubelet[2924]: E0909 00:14:44.954897 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:44.954908 kubelet[2924]: W0909 00:14:44.954905 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:44.954974 kubelet[2924]: E0909 00:14:44.954914 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:44.955126 kubelet[2924]: E0909 00:14:44.955088 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:44.955126 kubelet[2924]: W0909 00:14:44.955095 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:44.955186 kubelet[2924]: E0909 00:14:44.955179 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:44.955228 kubelet[2924]: E0909 00:14:44.955201 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:44.955259 kubelet[2924]: W0909 00:14:44.955254 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:44.955295 kubelet[2924]: E0909 00:14:44.955290 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:44.955418 kubelet[2924]: E0909 00:14:44.955396 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:44.955418 kubelet[2924]: W0909 00:14:44.955402 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:44.955418 kubelet[2924]: E0909 00:14:44.955410 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:44.955493 kubelet[2924]: E0909 00:14:44.955481 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:44.955493 kubelet[2924]: W0909 00:14:44.955489 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:44.955536 kubelet[2924]: E0909 00:14:44.955497 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:44.955660 kubelet[2924]: E0909 00:14:44.955649 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:44.955660 kubelet[2924]: W0909 00:14:44.955658 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:44.955716 kubelet[2924]: E0909 00:14:44.955664 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:44.955757 kubelet[2924]: E0909 00:14:44.955747 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:44.955757 kubelet[2924]: W0909 00:14:44.955754 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:44.955793 kubelet[2924]: E0909 00:14:44.955759 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 00:14:44.955914 kubelet[2924]: E0909 00:14:44.955904 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 00:14:44.955914 kubelet[2924]: W0909 00:14:44.955912 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 00:14:44.955955 kubelet[2924]: E0909 00:14:44.955917 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 00:14:45.226545 containerd[1638]: time="2025-09-09T00:14:45.225928872Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 00:14:45.233399 containerd[1638]: time="2025-09-09T00:14:45.233371332Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 9 00:14:45.242101 containerd[1638]: time="2025-09-09T00:14:45.242065329Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 00:14:45.250451 containerd[1638]: time="2025-09-09T00:14:45.249994758Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 00:14:45.252703 containerd[1638]: time="2025-09-09T00:14:45.252646320Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.589500309s" Sep 9 00:14:45.252703 containerd[1638]: time="2025-09-09T00:14:45.252666431Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 9 00:14:45.261559 containerd[1638]: time="2025-09-09T00:14:45.261523660Z" level=info msg="CreateContainer within sandbox \"5900bada00b7e1b4e65700bfcbf620a4cf384cbb50a9e049fc8fe99f3bab0815\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 9 00:14:45.320992 containerd[1638]: time="2025-09-09T00:14:45.320328309Z" level=info msg="Container 2eae943beac09a7d23ab9e6c519cddb321b4298c30ac65c12ee36e245c671a37: CDI devices from CRI Config.CDIDevices: []" Sep 9 00:14:45.523420 containerd[1638]: time="2025-09-09T00:14:45.523392427Z" level=info msg="CreateContainer within sandbox \"5900bada00b7e1b4e65700bfcbf620a4cf384cbb50a9e049fc8fe99f3bab0815\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"2eae943beac09a7d23ab9e6c519cddb321b4298c30ac65c12ee36e245c671a37\"" Sep 9 00:14:45.524677 containerd[1638]: time="2025-09-09T00:14:45.524644024Z" level=info msg="StartContainer for \"2eae943beac09a7d23ab9e6c519cddb321b4298c30ac65c12ee36e245c671a37\"" Sep 9 00:14:45.525687 containerd[1638]: time="2025-09-09T00:14:45.525664891Z" level=info msg="connecting to shim 2eae943beac09a7d23ab9e6c519cddb321b4298c30ac65c12ee36e245c671a37" address="unix:///run/containerd/s/dd1eee1ab0f1cd232cf7c7dc576d74e286f8ae217287d60b4ef0d4b5ab096b45" protocol=ttrpc version=3 Sep 9 00:14:45.544238 systemd[1]: Started cri-containerd-2eae943beac09a7d23ab9e6c519cddb321b4298c30ac65c12ee36e245c671a37.scope - libcontainer container 2eae943beac09a7d23ab9e6c519cddb321b4298c30ac65c12ee36e245c671a37. Sep 9 00:14:45.583332 systemd[1]: cri-containerd-2eae943beac09a7d23ab9e6c519cddb321b4298c30ac65c12ee36e245c671a37.scope: Deactivated successfully. 
Sep 9 00:14:45.587108 containerd[1638]: time="2025-09-09T00:14:45.587085387Z" level=info msg="StartContainer for \"2eae943beac09a7d23ab9e6c519cddb321b4298c30ac65c12ee36e245c671a37\" returns successfully" Sep 9 00:14:45.642524 containerd[1638]: time="2025-09-09T00:14:45.642487083Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2eae943beac09a7d23ab9e6c519cddb321b4298c30ac65c12ee36e245c671a37\" id:\"2eae943beac09a7d23ab9e6c519cddb321b4298c30ac65c12ee36e245c671a37\" pid:3614 exited_at:{seconds:1757376885 nanos:584795840}" Sep 9 00:14:45.647665 containerd[1638]: time="2025-09-09T00:14:45.647634440Z" level=info msg="received exit event container_id:\"2eae943beac09a7d23ab9e6c519cddb321b4298c30ac65c12ee36e245c671a37\" id:\"2eae943beac09a7d23ab9e6c519cddb321b4298c30ac65c12ee36e245c671a37\" pid:3614 exited_at:{seconds:1757376885 nanos:584795840}" Sep 9 00:14:45.665126 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2eae943beac09a7d23ab9e6c519cddb321b4298c30ac65c12ee36e245c671a37-rootfs.mount: Deactivated successfully. 
Sep 9 00:14:45.947234 kubelet[2924]: I0909 00:14:45.947088 2924 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-694c8bc46b-8nt59" podStartSLOduration=3.460636833 podStartE2EDuration="5.947076798s" podCreationTimestamp="2025-09-09 00:14:40 +0000 UTC" firstStartedPulling="2025-09-09 00:14:41.176425844 +0000 UTC m=+18.454596592" lastFinishedPulling="2025-09-09 00:14:43.662865804 +0000 UTC m=+20.941036557" observedRunningTime="2025-09-09 00:14:43.925856923 +0000 UTC m=+21.204027681" watchObservedRunningTime="2025-09-09 00:14:45.947076798 +0000 UTC m=+23.225247555" Sep 9 00:14:46.815289 kubelet[2924]: E0909 00:14:46.815245 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-j6h4t" podUID="2778c8f3-9140-419e-8f7e-95ecd474a55a" Sep 9 00:14:46.910024 containerd[1638]: time="2025-09-09T00:14:46.909731009Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 9 00:14:48.814835 kubelet[2924]: E0909 00:14:48.814547 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-j6h4t" podUID="2778c8f3-9140-419e-8f7e-95ecd474a55a" Sep 9 00:14:50.814519 kubelet[2924]: E0909 00:14:50.814477 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-j6h4t" podUID="2778c8f3-9140-419e-8f7e-95ecd474a55a" Sep 9 00:14:52.048225 kubelet[2924]: I0909 00:14:52.047890 2924 prober_manager.go:312] 
"Failed to trigger a manual run" probe="Readiness" Sep 9 00:14:52.121109 containerd[1638]: time="2025-09-09T00:14:52.121072463Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 00:14:52.172320 containerd[1638]: time="2025-09-09T00:14:52.172285358Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 9 00:14:52.216252 containerd[1638]: time="2025-09-09T00:14:52.216207214Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 00:14:52.244344 containerd[1638]: time="2025-09-09T00:14:52.244287738Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 00:14:52.245016 containerd[1638]: time="2025-09-09T00:14:52.244769951Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 5.335013366s" Sep 9 00:14:52.245016 containerd[1638]: time="2025-09-09T00:14:52.244791868Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 9 00:14:52.259286 containerd[1638]: time="2025-09-09T00:14:52.259257697Z" level=info msg="CreateContainer within sandbox \"5900bada00b7e1b4e65700bfcbf620a4cf384cbb50a9e049fc8fe99f3bab0815\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 9 00:14:52.459922 containerd[1638]: 
time="2025-09-09T00:14:52.459853819Z" level=info msg="Container 2d3b9c896cb548ae680f173a486ece97c4d6b29b9c2eadda7cf504aad5650347: CDI devices from CRI Config.CDIDevices: []" Sep 9 00:14:52.462717 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1249524482.mount: Deactivated successfully. Sep 9 00:14:52.495554 containerd[1638]: time="2025-09-09T00:14:52.495521417Z" level=info msg="CreateContainer within sandbox \"5900bada00b7e1b4e65700bfcbf620a4cf384cbb50a9e049fc8fe99f3bab0815\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"2d3b9c896cb548ae680f173a486ece97c4d6b29b9c2eadda7cf504aad5650347\"" Sep 9 00:14:52.496070 containerd[1638]: time="2025-09-09T00:14:52.496048995Z" level=info msg="StartContainer for \"2d3b9c896cb548ae680f173a486ece97c4d6b29b9c2eadda7cf504aad5650347\"" Sep 9 00:14:52.497091 containerd[1638]: time="2025-09-09T00:14:52.497066136Z" level=info msg="connecting to shim 2d3b9c896cb548ae680f173a486ece97c4d6b29b9c2eadda7cf504aad5650347" address="unix:///run/containerd/s/dd1eee1ab0f1cd232cf7c7dc576d74e286f8ae217287d60b4ef0d4b5ab096b45" protocol=ttrpc version=3 Sep 9 00:14:52.521322 systemd[1]: Started cri-containerd-2d3b9c896cb548ae680f173a486ece97c4d6b29b9c2eadda7cf504aad5650347.scope - libcontainer container 2d3b9c896cb548ae680f173a486ece97c4d6b29b9c2eadda7cf504aad5650347. 
Sep 9 00:14:52.586848 containerd[1638]: time="2025-09-09T00:14:52.586757621Z" level=info msg="StartContainer for \"2d3b9c896cb548ae680f173a486ece97c4d6b29b9c2eadda7cf504aad5650347\" returns successfully" Sep 9 00:14:52.815382 kubelet[2924]: E0909 00:14:52.815187 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-j6h4t" podUID="2778c8f3-9140-419e-8f7e-95ecd474a55a" Sep 9 00:14:54.814187 kubelet[2924]: E0909 00:14:54.813861 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-j6h4t" podUID="2778c8f3-9140-419e-8f7e-95ecd474a55a" Sep 9 00:14:55.372556 systemd[1]: cri-containerd-2d3b9c896cb548ae680f173a486ece97c4d6b29b9c2eadda7cf504aad5650347.scope: Deactivated successfully. Sep 9 00:14:55.373049 systemd[1]: cri-containerd-2d3b9c896cb548ae680f173a486ece97c4d6b29b9c2eadda7cf504aad5650347.scope: Consumed 314ms CPU time, 164.7M memory peak, 1M read from disk, 171.3M written to disk. 
Sep 9 00:14:55.426025 containerd[1638]: time="2025-09-09T00:14:55.425991913Z" level=info msg="received exit event container_id:\"2d3b9c896cb548ae680f173a486ece97c4d6b29b9c2eadda7cf504aad5650347\" id:\"2d3b9c896cb548ae680f173a486ece97c4d6b29b9c2eadda7cf504aad5650347\" pid:3674 exited_at:{seconds:1757376895 nanos:425768259}" Sep 9 00:14:55.451113 containerd[1638]: time="2025-09-09T00:14:55.450947138Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2d3b9c896cb548ae680f173a486ece97c4d6b29b9c2eadda7cf504aad5650347\" id:\"2d3b9c896cb548ae680f173a486ece97c4d6b29b9c2eadda7cf504aad5650347\" pid:3674 exited_at:{seconds:1757376895 nanos:425768259}" Sep 9 00:14:55.476401 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2d3b9c896cb548ae680f173a486ece97c4d6b29b9c2eadda7cf504aad5650347-rootfs.mount: Deactivated successfully. Sep 9 00:14:55.497475 kubelet[2924]: I0909 00:14:55.497457 2924 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 9 00:14:55.762303 systemd[1]: Created slice kubepods-burstable-pod37daf8f3_9d85_49c3_ba4d_11feef4e10ab.slice - libcontainer container kubepods-burstable-pod37daf8f3_9d85_49c3_ba4d_11feef4e10ab.slice. Sep 9 00:14:55.767185 systemd[1]: Created slice kubepods-besteffort-pod9b624628_286c_47e3_a778_e3d720ed16a8.slice - libcontainer container kubepods-besteffort-pod9b624628_286c_47e3_a778_e3d720ed16a8.slice. Sep 9 00:14:55.770682 systemd[1]: Created slice kubepods-burstable-podb151ff8e_a88a_495c_a5aa_415fe52c4a10.slice - libcontainer container kubepods-burstable-podb151ff8e_a88a_495c_a5aa_415fe52c4a10.slice. Sep 9 00:14:55.774592 systemd[1]: Created slice kubepods-besteffort-pod466bc793_f266_46d4_b563_49ff42f7b1ae.slice - libcontainer container kubepods-besteffort-pod466bc793_f266_46d4_b563_49ff42f7b1ae.slice. 
Sep 9 00:14:55.799683 kubelet[2924]: W0909 00:14:55.799595 2924 reflector.go:569] object-"calico-system"/"goldmane": failed to list *v1.ConfigMap: configmaps "goldmane" is forbidden: User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'localhost' and this object
Sep 9 00:14:55.800063 kubelet[2924]: W0909 00:14:55.800004 2924 reflector.go:569] object-"calico-system"/"whisker-backend-key-pair": failed to list *v1.Secret: secrets "whisker-backend-key-pair" is forbidden: User "system:node:localhost" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'localhost' and this object
Sep 9 00:14:55.803907 kubelet[2924]: E0909 00:14:55.803884 2924 reflector.go:166] "Unhandled Error" err="object-\"calico-system\"/\"goldmane\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"goldmane\" is forbidden: User \"system:node:localhost\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'localhost' and this object" logger="UnhandledError"
Sep 9 00:14:55.804147 kubelet[2924]: E0909 00:14:55.803996 2924 reflector.go:166] "Unhandled Error" err="object-\"calico-system\"/\"whisker-backend-key-pair\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"whisker-backend-key-pair\" is forbidden: User \"system:node:localhost\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'localhost' and this object" logger="UnhandledError"
Sep 9 00:14:55.804147 kubelet[2924]: W0909 00:14:55.804040 2924 reflector.go:569] object-"calico-system"/"whisker-ca-bundle": failed to list *v1.ConfigMap: configmaps "whisker-ca-bundle" is forbidden: User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'localhost' and this object
Sep 9 00:14:55.804207 kubelet[2924]: E0909 00:14:55.804158 2924 reflector.go:166] "Unhandled Error" err="object-\"calico-system\"/\"whisker-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"whisker-ca-bundle\" is forbidden: User \"system:node:localhost\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'localhost' and this object" logger="UnhandledError"
Sep 9 00:14:55.804207 kubelet[2924]: W0909 00:14:55.804174 2924 reflector.go:569] object-"calico-system"/"goldmane-key-pair": failed to list *v1.Secret: secrets "goldmane-key-pair" is forbidden: User "system:node:localhost" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'localhost' and this object
Sep 9 00:14:55.804207 kubelet[2924]: E0909 00:14:55.804183 2924 reflector.go:166] "Unhandled Error" err="object-\"calico-system\"/\"goldmane-key-pair\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"goldmane-key-pair\" is forbidden: User \"system:node:localhost\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'localhost' and this object" logger="UnhandledError"
Sep 9 00:14:55.804207 kubelet[2924]: W0909 00:14:55.804194 2924 reflector.go:569] object-"calico-system"/"goldmane-ca-bundle": failed to list *v1.ConfigMap: configmaps "goldmane-ca-bundle" is forbidden: User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'localhost' and this object
Sep 9 00:14:55.804207 kubelet[2924]: E0909 00:14:55.804199 2924 reflector.go:166] "Unhandled Error" err="object-\"calico-system\"/\"goldmane-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"goldmane-ca-bundle\" is forbidden: User \"system:node:localhost\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'localhost' and this object" logger="UnhandledError"
Sep 9 00:14:55.804715 kubelet[2924]: W0909 00:14:55.804567 2924 reflector.go:569] object-"calico-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace "calico-apiserver": no relationship found between node 'localhost' and this object
Sep 9 00:14:55.804715 kubelet[2924]: E0909 00:14:55.804580 2924 reflector.go:166] "Unhandled Error" err="object-\"calico-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:localhost\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'localhost' and this object" logger="UnhandledError"
Sep 9 00:14:55.804715 kubelet[2924]: W0909 00:14:55.804625 2924 reflector.go:569] object-"calico-apiserver"/"calico-apiserver-certs": failed to list *v1.Secret: secrets "calico-apiserver-certs" is forbidden: User "system:node:localhost" cannot list resource "secrets" in API group "" in the namespace "calico-apiserver": no relationship found between node 'localhost' and this object
Sep 9 00:14:55.804715 kubelet[2924]: E0909 00:14:55.804633 2924 reflector.go:166] "Unhandled Error" err="object-\"calico-apiserver\"/\"calico-apiserver-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"calico-apiserver-certs\" is forbidden: User \"system:node:localhost\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'localhost' and this object" logger="UnhandledError"
Sep 9 00:14:55.807332 systemd[1]: Created slice kubepods-besteffort-poddab4fb84_134e_402d_a321_1001c52517c9.slice - libcontainer container kubepods-besteffort-poddab4fb84_134e_402d_a321_1001c52517c9.slice.
Sep 9 00:14:55.811175 systemd[1]: Created slice kubepods-besteffort-pod9319c330_61e7_4f2f_810d_926a85cebaee.slice - libcontainer container kubepods-besteffort-pod9319c330_61e7_4f2f_810d_926a85cebaee.slice.
Sep 9 00:14:55.816296 systemd[1]: Created slice kubepods-besteffort-pod962c3eef_2bf8_48c2_ad46_689b3e56ddab.slice - libcontainer container kubepods-besteffort-pod962c3eef_2bf8_48c2_ad46_689b3e56ddab.slice.
Sep 9 00:14:55.818439 kubelet[2924]: I0909 00:14:55.818414 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/37daf8f3-9d85-49c3-ba4d-11feef4e10ab-config-volume\") pod \"coredns-668d6bf9bc-r6qzt\" (UID: \"37daf8f3-9d85-49c3-ba4d-11feef4e10ab\") " pod="kube-system/coredns-668d6bf9bc-r6qzt"
Sep 9 00:14:55.818632 kubelet[2924]: I0909 00:14:55.818445 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6zk6\" (UniqueName: \"kubernetes.io/projected/37daf8f3-9d85-49c3-ba4d-11feef4e10ab-kube-api-access-h6zk6\") pod \"coredns-668d6bf9bc-r6qzt\" (UID: \"37daf8f3-9d85-49c3-ba4d-11feef4e10ab\") " pod="kube-system/coredns-668d6bf9bc-r6qzt"
Sep 9 00:14:55.919760 kubelet[2924]: I0909 00:14:55.919488 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/9319c330-61e7-4f2f-810d-926a85cebaee-calico-apiserver-certs\") pod \"calico-apiserver-7ff777577d-c77vc\" (UID: \"9319c330-61e7-4f2f-810d-926a85cebaee\") " pod="calico-apiserver/calico-apiserver-7ff777577d-c77vc"
Sep 9 00:14:55.919760 kubelet[2924]: I0909 00:14:55.919514 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69bsh\" (UniqueName: \"kubernetes.io/projected/9319c330-61e7-4f2f-810d-926a85cebaee-kube-api-access-69bsh\") pod \"calico-apiserver-7ff777577d-c77vc\" (UID: \"9319c330-61e7-4f2f-810d-926a85cebaee\") " pod="calico-apiserver/calico-apiserver-7ff777577d-c77vc"
Sep 9 00:14:55.919760 kubelet[2924]: I0909 00:14:55.919525 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcsg2\" (UniqueName: \"kubernetes.io/projected/dab4fb84-134e-402d-a321-1001c52517c9-kube-api-access-hcsg2\") pod \"goldmane-54d579b49d-bgldd\" (UID: \"dab4fb84-134e-402d-a321-1001c52517c9\") " pod="calico-system/goldmane-54d579b49d-bgldd"
Sep 9 00:14:55.919760 kubelet[2924]: I0909 00:14:55.919548 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g7sr\" (UniqueName: \"kubernetes.io/projected/9b624628-286c-47e3-a778-e3d720ed16a8-kube-api-access-5g7sr\") pod \"calico-apiserver-7ff777577d-6mmgm\" (UID: \"9b624628-286c-47e3-a778-e3d720ed16a8\") " pod="calico-apiserver/calico-apiserver-7ff777577d-6mmgm"
Sep 9 00:14:55.919760 kubelet[2924]: I0909 00:14:55.919561 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b151ff8e-a88a-495c-a5aa-415fe52c4a10-config-volume\") pod \"coredns-668d6bf9bc-pnv8f\" (UID: \"b151ff8e-a88a-495c-a5aa-415fe52c4a10\") " pod="kube-system/coredns-668d6bf9bc-pnv8f"
Sep 9 00:14:55.925066 kubelet[2924]: I0909 00:14:55.919577 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dab4fb84-134e-402d-a321-1001c52517c9-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-bgldd\" (UID: \"dab4fb84-134e-402d-a321-1001c52517c9\") " pod="calico-system/goldmane-54d579b49d-bgldd"
Sep 9 00:14:55.925066 kubelet[2924]: I0909 00:14:55.920358 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/466bc793-f266-46d4-b563-49ff42f7b1ae-tigera-ca-bundle\") pod \"calico-kube-controllers-5d75874cd-5q9th\" (UID: \"466bc793-f266-46d4-b563-49ff42f7b1ae\") " pod="calico-system/calico-kube-controllers-5d75874cd-5q9th"
Sep 9 00:14:55.927061 kubelet[2924]: I0909 00:14:55.925439 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/962c3eef-2bf8-48c2-ad46-689b3e56ddab-whisker-backend-key-pair\") pod \"whisker-595d9fcb59-jkhmg\" (UID: \"962c3eef-2bf8-48c2-ad46-689b3e56ddab\") " pod="calico-system/whisker-595d9fcb59-jkhmg"
Sep 9 00:14:55.927061 kubelet[2924]: I0909 00:14:55.925481 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/962c3eef-2bf8-48c2-ad46-689b3e56ddab-whisker-ca-bundle\") pod \"whisker-595d9fcb59-jkhmg\" (UID: \"962c3eef-2bf8-48c2-ad46-689b3e56ddab\") " pod="calico-system/whisker-595d9fcb59-jkhmg"
Sep 9 00:14:55.927061 kubelet[2924]: I0909 00:14:55.925504 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/dab4fb84-134e-402d-a321-1001c52517c9-goldmane-key-pair\") pod \"goldmane-54d579b49d-bgldd\" (UID: \"dab4fb84-134e-402d-a321-1001c52517c9\") " pod="calico-system/goldmane-54d579b49d-bgldd"
Sep 9 00:14:55.927061 kubelet[2924]: I0909 00:14:55.925539 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/9b624628-286c-47e3-a778-e3d720ed16a8-calico-apiserver-certs\") pod \"calico-apiserver-7ff777577d-6mmgm\" (UID: \"9b624628-286c-47e3-a778-e3d720ed16a8\") " pod="calico-apiserver/calico-apiserver-7ff777577d-6mmgm"
Sep 9 00:14:55.927061 kubelet[2924]: I0909 00:14:55.925564 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbsgs\" (UniqueName: \"kubernetes.io/projected/b151ff8e-a88a-495c-a5aa-415fe52c4a10-kube-api-access-pbsgs\") pod \"coredns-668d6bf9bc-pnv8f\" (UID: \"b151ff8e-a88a-495c-a5aa-415fe52c4a10\") " pod="kube-system/coredns-668d6bf9bc-pnv8f"
Sep 9 00:14:55.927353 kubelet[2924]: I0909 00:14:55.925580 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr5nb\" (UniqueName: \"kubernetes.io/projected/466bc793-f266-46d4-b563-49ff42f7b1ae-kube-api-access-lr5nb\") pod \"calico-kube-controllers-5d75874cd-5q9th\" (UID: \"466bc793-f266-46d4-b563-49ff42f7b1ae\") " pod="calico-system/calico-kube-controllers-5d75874cd-5q9th"
Sep 9 00:14:55.927353 kubelet[2924]: I0909 00:14:55.925605 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dab4fb84-134e-402d-a321-1001c52517c9-config\") pod \"goldmane-54d579b49d-bgldd\" (UID: \"dab4fb84-134e-402d-a321-1001c52517c9\") " pod="calico-system/goldmane-54d579b49d-bgldd"
Sep 9 00:14:55.927353 kubelet[2924]: I0909 00:14:55.925617 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgzgn\" (UniqueName: \"kubernetes.io/projected/962c3eef-2bf8-48c2-ad46-689b3e56ddab-kube-api-access-mgzgn\") pod \"whisker-595d9fcb59-jkhmg\" (UID: \"962c3eef-2bf8-48c2-ad46-689b3e56ddab\") " pod="calico-system/whisker-595d9fcb59-jkhmg"
Sep 9 00:14:56.024516 containerd[1638]: time="2025-09-09T00:14:56.024350313Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\""
Sep 9 00:14:56.077832 containerd[1638]: time="2025-09-09T00:14:56.077567907Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-r6qzt,Uid:37daf8f3-9d85-49c3-ba4d-11feef4e10ab,Namespace:kube-system,Attempt:0,}"
Sep 9 00:14:56.101755 containerd[1638]: time="2025-09-09T00:14:56.101730718Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5d75874cd-5q9th,Uid:466bc793-f266-46d4-b563-49ff42f7b1ae,Namespace:calico-system,Attempt:0,}"
Sep 9 00:14:56.373001 containerd[1638]: time="2025-09-09T00:14:56.372761084Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-pnv8f,Uid:b151ff8e-a88a-495c-a5aa-415fe52c4a10,Namespace:kube-system,Attempt:0,}"
Sep 9 00:14:56.826861 systemd[1]: Created slice kubepods-besteffort-pod2778c8f3_9140_419e_8f7e_95ecd474a55a.slice - libcontainer container kubepods-besteffort-pod2778c8f3_9140_419e_8f7e_95ecd474a55a.slice.
Sep 9 00:14:56.837509 containerd[1638]: time="2025-09-09T00:14:56.828569226Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-j6h4t,Uid:2778c8f3-9140-419e-8f7e-95ecd474a55a,Namespace:calico-system,Attempt:0,}"
Sep 9 00:14:57.027088 kubelet[2924]: E0909 00:14:57.026950 2924 configmap.go:193] Couldn't get configMap calico-system/whisker-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Sep 9 00:14:57.027088 kubelet[2924]: E0909 00:14:57.026996 2924 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/962c3eef-2bf8-48c2-ad46-689b3e56ddab-whisker-ca-bundle podName:962c3eef-2bf8-48c2-ad46-689b3e56ddab nodeName:}" failed. No retries permitted until 2025-09-09 00:14:57.526983321 +0000 UTC m=+34.805154072 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "whisker-ca-bundle" (UniqueName: "kubernetes.io/configmap/962c3eef-2bf8-48c2-ad46-689b3e56ddab-whisker-ca-bundle") pod "whisker-595d9fcb59-jkhmg" (UID: "962c3eef-2bf8-48c2-ad46-689b3e56ddab") : failed to sync configmap cache: timed out waiting for the condition
Sep 9 00:14:57.027509 kubelet[2924]: E0909 00:14:57.027424 2924 configmap.go:193] Couldn't get configMap calico-system/goldmane-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Sep 9 00:14:57.027509 kubelet[2924]: E0909 00:14:57.027448 2924 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dab4fb84-134e-402d-a321-1001c52517c9-goldmane-ca-bundle podName:dab4fb84-134e-402d-a321-1001c52517c9 nodeName:}" failed. No retries permitted until 2025-09-09 00:14:57.527441744 +0000 UTC m=+34.805612495 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "goldmane-ca-bundle" (UniqueName: "kubernetes.io/configmap/dab4fb84-134e-402d-a321-1001c52517c9-goldmane-ca-bundle") pod "goldmane-54d579b49d-bgldd" (UID: "dab4fb84-134e-402d-a321-1001c52517c9") : failed to sync configmap cache: timed out waiting for the condition
Sep 9 00:14:57.031395 kubelet[2924]: E0909 00:14:57.031304 2924 secret.go:189] Couldn't get secret calico-system/goldmane-key-pair: failed to sync secret cache: timed out waiting for the condition
Sep 9 00:14:57.031395 kubelet[2924]: E0909 00:14:57.031331 2924 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dab4fb84-134e-402d-a321-1001c52517c9-goldmane-key-pair podName:dab4fb84-134e-402d-a321-1001c52517c9 nodeName:}" failed. No retries permitted until 2025-09-09 00:14:57.531324481 +0000 UTC m=+34.809495231 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "goldmane-key-pair" (UniqueName: "kubernetes.io/secret/dab4fb84-134e-402d-a321-1001c52517c9-goldmane-key-pair") pod "goldmane-54d579b49d-bgldd" (UID: "dab4fb84-134e-402d-a321-1001c52517c9") : failed to sync secret cache: timed out waiting for the condition
Sep 9 00:14:57.031395 kubelet[2924]: E0909 00:14:57.031342 2924 configmap.go:193] Couldn't get configMap calico-system/goldmane: failed to sync configmap cache: timed out waiting for the condition
Sep 9 00:14:57.031395 kubelet[2924]: E0909 00:14:57.031357 2924 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dab4fb84-134e-402d-a321-1001c52517c9-config podName:dab4fb84-134e-402d-a321-1001c52517c9 nodeName:}" failed. No retries permitted until 2025-09-09 00:14:57.531352458 +0000 UTC m=+34.809523208 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/dab4fb84-134e-402d-a321-1001c52517c9-config") pod "goldmane-54d579b49d-bgldd" (UID: "dab4fb84-134e-402d-a321-1001c52517c9") : failed to sync configmap cache: timed out waiting for the condition
Sep 9 00:14:57.075623 kubelet[2924]: E0909 00:14:57.075601 2924 projected.go:288] Couldn't get configMap calico-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Sep 9 00:14:57.075623 kubelet[2924]: E0909 00:14:57.075622 2924 projected.go:194] Error preparing data for projected volume kube-api-access-5g7sr for pod calico-apiserver/calico-apiserver-7ff777577d-6mmgm: failed to sync configmap cache: timed out waiting for the condition
Sep 9 00:14:57.075730 kubelet[2924]: E0909 00:14:57.075657 2924 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9b624628-286c-47e3-a778-e3d720ed16a8-kube-api-access-5g7sr podName:9b624628-286c-47e3-a778-e3d720ed16a8 nodeName:}" failed. No retries permitted until 2025-09-09 00:14:57.57564685 +0000 UTC m=+34.853817597 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-5g7sr" (UniqueName: "kubernetes.io/projected/9b624628-286c-47e3-a778-e3d720ed16a8-kube-api-access-5g7sr") pod "calico-apiserver-7ff777577d-6mmgm" (UID: "9b624628-286c-47e3-a778-e3d720ed16a8") : failed to sync configmap cache: timed out waiting for the condition
Sep 9 00:14:57.079680 kubelet[2924]: E0909 00:14:57.077734 2924 projected.go:288] Couldn't get configMap calico-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Sep 9 00:14:57.079680 kubelet[2924]: E0909 00:14:57.077747 2924 projected.go:194] Error preparing data for projected volume kube-api-access-69bsh for pod calico-apiserver/calico-apiserver-7ff777577d-c77vc: failed to sync configmap cache: timed out waiting for the condition
Sep 9 00:14:57.079680 kubelet[2924]: E0909 00:14:57.077767 2924 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9319c330-61e7-4f2f-810d-926a85cebaee-kube-api-access-69bsh podName:9319c330-61e7-4f2f-810d-926a85cebaee nodeName:}" failed. No retries permitted until 2025-09-09 00:14:57.577759323 +0000 UTC m=+34.855930068 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-69bsh" (UniqueName: "kubernetes.io/projected/9319c330-61e7-4f2f-810d-926a85cebaee-kube-api-access-69bsh") pod "calico-apiserver-7ff777577d-c77vc" (UID: "9319c330-61e7-4f2f-810d-926a85cebaee") : failed to sync configmap cache: timed out waiting for the condition
Sep 9 00:14:57.528204 containerd[1638]: time="2025-09-09T00:14:57.528138121Z" level=error msg="Failed to destroy network for sandbox \"20bb2c45a0d2a1af343a065eb9d997e946130ab1061bc0fd1c22d698e33de88e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 00:14:57.528204 containerd[1638]: time="2025-09-09T00:14:57.528870835Z" level=error msg="Failed to destroy network for sandbox \"5cb2095bee73579cd3f53c9ee23f25ecae7eb47eb7baf2a8bb15f2738095642a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 00:14:57.529793 systemd[1]: run-netns-cni\x2d3cc899d5\x2d04b4\x2d0a04\x2d5862\x2d655da6967f29.mount: Deactivated successfully.
Sep 9 00:14:57.533693 systemd[1]: run-netns-cni\x2dd2bea90f\x2d1045\x2d8c28\x2d81f5\x2d9895f67b0d0e.mount: Deactivated successfully.
Sep 9 00:14:57.534898 containerd[1638]: time="2025-09-09T00:14:57.534171636Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-pnv8f,Uid:b151ff8e-a88a-495c-a5aa-415fe52c4a10,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"20bb2c45a0d2a1af343a065eb9d997e946130ab1061bc0fd1c22d698e33de88e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 00:14:57.535460 containerd[1638]: time="2025-09-09T00:14:57.535354928Z" level=error msg="Failed to destroy network for sandbox \"ebaf02baf465abbcfa18916cf76442b9716e90f3d4fc454927dc650583ccc49b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 00:14:57.538511 systemd[1]: run-netns-cni\x2de431a3dd\x2dffb6\x2d5843\x2db85f\x2d1174ca4480b8.mount: Deactivated successfully.
Sep 9 00:14:57.540265 containerd[1638]: time="2025-09-09T00:14:57.540230657Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-r6qzt,Uid:37daf8f3-9d85-49c3-ba4d-11feef4e10ab,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5cb2095bee73579cd3f53c9ee23f25ecae7eb47eb7baf2a8bb15f2738095642a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 00:14:57.542515 containerd[1638]: time="2025-09-09T00:14:57.542361641Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-j6h4t,Uid:2778c8f3-9140-419e-8f7e-95ecd474a55a,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ebaf02baf465abbcfa18916cf76442b9716e90f3d4fc454927dc650583ccc49b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 00:14:57.543860 containerd[1638]: time="2025-09-09T00:14:57.543764579Z" level=error msg="Failed to destroy network for sandbox \"36f96c2e57ba80cbbd7c28f288690c35688578382f80f715432c61e290367b7f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 00:14:57.545320 systemd[1]: run-netns-cni\x2dd8393fca\x2d073d\x2d88e8\x2df0c8\x2db92b7e69ce56.mount: Deactivated successfully.
Sep 9 00:14:57.547259 containerd[1638]: time="2025-09-09T00:14:57.547234841Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5d75874cd-5q9th,Uid:466bc793-f266-46d4-b563-49ff42f7b1ae,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"36f96c2e57ba80cbbd7c28f288690c35688578382f80f715432c61e290367b7f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 00:14:57.549299 kubelet[2924]: E0909 00:14:57.548723 2924 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"20bb2c45a0d2a1af343a065eb9d997e946130ab1061bc0fd1c22d698e33de88e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 00:14:57.551286 kubelet[2924]: E0909 00:14:57.550947 2924 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"20bb2c45a0d2a1af343a065eb9d997e946130ab1061bc0fd1c22d698e33de88e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-pnv8f"
Sep 9 00:14:57.553315 kubelet[2924]: E0909 00:14:57.553298 2924 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"20bb2c45a0d2a1af343a065eb9d997e946130ab1061bc0fd1c22d698e33de88e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-pnv8f"
Sep 9 00:14:57.554878 kubelet[2924]: E0909 00:14:57.554855 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-pnv8f_kube-system(b151ff8e-a88a-495c-a5aa-415fe52c4a10)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-pnv8f_kube-system(b151ff8e-a88a-495c-a5aa-415fe52c4a10)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"20bb2c45a0d2a1af343a065eb9d997e946130ab1061bc0fd1c22d698e33de88e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-pnv8f" podUID="b151ff8e-a88a-495c-a5aa-415fe52c4a10"
Sep 9 00:14:57.562015 kubelet[2924]: E0909 00:14:57.560970 2924 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"36f96c2e57ba80cbbd7c28f288690c35688578382f80f715432c61e290367b7f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 00:14:57.562015 kubelet[2924]: E0909 00:14:57.561018 2924 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"36f96c2e57ba80cbbd7c28f288690c35688578382f80f715432c61e290367b7f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5d75874cd-5q9th"
Sep 9 00:14:57.562015 kubelet[2924]: E0909 00:14:57.561036 2924 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"36f96c2e57ba80cbbd7c28f288690c35688578382f80f715432c61e290367b7f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5d75874cd-5q9th"
Sep 9 00:14:57.562163 kubelet[2924]: E0909 00:14:57.561063 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5d75874cd-5q9th_calico-system(466bc793-f266-46d4-b563-49ff42f7b1ae)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5d75874cd-5q9th_calico-system(466bc793-f266-46d4-b563-49ff42f7b1ae)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"36f96c2e57ba80cbbd7c28f288690c35688578382f80f715432c61e290367b7f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5d75874cd-5q9th" podUID="466bc793-f266-46d4-b563-49ff42f7b1ae"
Sep 9 00:14:57.562163 kubelet[2924]: E0909 00:14:57.561095 2924 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5cb2095bee73579cd3f53c9ee23f25ecae7eb47eb7baf2a8bb15f2738095642a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 00:14:57.562163 kubelet[2924]: E0909 00:14:57.561107 2924 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5cb2095bee73579cd3f53c9ee23f25ecae7eb47eb7baf2a8bb15f2738095642a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-r6qzt"
Sep 9 00:14:57.562253 kubelet[2924]: E0909 00:14:57.561154 2924 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5cb2095bee73579cd3f53c9ee23f25ecae7eb47eb7baf2a8bb15f2738095642a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-r6qzt"
Sep 9 00:14:57.562253 kubelet[2924]: E0909 00:14:57.561178 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-r6qzt_kube-system(37daf8f3-9d85-49c3-ba4d-11feef4e10ab)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-r6qzt_kube-system(37daf8f3-9d85-49c3-ba4d-11feef4e10ab)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5cb2095bee73579cd3f53c9ee23f25ecae7eb47eb7baf2a8bb15f2738095642a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-r6qzt" podUID="37daf8f3-9d85-49c3-ba4d-11feef4e10ab"
Sep 9 00:14:57.562253 kubelet[2924]: E0909 00:14:57.561455 2924 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ebaf02baf465abbcfa18916cf76442b9716e90f3d4fc454927dc650583ccc49b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 00:14:57.562323 kubelet[2924]: E0909 00:14:57.561470 2924 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ebaf02baf465abbcfa18916cf76442b9716e90f3d4fc454927dc650583ccc49b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-j6h4t"
Sep 9 00:14:57.562323 kubelet[2924]: E0909 00:14:57.561479 2924 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ebaf02baf465abbcfa18916cf76442b9716e90f3d4fc454927dc650583ccc49b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-j6h4t"
Sep 9 00:14:57.562323 kubelet[2924]: E0909 00:14:57.561500 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-j6h4t_calico-system(2778c8f3-9140-419e-8f7e-95ecd474a55a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-j6h4t_calico-system(2778c8f3-9140-419e-8f7e-95ecd474a55a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ebaf02baf465abbcfa18916cf76442b9716e90f3d4fc454927dc650583ccc49b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-j6h4t" podUID="2778c8f3-9140-419e-8f7e-95ecd474a55a"
Sep 9 00:14:57.617691 containerd[1638]: time="2025-09-09T00:14:57.617668466Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-bgldd,Uid:dab4fb84-134e-402d-a321-1001c52517c9,Namespace:calico-system,Attempt:0,}"
Sep 9 00:14:57.619243 containerd[1638]: time="2025-09-09T00:14:57.619215016Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-595d9fcb59-jkhmg,Uid:962c3eef-2bf8-48c2-ad46-689b3e56ddab,Namespace:calico-system,Attempt:0,}"
Sep 9 00:14:57.675822 containerd[1638]: time="2025-09-09T00:14:57.675755408Z" level=error msg="Failed to destroy network for sandbox \"8e8b690abb210d8f228177bbb1ecaae2ac95acee844d7efd4677e08fedd73bf9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 00:14:57.676092 containerd[1638]: time="2025-09-09T00:14:57.676067629Z" level=error msg="Failed to destroy network for sandbox \"2120c3fc4fb974898523ad4b32ac6255247cf253bcef735c1e33473eb3a01b25\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 00:14:57.676636 containerd[1638]: time="2025-09-09T00:14:57.676616834Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-bgldd,Uid:dab4fb84-134e-402d-a321-1001c52517c9,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2120c3fc4fb974898523ad4b32ac6255247cf253bcef735c1e33473eb3a01b25\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 00:14:57.676857 kubelet[2924]: E0909 00:14:57.676809 2924 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2120c3fc4fb974898523ad4b32ac6255247cf253bcef735c1e33473eb3a01b25\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 00:14:57.676900 kubelet[2924]: E0909 00:14:57.676875 2924
kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2120c3fc4fb974898523ad4b32ac6255247cf253bcef735c1e33473eb3a01b25\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-bgldd" Sep 9 00:14:57.676900 kubelet[2924]: E0909 00:14:57.676891 2924 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2120c3fc4fb974898523ad4b32ac6255247cf253bcef735c1e33473eb3a01b25\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-bgldd" Sep 9 00:14:57.676956 kubelet[2924]: E0909 00:14:57.676919 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-bgldd_calico-system(dab4fb84-134e-402d-a321-1001c52517c9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-bgldd_calico-system(dab4fb84-134e-402d-a321-1001c52517c9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2120c3fc4fb974898523ad4b32ac6255247cf253bcef735c1e33473eb3a01b25\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-bgldd" podUID="dab4fb84-134e-402d-a321-1001c52517c9" Sep 9 00:14:57.677396 containerd[1638]: time="2025-09-09T00:14:57.677352472Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-595d9fcb59-jkhmg,Uid:962c3eef-2bf8-48c2-ad46-689b3e56ddab,Namespace:calico-system,Attempt:0,} failed, error" error="rpc 
error: code = Unknown desc = failed to setup network for sandbox \"8e8b690abb210d8f228177bbb1ecaae2ac95acee844d7efd4677e08fedd73bf9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 00:14:57.677538 kubelet[2924]: E0909 00:14:57.677467 2924 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e8b690abb210d8f228177bbb1ecaae2ac95acee844d7efd4677e08fedd73bf9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 00:14:57.677538 kubelet[2924]: E0909 00:14:57.677486 2924 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e8b690abb210d8f228177bbb1ecaae2ac95acee844d7efd4677e08fedd73bf9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-595d9fcb59-jkhmg" Sep 9 00:14:57.677759 kubelet[2924]: E0909 00:14:57.677640 2924 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e8b690abb210d8f228177bbb1ecaae2ac95acee844d7efd4677e08fedd73bf9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-595d9fcb59-jkhmg" Sep 9 00:14:57.677759 kubelet[2924]: E0909 00:14:57.677681 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-595d9fcb59-jkhmg_calico-system(962c3eef-2bf8-48c2-ad46-689b3e56ddab)\" with 
CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-595d9fcb59-jkhmg_calico-system(962c3eef-2bf8-48c2-ad46-689b3e56ddab)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8e8b690abb210d8f228177bbb1ecaae2ac95acee844d7efd4677e08fedd73bf9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-595d9fcb59-jkhmg" podUID="962c3eef-2bf8-48c2-ad46-689b3e56ddab" Sep 9 00:14:57.869414 containerd[1638]: time="2025-09-09T00:14:57.869343165Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7ff777577d-6mmgm,Uid:9b624628-286c-47e3-a778-e3d720ed16a8,Namespace:calico-apiserver,Attempt:0,}" Sep 9 00:14:57.902971 containerd[1638]: time="2025-09-09T00:14:57.902862494Z" level=error msg="Failed to destroy network for sandbox \"a29519e5e6452323e8c7faf6567572f9d00550ee63a9f746c52810b30e17dfdd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 00:14:57.904183 containerd[1638]: time="2025-09-09T00:14:57.904159333Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7ff777577d-6mmgm,Uid:9b624628-286c-47e3-a778-e3d720ed16a8,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a29519e5e6452323e8c7faf6567572f9d00550ee63a9f746c52810b30e17dfdd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 00:14:57.904441 kubelet[2924]: E0909 00:14:57.904422 2924 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"a29519e5e6452323e8c7faf6567572f9d00550ee63a9f746c52810b30e17dfdd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 00:14:57.904527 kubelet[2924]: E0909 00:14:57.904517 2924 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a29519e5e6452323e8c7faf6567572f9d00550ee63a9f746c52810b30e17dfdd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7ff777577d-6mmgm" Sep 9 00:14:57.904593 kubelet[2924]: E0909 00:14:57.904576 2924 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a29519e5e6452323e8c7faf6567572f9d00550ee63a9f746c52810b30e17dfdd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7ff777577d-6mmgm" Sep 9 00:14:57.905167 kubelet[2924]: E0909 00:14:57.904667 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7ff777577d-6mmgm_calico-apiserver(9b624628-286c-47e3-a778-e3d720ed16a8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7ff777577d-6mmgm_calico-apiserver(9b624628-286c-47e3-a778-e3d720ed16a8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a29519e5e6452323e8c7faf6567572f9d00550ee63a9f746c52810b30e17dfdd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-apiserver/calico-apiserver-7ff777577d-6mmgm" podUID="9b624628-286c-47e3-a778-e3d720ed16a8" Sep 9 00:14:57.915230 containerd[1638]: time="2025-09-09T00:14:57.915194882Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7ff777577d-c77vc,Uid:9319c330-61e7-4f2f-810d-926a85cebaee,Namespace:calico-apiserver,Attempt:0,}" Sep 9 00:14:57.960476 containerd[1638]: time="2025-09-09T00:14:57.960394940Z" level=error msg="Failed to destroy network for sandbox \"7dc601d33768a6f44ab60bb920f0873c02213e93f6d5ae59004baf6af5fd6e70\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 00:14:57.961202 containerd[1638]: time="2025-09-09T00:14:57.961142297Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7ff777577d-c77vc,Uid:9319c330-61e7-4f2f-810d-926a85cebaee,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7dc601d33768a6f44ab60bb920f0873c02213e93f6d5ae59004baf6af5fd6e70\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 00:14:57.961319 kubelet[2924]: E0909 00:14:57.961272 2924 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7dc601d33768a6f44ab60bb920f0873c02213e93f6d5ae59004baf6af5fd6e70\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 00:14:57.961319 kubelet[2924]: E0909 00:14:57.961311 2924 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"7dc601d33768a6f44ab60bb920f0873c02213e93f6d5ae59004baf6af5fd6e70\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7ff777577d-c77vc" Sep 9 00:14:57.961453 kubelet[2924]: E0909 00:14:57.961332 2924 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7dc601d33768a6f44ab60bb920f0873c02213e93f6d5ae59004baf6af5fd6e70\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7ff777577d-c77vc" Sep 9 00:14:57.961453 kubelet[2924]: E0909 00:14:57.961362 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7ff777577d-c77vc_calico-apiserver(9319c330-61e7-4f2f-810d-926a85cebaee)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7ff777577d-c77vc_calico-apiserver(9319c330-61e7-4f2f-810d-926a85cebaee)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7dc601d33768a6f44ab60bb920f0873c02213e93f6d5ae59004baf6af5fd6e70\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7ff777577d-c77vc" podUID="9319c330-61e7-4f2f-810d-926a85cebaee" Sep 9 00:15:02.591465 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount791665583.mount: Deactivated successfully. 
Sep 9 00:15:02.770096 containerd[1638]: time="2025-09-09T00:15:02.733212449Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 9 00:15:02.781026 containerd[1638]: time="2025-09-09T00:15:02.780861581Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 00:15:02.781026 containerd[1638]: time="2025-09-09T00:15:02.780935589Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 6.755920468s" Sep 9 00:15:02.781026 containerd[1638]: time="2025-09-09T00:15:02.780971963Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 9 00:15:02.789619 containerd[1638]: time="2025-09-09T00:15:02.789596851Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 00:15:02.791247 containerd[1638]: time="2025-09-09T00:15:02.791231715Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 00:15:02.802585 containerd[1638]: time="2025-09-09T00:15:02.802527698Z" level=info msg="CreateContainer within sandbox \"5900bada00b7e1b4e65700bfcbf620a4cf384cbb50a9e049fc8fe99f3bab0815\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 9 00:15:02.852735 containerd[1638]: time="2025-09-09T00:15:02.851220026Z" level=info msg="Container 
ed856c5278975b91878aa1a3763113b293eff4aeae425249cee0bb1fe45ef8eb: CDI devices from CRI Config.CDIDevices: []" Sep 9 00:15:02.855125 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1948503192.mount: Deactivated successfully. Sep 9 00:15:02.890335 containerd[1638]: time="2025-09-09T00:15:02.890304067Z" level=info msg="CreateContainer within sandbox \"5900bada00b7e1b4e65700bfcbf620a4cf384cbb50a9e049fc8fe99f3bab0815\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"ed856c5278975b91878aa1a3763113b293eff4aeae425249cee0bb1fe45ef8eb\"" Sep 9 00:15:02.891484 containerd[1638]: time="2025-09-09T00:15:02.891428295Z" level=info msg="StartContainer for \"ed856c5278975b91878aa1a3763113b293eff4aeae425249cee0bb1fe45ef8eb\"" Sep 9 00:15:02.898456 containerd[1638]: time="2025-09-09T00:15:02.898439665Z" level=info msg="connecting to shim ed856c5278975b91878aa1a3763113b293eff4aeae425249cee0bb1fe45ef8eb" address="unix:///run/containerd/s/dd1eee1ab0f1cd232cf7c7dc576d74e286f8ae217287d60b4ef0d4b5ab096b45" protocol=ttrpc version=3 Sep 9 00:15:03.012207 systemd[1]: Started cri-containerd-ed856c5278975b91878aa1a3763113b293eff4aeae425249cee0bb1fe45ef8eb.scope - libcontainer container ed856c5278975b91878aa1a3763113b293eff4aeae425249cee0bb1fe45ef8eb. Sep 9 00:15:03.054917 containerd[1638]: time="2025-09-09T00:15:03.054893301Z" level=info msg="StartContainer for \"ed856c5278975b91878aa1a3763113b293eff4aeae425249cee0bb1fe45ef8eb\" returns successfully" Sep 9 00:15:03.208775 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 9 00:15:03.210961 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Sep 9 00:15:03.990912 kubelet[2924]: I0909 00:15:03.990819 2924 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgzgn\" (UniqueName: \"kubernetes.io/projected/962c3eef-2bf8-48c2-ad46-689b3e56ddab-kube-api-access-mgzgn\") pod \"962c3eef-2bf8-48c2-ad46-689b3e56ddab\" (UID: \"962c3eef-2bf8-48c2-ad46-689b3e56ddab\") " Sep 9 00:15:03.990912 kubelet[2924]: I0909 00:15:03.990865 2924 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/962c3eef-2bf8-48c2-ad46-689b3e56ddab-whisker-backend-key-pair\") pod \"962c3eef-2bf8-48c2-ad46-689b3e56ddab\" (UID: \"962c3eef-2bf8-48c2-ad46-689b3e56ddab\") " Sep 9 00:15:03.990912 kubelet[2924]: I0909 00:15:03.990889 2924 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/962c3eef-2bf8-48c2-ad46-689b3e56ddab-whisker-ca-bundle\") pod \"962c3eef-2bf8-48c2-ad46-689b3e56ddab\" (UID: \"962c3eef-2bf8-48c2-ad46-689b3e56ddab\") " Sep 9 00:15:03.991512 kubelet[2924]: I0909 00:15:03.991236 2924 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/962c3eef-2bf8-48c2-ad46-689b3e56ddab-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "962c3eef-2bf8-48c2-ad46-689b3e56ddab" (UID: "962c3eef-2bf8-48c2-ad46-689b3e56ddab"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 9 00:15:04.003183 systemd[1]: var-lib-kubelet-pods-962c3eef\x2d2bf8\x2d48c2\x2dad46\x2d689b3e56ddab-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 9 00:15:04.003566 systemd[1]: var-lib-kubelet-pods-962c3eef\x2d2bf8\x2d48c2\x2dad46\x2d689b3e56ddab-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dmgzgn.mount: Deactivated successfully. 
Sep 9 00:15:04.006301 kubelet[2924]: I0909 00:15:04.003451 2924 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/962c3eef-2bf8-48c2-ad46-689b3e56ddab-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "962c3eef-2bf8-48c2-ad46-689b3e56ddab" (UID: "962c3eef-2bf8-48c2-ad46-689b3e56ddab"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 9 00:15:04.006496 kubelet[2924]: I0909 00:15:04.006475 2924 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/962c3eef-2bf8-48c2-ad46-689b3e56ddab-kube-api-access-mgzgn" (OuterVolumeSpecName: "kube-api-access-mgzgn") pod "962c3eef-2bf8-48c2-ad46-689b3e56ddab" (UID: "962c3eef-2bf8-48c2-ad46-689b3e56ddab"). InnerVolumeSpecName "kube-api-access-mgzgn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 9 00:15:04.063291 systemd[1]: Removed slice kubepods-besteffort-pod962c3eef_2bf8_48c2_ad46_689b3e56ddab.slice - libcontainer container kubepods-besteffort-pod962c3eef_2bf8_48c2_ad46_689b3e56ddab.slice. 
Sep 9 00:15:04.092141 kubelet[2924]: I0909 00:15:04.091315 2924 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mgzgn\" (UniqueName: \"kubernetes.io/projected/962c3eef-2bf8-48c2-ad46-689b3e56ddab-kube-api-access-mgzgn\") on node \"localhost\" DevicePath \"\"" Sep 9 00:15:04.092141 kubelet[2924]: I0909 00:15:04.091332 2924 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/962c3eef-2bf8-48c2-ad46-689b3e56ddab-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Sep 9 00:15:04.092141 kubelet[2924]: I0909 00:15:04.091338 2924 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/962c3eef-2bf8-48c2-ad46-689b3e56ddab-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Sep 9 00:15:04.093047 kubelet[2924]: I0909 00:15:04.092491 2924 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-8nxsr" podStartSLOduration=2.78836265 podStartE2EDuration="24.092481264s" podCreationTimestamp="2025-09-09 00:14:40 +0000 UTC" firstStartedPulling="2025-09-09 00:14:41.485688238 +0000 UTC m=+18.763858989" lastFinishedPulling="2025-09-09 00:15:02.789806858 +0000 UTC m=+40.067977603" observedRunningTime="2025-09-09 00:15:04.07789087 +0000 UTC m=+41.356061628" watchObservedRunningTime="2025-09-09 00:15:04.092481264 +0000 UTC m=+41.370652016" Sep 9 00:15:04.174152 systemd[1]: Created slice kubepods-besteffort-pod237c7fbc_2d46_4bec_895d_6628ad1eb857.slice - libcontainer container kubepods-besteffort-pod237c7fbc_2d46_4bec_895d_6628ad1eb857.slice. 
Sep 9 00:15:04.301681 kubelet[2924]: I0909 00:15:04.301647 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/237c7fbc-2d46-4bec-895d-6628ad1eb857-whisker-backend-key-pair\") pod \"whisker-5dd699984b-dm7v6\" (UID: \"237c7fbc-2d46-4bec-895d-6628ad1eb857\") " pod="calico-system/whisker-5dd699984b-dm7v6" Sep 9 00:15:04.301681 kubelet[2924]: I0909 00:15:04.301681 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/237c7fbc-2d46-4bec-895d-6628ad1eb857-whisker-ca-bundle\") pod \"whisker-5dd699984b-dm7v6\" (UID: \"237c7fbc-2d46-4bec-895d-6628ad1eb857\") " pod="calico-system/whisker-5dd699984b-dm7v6" Sep 9 00:15:04.301829 kubelet[2924]: I0909 00:15:04.301697 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw8s5\" (UniqueName: \"kubernetes.io/projected/237c7fbc-2d46-4bec-895d-6628ad1eb857-kube-api-access-mw8s5\") pod \"whisker-5dd699984b-dm7v6\" (UID: \"237c7fbc-2d46-4bec-895d-6628ad1eb857\") " pod="calico-system/whisker-5dd699984b-dm7v6" Sep 9 00:15:04.346918 containerd[1638]: time="2025-09-09T00:15:04.346891815Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed856c5278975b91878aa1a3763113b293eff4aeae425249cee0bb1fe45ef8eb\" id:\"0e35aa48a5e797fcdd29a47c31788b37ab8d5c8ed66f615df9e2daffd844cf75\" pid:4006 exit_status:1 exited_at:{seconds:1757376904 nanos:346668941}" Sep 9 00:15:04.477375 containerd[1638]: time="2025-09-09T00:15:04.477281847Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5dd699984b-dm7v6,Uid:237c7fbc-2d46-4bec-895d-6628ad1eb857,Namespace:calico-system,Attempt:0,}" Sep 9 00:15:04.816352 kubelet[2924]: I0909 00:15:04.816239 2924 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="962c3eef-2bf8-48c2-ad46-689b3e56ddab" path="/var/lib/kubelet/pods/962c3eef-2bf8-48c2-ad46-689b3e56ddab/volumes" Sep 9 00:15:04.995376 systemd-networkd[1517]: calif9095c4b42c: Link UP Sep 9 00:15:04.995980 systemd-networkd[1517]: calif9095c4b42c: Gained carrier Sep 9 00:15:05.005343 containerd[1638]: 2025-09-09 00:15:04.505 [INFO][4029] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 00:15:05.005343 containerd[1638]: 2025-09-09 00:15:04.623 [INFO][4029] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--5dd699984b--dm7v6-eth0 whisker-5dd699984b- calico-system 237c7fbc-2d46-4bec-895d-6628ad1eb857 878 0 2025-09-09 00:15:04 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:5dd699984b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-5dd699984b-dm7v6 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calif9095c4b42c [] [] }} ContainerID="bc60aa8c8596cc7dbae054d90b890ce86747cad608aa59e04c5f7513f0aded31" Namespace="calico-system" Pod="whisker-5dd699984b-dm7v6" WorkloadEndpoint="localhost-k8s-whisker--5dd699984b--dm7v6-" Sep 9 00:15:05.005343 containerd[1638]: 2025-09-09 00:15:04.623 [INFO][4029] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bc60aa8c8596cc7dbae054d90b890ce86747cad608aa59e04c5f7513f0aded31" Namespace="calico-system" Pod="whisker-5dd699984b-dm7v6" WorkloadEndpoint="localhost-k8s-whisker--5dd699984b--dm7v6-eth0" Sep 9 00:15:05.005343 containerd[1638]: 2025-09-09 00:15:04.927 [INFO][4037] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bc60aa8c8596cc7dbae054d90b890ce86747cad608aa59e04c5f7513f0aded31" HandleID="k8s-pod-network.bc60aa8c8596cc7dbae054d90b890ce86747cad608aa59e04c5f7513f0aded31" Workload="localhost-k8s-whisker--5dd699984b--dm7v6-eth0" 
Sep 9 00:15:05.005513 containerd[1638]: 2025-09-09 00:15:04.930 [INFO][4037] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="bc60aa8c8596cc7dbae054d90b890ce86747cad608aa59e04c5f7513f0aded31" HandleID="k8s-pod-network.bc60aa8c8596cc7dbae054d90b890ce86747cad608aa59e04c5f7513f0aded31" Workload="localhost-k8s-whisker--5dd699984b--dm7v6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004fd60), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-5dd699984b-dm7v6", "timestamp":"2025-09-09 00:15:04.927586659 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 00:15:05.005513 containerd[1638]: 2025-09-09 00:15:04.930 [INFO][4037] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 00:15:05.005513 containerd[1638]: 2025-09-09 00:15:04.931 [INFO][4037] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 00:15:05.005513 containerd[1638]: 2025-09-09 00:15:04.932 [INFO][4037] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 00:15:05.005513 containerd[1638]: 2025-09-09 00:15:04.952 [INFO][4037] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.bc60aa8c8596cc7dbae054d90b890ce86747cad608aa59e04c5f7513f0aded31" host="localhost" Sep 9 00:15:05.005513 containerd[1638]: 2025-09-09 00:15:04.965 [INFO][4037] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 00:15:05.005513 containerd[1638]: 2025-09-09 00:15:04.968 [INFO][4037] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 00:15:05.005513 containerd[1638]: 2025-09-09 00:15:04.969 [INFO][4037] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 00:15:05.005513 containerd[1638]: 2025-09-09 00:15:04.970 [INFO][4037] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 00:15:05.005513 containerd[1638]: 2025-09-09 00:15:04.970 [INFO][4037] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.bc60aa8c8596cc7dbae054d90b890ce86747cad608aa59e04c5f7513f0aded31" host="localhost" Sep 9 00:15:05.005718 containerd[1638]: 2025-09-09 00:15:04.971 [INFO][4037] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.bc60aa8c8596cc7dbae054d90b890ce86747cad608aa59e04c5f7513f0aded31 Sep 9 00:15:05.005718 containerd[1638]: 2025-09-09 00:15:04.974 [INFO][4037] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.bc60aa8c8596cc7dbae054d90b890ce86747cad608aa59e04c5f7513f0aded31" host="localhost" Sep 9 00:15:05.005718 containerd[1638]: 2025-09-09 00:15:04.977 [INFO][4037] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 
handle="k8s-pod-network.bc60aa8c8596cc7dbae054d90b890ce86747cad608aa59e04c5f7513f0aded31" host="localhost" Sep 9 00:15:05.005718 containerd[1638]: 2025-09-09 00:15:04.977 [INFO][4037] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.bc60aa8c8596cc7dbae054d90b890ce86747cad608aa59e04c5f7513f0aded31" host="localhost" Sep 9 00:15:05.005718 containerd[1638]: 2025-09-09 00:15:04.977 [INFO][4037] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 00:15:05.005718 containerd[1638]: 2025-09-09 00:15:04.977 [INFO][4037] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="bc60aa8c8596cc7dbae054d90b890ce86747cad608aa59e04c5f7513f0aded31" HandleID="k8s-pod-network.bc60aa8c8596cc7dbae054d90b890ce86747cad608aa59e04c5f7513f0aded31" Workload="localhost-k8s-whisker--5dd699984b--dm7v6-eth0" Sep 9 00:15:05.005809 containerd[1638]: 2025-09-09 00:15:04.978 [INFO][4029] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bc60aa8c8596cc7dbae054d90b890ce86747cad608aa59e04c5f7513f0aded31" Namespace="calico-system" Pod="whisker-5dd699984b-dm7v6" WorkloadEndpoint="localhost-k8s-whisker--5dd699984b--dm7v6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--5dd699984b--dm7v6-eth0", GenerateName:"whisker-5dd699984b-", Namespace:"calico-system", SelfLink:"", UID:"237c7fbc-2d46-4bec-895d-6628ad1eb857", ResourceVersion:"878", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 0, 15, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5dd699984b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-5dd699984b-dm7v6", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calif9095c4b42c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 00:15:05.005809 containerd[1638]: 2025-09-09 00:15:04.979 [INFO][4029] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="bc60aa8c8596cc7dbae054d90b890ce86747cad608aa59e04c5f7513f0aded31" Namespace="calico-system" Pod="whisker-5dd699984b-dm7v6" WorkloadEndpoint="localhost-k8s-whisker--5dd699984b--dm7v6-eth0" Sep 9 00:15:05.005874 containerd[1638]: 2025-09-09 00:15:04.979 [INFO][4029] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif9095c4b42c ContainerID="bc60aa8c8596cc7dbae054d90b890ce86747cad608aa59e04c5f7513f0aded31" Namespace="calico-system" Pod="whisker-5dd699984b-dm7v6" WorkloadEndpoint="localhost-k8s-whisker--5dd699984b--dm7v6-eth0" Sep 9 00:15:05.005874 containerd[1638]: 2025-09-09 00:15:04.994 [INFO][4029] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bc60aa8c8596cc7dbae054d90b890ce86747cad608aa59e04c5f7513f0aded31" Namespace="calico-system" Pod="whisker-5dd699984b-dm7v6" WorkloadEndpoint="localhost-k8s-whisker--5dd699984b--dm7v6-eth0" Sep 9 00:15:05.005908 containerd[1638]: 2025-09-09 00:15:04.995 [INFO][4029] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="bc60aa8c8596cc7dbae054d90b890ce86747cad608aa59e04c5f7513f0aded31" Namespace="calico-system" Pod="whisker-5dd699984b-dm7v6" 
WorkloadEndpoint="localhost-k8s-whisker--5dd699984b--dm7v6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--5dd699984b--dm7v6-eth0", GenerateName:"whisker-5dd699984b-", Namespace:"calico-system", SelfLink:"", UID:"237c7fbc-2d46-4bec-895d-6628ad1eb857", ResourceVersion:"878", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 0, 15, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5dd699984b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"bc60aa8c8596cc7dbae054d90b890ce86747cad608aa59e04c5f7513f0aded31", Pod:"whisker-5dd699984b-dm7v6", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calif9095c4b42c", MAC:"12:20:9e:52:bf:05", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 00:15:05.005946 containerd[1638]: 2025-09-09 00:15:05.001 [INFO][4029] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="bc60aa8c8596cc7dbae054d90b890ce86747cad608aa59e04c5f7513f0aded31" Namespace="calico-system" Pod="whisker-5dd699984b-dm7v6" WorkloadEndpoint="localhost-k8s-whisker--5dd699984b--dm7v6-eth0" Sep 9 00:15:05.163655 containerd[1638]: time="2025-09-09T00:15:05.163411367Z" level=info msg="connecting to shim 
bc60aa8c8596cc7dbae054d90b890ce86747cad608aa59e04c5f7513f0aded31" address="unix:///run/containerd/s/eeb556bc9f0d641cf5c4d3bac2cd5209f3dee61c92a56f347295ea68dac72d42" namespace=k8s.io protocol=ttrpc version=3 Sep 9 00:15:05.189446 containerd[1638]: time="2025-09-09T00:15:05.189370667Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed856c5278975b91878aa1a3763113b293eff4aeae425249cee0bb1fe45ef8eb\" id:\"7d2af8f0bfe57919327eab471a02d6ec9268251251979350b57b9fcc3d914cbe\" pid:4184 exit_status:1 exited_at:{seconds:1757376905 nanos:189197098}" Sep 9 00:15:05.202240 systemd[1]: Started cri-containerd-bc60aa8c8596cc7dbae054d90b890ce86747cad608aa59e04c5f7513f0aded31.scope - libcontainer container bc60aa8c8596cc7dbae054d90b890ce86747cad608aa59e04c5f7513f0aded31. Sep 9 00:15:05.216622 systemd-resolved[1518]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 00:15:05.269261 containerd[1638]: time="2025-09-09T00:15:05.269233242Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5dd699984b-dm7v6,Uid:237c7fbc-2d46-4bec-895d-6628ad1eb857,Namespace:calico-system,Attempt:0,} returns sandbox id \"bc60aa8c8596cc7dbae054d90b890ce86747cad608aa59e04c5f7513f0aded31\"" Sep 9 00:15:05.298820 containerd[1638]: time="2025-09-09T00:15:05.298799424Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 9 00:15:05.352683 systemd-networkd[1517]: vxlan.calico: Link UP Sep 9 00:15:05.352688 systemd-networkd[1517]: vxlan.calico: Gained carrier Sep 9 00:15:06.335231 systemd-networkd[1517]: calif9095c4b42c: Gained IPv6LL Sep 9 00:15:06.847264 systemd-networkd[1517]: vxlan.calico: Gained IPv6LL Sep 9 00:15:06.915771 containerd[1638]: time="2025-09-09T00:15:06.915742046Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 00:15:06.916367 containerd[1638]: time="2025-09-09T00:15:06.916320367Z" 
level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 9 00:15:06.916813 containerd[1638]: time="2025-09-09T00:15:06.916743613Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 00:15:06.919682 containerd[1638]: time="2025-09-09T00:15:06.919656596Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 00:15:06.919915 containerd[1638]: time="2025-09-09T00:15:06.919895057Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 1.621074182s" Sep 9 00:15:06.920025 containerd[1638]: time="2025-09-09T00:15:06.919970826Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 9 00:15:06.922698 containerd[1638]: time="2025-09-09T00:15:06.922655703Z" level=info msg="CreateContainer within sandbox \"bc60aa8c8596cc7dbae054d90b890ce86747cad608aa59e04c5f7513f0aded31\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 9 00:15:06.927998 containerd[1638]: time="2025-09-09T00:15:06.927971722Z" level=info msg="Container ca5d45ec6e722be9d533a306320e31ffcef816e1c7d77dc8c12cdf7937052788: CDI devices from CRI Config.CDIDevices: []" Sep 9 00:15:06.930496 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2513847182.mount: Deactivated successfully. 
Sep 9 00:15:06.936108 containerd[1638]: time="2025-09-09T00:15:06.932517120Z" level=info msg="CreateContainer within sandbox \"bc60aa8c8596cc7dbae054d90b890ce86747cad608aa59e04c5f7513f0aded31\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"ca5d45ec6e722be9d533a306320e31ffcef816e1c7d77dc8c12cdf7937052788\"" Sep 9 00:15:06.936108 containerd[1638]: time="2025-09-09T00:15:06.932917227Z" level=info msg="StartContainer for \"ca5d45ec6e722be9d533a306320e31ffcef816e1c7d77dc8c12cdf7937052788\"" Sep 9 00:15:06.936108 containerd[1638]: time="2025-09-09T00:15:06.933754606Z" level=info msg="connecting to shim ca5d45ec6e722be9d533a306320e31ffcef816e1c7d77dc8c12cdf7937052788" address="unix:///run/containerd/s/eeb556bc9f0d641cf5c4d3bac2cd5209f3dee61c92a56f347295ea68dac72d42" protocol=ttrpc version=3 Sep 9 00:15:06.958279 systemd[1]: Started cri-containerd-ca5d45ec6e722be9d533a306320e31ffcef816e1c7d77dc8c12cdf7937052788.scope - libcontainer container ca5d45ec6e722be9d533a306320e31ffcef816e1c7d77dc8c12cdf7937052788. 
Sep 9 00:15:07.002059 containerd[1638]: time="2025-09-09T00:15:07.002039439Z" level=info msg="StartContainer for \"ca5d45ec6e722be9d533a306320e31ffcef816e1c7d77dc8c12cdf7937052788\" returns successfully" Sep 9 00:15:07.002858 containerd[1638]: time="2025-09-09T00:15:07.002802279Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 9 00:15:08.814797 containerd[1638]: time="2025-09-09T00:15:08.814714573Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-pnv8f,Uid:b151ff8e-a88a-495c-a5aa-415fe52c4a10,Namespace:kube-system,Attempt:0,}" Sep 9 00:15:08.814797 containerd[1638]: time="2025-09-09T00:15:08.814745837Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7ff777577d-6mmgm,Uid:9b624628-286c-47e3-a778-e3d720ed16a8,Namespace:calico-apiserver,Attempt:0,}" Sep 9 00:15:09.002618 systemd-networkd[1517]: calif9ed7c8a7c3: Link UP Sep 9 00:15:09.004498 systemd-networkd[1517]: calif9ed7c8a7c3: Gained carrier Sep 9 00:15:09.042292 containerd[1638]: 2025-09-09 00:15:08.914 [INFO][4358] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--7ff777577d--6mmgm-eth0 calico-apiserver-7ff777577d- calico-apiserver 9b624628-286c-47e3-a778-e3d720ed16a8 799 0 2025-09-09 00:14:38 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7ff777577d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-7ff777577d-6mmgm eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calif9ed7c8a7c3 [] [] }} ContainerID="92323a44d18bad275a0ff0e72a00b6bf64e7f2bf5fd18c647d7ee2a473bffa51" Namespace="calico-apiserver" Pod="calico-apiserver-7ff777577d-6mmgm" WorkloadEndpoint="localhost-k8s-calico--apiserver--7ff777577d--6mmgm-" 
Sep 9 00:15:09.042292 containerd[1638]: 2025-09-09 00:15:08.914 [INFO][4358] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="92323a44d18bad275a0ff0e72a00b6bf64e7f2bf5fd18c647d7ee2a473bffa51" Namespace="calico-apiserver" Pod="calico-apiserver-7ff777577d-6mmgm" WorkloadEndpoint="localhost-k8s-calico--apiserver--7ff777577d--6mmgm-eth0" Sep 9 00:15:09.042292 containerd[1638]: 2025-09-09 00:15:08.943 [INFO][4385] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="92323a44d18bad275a0ff0e72a00b6bf64e7f2bf5fd18c647d7ee2a473bffa51" HandleID="k8s-pod-network.92323a44d18bad275a0ff0e72a00b6bf64e7f2bf5fd18c647d7ee2a473bffa51" Workload="localhost-k8s-calico--apiserver--7ff777577d--6mmgm-eth0" Sep 9 00:15:09.042719 containerd[1638]: 2025-09-09 00:15:08.944 [INFO][4385] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="92323a44d18bad275a0ff0e72a00b6bf64e7f2bf5fd18c647d7ee2a473bffa51" HandleID="k8s-pod-network.92323a44d18bad275a0ff0e72a00b6bf64e7f2bf5fd18c647d7ee2a473bffa51" Workload="localhost-k8s-calico--apiserver--7ff777577d--6mmgm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d59c0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-7ff777577d-6mmgm", "timestamp":"2025-09-09 00:15:08.943790148 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 00:15:09.042719 containerd[1638]: 2025-09-09 00:15:08.944 [INFO][4385] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 00:15:09.042719 containerd[1638]: 2025-09-09 00:15:08.944 [INFO][4385] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 00:15:09.042719 containerd[1638]: 2025-09-09 00:15:08.944 [INFO][4385] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 00:15:09.042719 containerd[1638]: 2025-09-09 00:15:08.951 [INFO][4385] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.92323a44d18bad275a0ff0e72a00b6bf64e7f2bf5fd18c647d7ee2a473bffa51" host="localhost" Sep 9 00:15:09.042719 containerd[1638]: 2025-09-09 00:15:08.959 [INFO][4385] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 00:15:09.042719 containerd[1638]: 2025-09-09 00:15:08.970 [INFO][4385] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 00:15:09.042719 containerd[1638]: 2025-09-09 00:15:08.973 [INFO][4385] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 00:15:09.042719 containerd[1638]: 2025-09-09 00:15:08.975 [INFO][4385] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 00:15:09.042719 containerd[1638]: 2025-09-09 00:15:08.975 [INFO][4385] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.92323a44d18bad275a0ff0e72a00b6bf64e7f2bf5fd18c647d7ee2a473bffa51" host="localhost" Sep 9 00:15:09.043872 containerd[1638]: 2025-09-09 00:15:08.976 [INFO][4385] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.92323a44d18bad275a0ff0e72a00b6bf64e7f2bf5fd18c647d7ee2a473bffa51 Sep 9 00:15:09.043872 containerd[1638]: 2025-09-09 00:15:08.982 [INFO][4385] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.92323a44d18bad275a0ff0e72a00b6bf64e7f2bf5fd18c647d7ee2a473bffa51" host="localhost" Sep 9 00:15:09.043872 containerd[1638]: 2025-09-09 00:15:08.989 [INFO][4385] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 
handle="k8s-pod-network.92323a44d18bad275a0ff0e72a00b6bf64e7f2bf5fd18c647d7ee2a473bffa51" host="localhost" Sep 9 00:15:09.043872 containerd[1638]: 2025-09-09 00:15:08.989 [INFO][4385] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.92323a44d18bad275a0ff0e72a00b6bf64e7f2bf5fd18c647d7ee2a473bffa51" host="localhost" Sep 9 00:15:09.043872 containerd[1638]: 2025-09-09 00:15:08.989 [INFO][4385] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 00:15:09.043872 containerd[1638]: 2025-09-09 00:15:08.990 [INFO][4385] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="92323a44d18bad275a0ff0e72a00b6bf64e7f2bf5fd18c647d7ee2a473bffa51" HandleID="k8s-pod-network.92323a44d18bad275a0ff0e72a00b6bf64e7f2bf5fd18c647d7ee2a473bffa51" Workload="localhost-k8s-calico--apiserver--7ff777577d--6mmgm-eth0" Sep 9 00:15:09.043966 containerd[1638]: 2025-09-09 00:15:08.996 [INFO][4358] cni-plugin/k8s.go 418: Populated endpoint ContainerID="92323a44d18bad275a0ff0e72a00b6bf64e7f2bf5fd18c647d7ee2a473bffa51" Namespace="calico-apiserver" Pod="calico-apiserver-7ff777577d-6mmgm" WorkloadEndpoint="localhost-k8s-calico--apiserver--7ff777577d--6mmgm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7ff777577d--6mmgm-eth0", GenerateName:"calico-apiserver-7ff777577d-", Namespace:"calico-apiserver", SelfLink:"", UID:"9b624628-286c-47e3-a778-e3d720ed16a8", ResourceVersion:"799", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 0, 14, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7ff777577d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-7ff777577d-6mmgm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif9ed7c8a7c3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 00:15:09.044180 containerd[1638]: 2025-09-09 00:15:08.996 [INFO][4358] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="92323a44d18bad275a0ff0e72a00b6bf64e7f2bf5fd18c647d7ee2a473bffa51" Namespace="calico-apiserver" Pod="calico-apiserver-7ff777577d-6mmgm" WorkloadEndpoint="localhost-k8s-calico--apiserver--7ff777577d--6mmgm-eth0" Sep 9 00:15:09.044180 containerd[1638]: 2025-09-09 00:15:08.996 [INFO][4358] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif9ed7c8a7c3 ContainerID="92323a44d18bad275a0ff0e72a00b6bf64e7f2bf5fd18c647d7ee2a473bffa51" Namespace="calico-apiserver" Pod="calico-apiserver-7ff777577d-6mmgm" WorkloadEndpoint="localhost-k8s-calico--apiserver--7ff777577d--6mmgm-eth0" Sep 9 00:15:09.044180 containerd[1638]: 2025-09-09 00:15:09.005 [INFO][4358] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="92323a44d18bad275a0ff0e72a00b6bf64e7f2bf5fd18c647d7ee2a473bffa51" Namespace="calico-apiserver" Pod="calico-apiserver-7ff777577d-6mmgm" WorkloadEndpoint="localhost-k8s-calico--apiserver--7ff777577d--6mmgm-eth0" Sep 9 00:15:09.044250 containerd[1638]: 2025-09-09 00:15:09.008 [INFO][4358] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID 
to endpoint ContainerID="92323a44d18bad275a0ff0e72a00b6bf64e7f2bf5fd18c647d7ee2a473bffa51" Namespace="calico-apiserver" Pod="calico-apiserver-7ff777577d-6mmgm" WorkloadEndpoint="localhost-k8s-calico--apiserver--7ff777577d--6mmgm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7ff777577d--6mmgm-eth0", GenerateName:"calico-apiserver-7ff777577d-", Namespace:"calico-apiserver", SelfLink:"", UID:"9b624628-286c-47e3-a778-e3d720ed16a8", ResourceVersion:"799", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 0, 14, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7ff777577d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"92323a44d18bad275a0ff0e72a00b6bf64e7f2bf5fd18c647d7ee2a473bffa51", Pod:"calico-apiserver-7ff777577d-6mmgm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif9ed7c8a7c3", MAC:"f6:4a:2b:4a:bd:f2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 00:15:09.044307 containerd[1638]: 2025-09-09 00:15:09.027 [INFO][4358] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="92323a44d18bad275a0ff0e72a00b6bf64e7f2bf5fd18c647d7ee2a473bffa51" Namespace="calico-apiserver" Pod="calico-apiserver-7ff777577d-6mmgm" WorkloadEndpoint="localhost-k8s-calico--apiserver--7ff777577d--6mmgm-eth0" Sep 9 00:15:09.133408 systemd-networkd[1517]: cali17962073b75: Link UP Sep 9 00:15:09.134811 systemd-networkd[1517]: cali17962073b75: Gained carrier Sep 9 00:15:09.150352 containerd[1638]: time="2025-09-09T00:15:09.150314782Z" level=info msg="connecting to shim 92323a44d18bad275a0ff0e72a00b6bf64e7f2bf5fd18c647d7ee2a473bffa51" address="unix:///run/containerd/s/d0328b682fa1e1cdd4cc2880f320444667677e4f61c7693554360126f1b3a7ee" namespace=k8s.io protocol=ttrpc version=3 Sep 9 00:15:09.168943 containerd[1638]: 2025-09-09 00:15:08.915 [INFO][4356] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--pnv8f-eth0 coredns-668d6bf9bc- kube-system b151ff8e-a88a-495c-a5aa-415fe52c4a10 802 0 2025-09-09 00:14:28 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-pnv8f eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali17962073b75 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="375dc70e497b412d7442c190ec8527e3ac401a283d84e71d1441f3ecdb4afa7b" Namespace="kube-system" Pod="coredns-668d6bf9bc-pnv8f" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--pnv8f-" Sep 9 00:15:09.168943 containerd[1638]: 2025-09-09 00:15:08.915 [INFO][4356] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="375dc70e497b412d7442c190ec8527e3ac401a283d84e71d1441f3ecdb4afa7b" Namespace="kube-system" Pod="coredns-668d6bf9bc-pnv8f" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--pnv8f-eth0" Sep 9 00:15:09.168943 containerd[1638]: 2025-09-09 00:15:08.963 [INFO][4390] 
ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="375dc70e497b412d7442c190ec8527e3ac401a283d84e71d1441f3ecdb4afa7b" HandleID="k8s-pod-network.375dc70e497b412d7442c190ec8527e3ac401a283d84e71d1441f3ecdb4afa7b" Workload="localhost-k8s-coredns--668d6bf9bc--pnv8f-eth0" Sep 9 00:15:09.180622 containerd[1638]: 2025-09-09 00:15:08.963 [INFO][4390] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="375dc70e497b412d7442c190ec8527e3ac401a283d84e71d1441f3ecdb4afa7b" HandleID="k8s-pod-network.375dc70e497b412d7442c190ec8527e3ac401a283d84e71d1441f3ecdb4afa7b" Workload="localhost-k8s-coredns--668d6bf9bc--pnv8f-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d4f60), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-pnv8f", "timestamp":"2025-09-09 00:15:08.963808519 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 00:15:09.180622 containerd[1638]: 2025-09-09 00:15:08.964 [INFO][4390] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 00:15:09.180622 containerd[1638]: 2025-09-09 00:15:08.989 [INFO][4390] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 00:15:09.180622 containerd[1638]: 2025-09-09 00:15:08.989 [INFO][4390] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 00:15:09.180622 containerd[1638]: 2025-09-09 00:15:09.054 [INFO][4390] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.375dc70e497b412d7442c190ec8527e3ac401a283d84e71d1441f3ecdb4afa7b" host="localhost" Sep 9 00:15:09.180622 containerd[1638]: 2025-09-09 00:15:09.068 [INFO][4390] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 00:15:09.180622 containerd[1638]: 2025-09-09 00:15:09.078 [INFO][4390] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 00:15:09.180622 containerd[1638]: 2025-09-09 00:15:09.081 [INFO][4390] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 00:15:09.180622 containerd[1638]: 2025-09-09 00:15:09.090 [INFO][4390] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 00:15:09.180622 containerd[1638]: 2025-09-09 00:15:09.090 [INFO][4390] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.375dc70e497b412d7442c190ec8527e3ac401a283d84e71d1441f3ecdb4afa7b" host="localhost" Sep 9 00:15:09.180842 containerd[1638]: 2025-09-09 00:15:09.091 [INFO][4390] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.375dc70e497b412d7442c190ec8527e3ac401a283d84e71d1441f3ecdb4afa7b Sep 9 00:15:09.180842 containerd[1638]: 2025-09-09 00:15:09.098 [INFO][4390] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.375dc70e497b412d7442c190ec8527e3ac401a283d84e71d1441f3ecdb4afa7b" host="localhost" Sep 9 00:15:09.180842 containerd[1638]: 2025-09-09 00:15:09.123 [INFO][4390] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 
handle="k8s-pod-network.375dc70e497b412d7442c190ec8527e3ac401a283d84e71d1441f3ecdb4afa7b" host="localhost" Sep 9 00:15:09.180842 containerd[1638]: 2025-09-09 00:15:09.124 [INFO][4390] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.375dc70e497b412d7442c190ec8527e3ac401a283d84e71d1441f3ecdb4afa7b" host="localhost" Sep 9 00:15:09.180842 containerd[1638]: 2025-09-09 00:15:09.124 [INFO][4390] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 00:15:09.180842 containerd[1638]: 2025-09-09 00:15:09.124 [INFO][4390] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="375dc70e497b412d7442c190ec8527e3ac401a283d84e71d1441f3ecdb4afa7b" HandleID="k8s-pod-network.375dc70e497b412d7442c190ec8527e3ac401a283d84e71d1441f3ecdb4afa7b" Workload="localhost-k8s-coredns--668d6bf9bc--pnv8f-eth0" Sep 9 00:15:09.183992 containerd[1638]: 2025-09-09 00:15:09.126 [INFO][4356] cni-plugin/k8s.go 418: Populated endpoint ContainerID="375dc70e497b412d7442c190ec8527e3ac401a283d84e71d1441f3ecdb4afa7b" Namespace="kube-system" Pod="coredns-668d6bf9bc-pnv8f" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--pnv8f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--pnv8f-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"b151ff8e-a88a-495c-a5aa-415fe52c4a10", ResourceVersion:"802", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 0, 14, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-pnv8f", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali17962073b75", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 00:15:09.184061 containerd[1638]: 2025-09-09 00:15:09.126 [INFO][4356] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="375dc70e497b412d7442c190ec8527e3ac401a283d84e71d1441f3ecdb4afa7b" Namespace="kube-system" Pod="coredns-668d6bf9bc-pnv8f" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--pnv8f-eth0" Sep 9 00:15:09.184061 containerd[1638]: 2025-09-09 00:15:09.126 [INFO][4356] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali17962073b75 ContainerID="375dc70e497b412d7442c190ec8527e3ac401a283d84e71d1441f3ecdb4afa7b" Namespace="kube-system" Pod="coredns-668d6bf9bc-pnv8f" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--pnv8f-eth0" Sep 9 00:15:09.184061 containerd[1638]: 2025-09-09 00:15:09.135 [INFO][4356] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="375dc70e497b412d7442c190ec8527e3ac401a283d84e71d1441f3ecdb4afa7b" Namespace="kube-system" Pod="coredns-668d6bf9bc-pnv8f" 
WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--pnv8f-eth0" Sep 9 00:15:09.192594 containerd[1638]: 2025-09-09 00:15:09.136 [INFO][4356] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="375dc70e497b412d7442c190ec8527e3ac401a283d84e71d1441f3ecdb4afa7b" Namespace="kube-system" Pod="coredns-668d6bf9bc-pnv8f" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--pnv8f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--pnv8f-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"b151ff8e-a88a-495c-a5aa-415fe52c4a10", ResourceVersion:"802", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 0, 14, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"375dc70e497b412d7442c190ec8527e3ac401a283d84e71d1441f3ecdb4afa7b", Pod:"coredns-668d6bf9bc-pnv8f", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali17962073b75", MAC:"1a:f9:32:52:67:7d", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 00:15:09.192594 containerd[1638]: 2025-09-09 00:15:09.165 [INFO][4356] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="375dc70e497b412d7442c190ec8527e3ac401a283d84e71d1441f3ecdb4afa7b" Namespace="kube-system" Pod="coredns-668d6bf9bc-pnv8f" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--pnv8f-eth0" Sep 9 00:15:09.253623 systemd[1]: Started cri-containerd-92323a44d18bad275a0ff0e72a00b6bf64e7f2bf5fd18c647d7ee2a473bffa51.scope - libcontainer container 92323a44d18bad275a0ff0e72a00b6bf64e7f2bf5fd18c647d7ee2a473bffa51. Sep 9 00:15:09.258160 containerd[1638]: time="2025-09-09T00:15:09.257620173Z" level=info msg="connecting to shim 375dc70e497b412d7442c190ec8527e3ac401a283d84e71d1441f3ecdb4afa7b" address="unix:///run/containerd/s/7838de0126ea90719b18925ad39eae5892a25f52ebf60236bba92e3cce9adadb" namespace=k8s.io protocol=ttrpc version=3 Sep 9 00:15:09.280195 systemd-resolved[1518]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 00:15:09.305315 systemd[1]: Started cri-containerd-375dc70e497b412d7442c190ec8527e3ac401a283d84e71d1441f3ecdb4afa7b.scope - libcontainer container 375dc70e497b412d7442c190ec8527e3ac401a283d84e71d1441f3ecdb4afa7b. 
Sep 9 00:15:09.322673 systemd-resolved[1518]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 00:15:09.385350 containerd[1638]: time="2025-09-09T00:15:09.385036589Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-pnv8f,Uid:b151ff8e-a88a-495c-a5aa-415fe52c4a10,Namespace:kube-system,Attempt:0,} returns sandbox id \"375dc70e497b412d7442c190ec8527e3ac401a283d84e71d1441f3ecdb4afa7b\"" Sep 9 00:15:09.396197 containerd[1638]: time="2025-09-09T00:15:09.396168620Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7ff777577d-6mmgm,Uid:9b624628-286c-47e3-a778-e3d720ed16a8,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"92323a44d18bad275a0ff0e72a00b6bf64e7f2bf5fd18c647d7ee2a473bffa51\"" Sep 9 00:15:09.396919 containerd[1638]: time="2025-09-09T00:15:09.396636465Z" level=info msg="CreateContainer within sandbox \"375dc70e497b412d7442c190ec8527e3ac401a283d84e71d1441f3ecdb4afa7b\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 00:15:09.532089 containerd[1638]: time="2025-09-09T00:15:09.532056791Z" level=info msg="Container 47f2bff176f568a5e212c06d0c01804c33f5f24eb1d9c52840f7697b81f5acc4: CDI devices from CRI Config.CDIDevices: []" Sep 9 00:15:09.540332 containerd[1638]: time="2025-09-09T00:15:09.540309064Z" level=info msg="CreateContainer within sandbox \"375dc70e497b412d7442c190ec8527e3ac401a283d84e71d1441f3ecdb4afa7b\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"47f2bff176f568a5e212c06d0c01804c33f5f24eb1d9c52840f7697b81f5acc4\"" Sep 9 00:15:09.541267 containerd[1638]: time="2025-09-09T00:15:09.541170416Z" level=info msg="StartContainer for \"47f2bff176f568a5e212c06d0c01804c33f5f24eb1d9c52840f7697b81f5acc4\"" Sep 9 00:15:09.541950 containerd[1638]: time="2025-09-09T00:15:09.541935846Z" level=info msg="connecting to shim 47f2bff176f568a5e212c06d0c01804c33f5f24eb1d9c52840f7697b81f5acc4" 
address="unix:///run/containerd/s/7838de0126ea90719b18925ad39eae5892a25f52ebf60236bba92e3cce9adadb" protocol=ttrpc version=3 Sep 9 00:15:09.569418 systemd[1]: Started cri-containerd-47f2bff176f568a5e212c06d0c01804c33f5f24eb1d9c52840f7697b81f5acc4.scope - libcontainer container 47f2bff176f568a5e212c06d0c01804c33f5f24eb1d9c52840f7697b81f5acc4. Sep 9 00:15:09.589293 containerd[1638]: time="2025-09-09T00:15:09.589204336Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 00:15:09.590253 containerd[1638]: time="2025-09-09T00:15:09.590237110Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 9 00:15:09.590541 containerd[1638]: time="2025-09-09T00:15:09.590529900Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 00:15:09.593429 containerd[1638]: time="2025-09-09T00:15:09.593393414Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 00:15:09.595688 containerd[1638]: time="2025-09-09T00:15:09.595546589Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 2.592710383s" Sep 9 00:15:09.595688 containerd[1638]: time="2025-09-09T00:15:09.595566396Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference 
\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 9 00:15:09.597371 containerd[1638]: time="2025-09-09T00:15:09.597311563Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 00:15:09.598252 containerd[1638]: time="2025-09-09T00:15:09.598238320Z" level=info msg="CreateContainer within sandbox \"bc60aa8c8596cc7dbae054d90b890ce86747cad608aa59e04c5f7513f0aded31\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 9 00:15:09.602927 containerd[1638]: time="2025-09-09T00:15:09.602504895Z" level=info msg="Container 7fd7ff1420481441f381abb53279634763c3f6c17094edf3c79e473b4e929a44: CDI devices from CRI Config.CDIDevices: []" Sep 9 00:15:09.620936 containerd[1638]: time="2025-09-09T00:15:09.620915737Z" level=info msg="StartContainer for \"47f2bff176f568a5e212c06d0c01804c33f5f24eb1d9c52840f7697b81f5acc4\" returns successfully" Sep 9 00:15:09.626414 containerd[1638]: time="2025-09-09T00:15:09.626386380Z" level=info msg="CreateContainer within sandbox \"bc60aa8c8596cc7dbae054d90b890ce86747cad608aa59e04c5f7513f0aded31\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"7fd7ff1420481441f381abb53279634763c3f6c17094edf3c79e473b4e929a44\"" Sep 9 00:15:09.628245 containerd[1638]: time="2025-09-09T00:15:09.628217679Z" level=info msg="StartContainer for \"7fd7ff1420481441f381abb53279634763c3f6c17094edf3c79e473b4e929a44\"" Sep 9 00:15:09.629808 containerd[1638]: time="2025-09-09T00:15:09.629787083Z" level=info msg="connecting to shim 7fd7ff1420481441f381abb53279634763c3f6c17094edf3c79e473b4e929a44" address="unix:///run/containerd/s/eeb556bc9f0d641cf5c4d3bac2cd5209f3dee61c92a56f347295ea68dac72d42" protocol=ttrpc version=3 Sep 9 00:15:09.652351 systemd[1]: Started cri-containerd-7fd7ff1420481441f381abb53279634763c3f6c17094edf3c79e473b4e929a44.scope - libcontainer container 7fd7ff1420481441f381abb53279634763c3f6c17094edf3c79e473b4e929a44. 
Sep 9 00:15:09.703191 containerd[1638]: time="2025-09-09T00:15:09.703138011Z" level=info msg="StartContainer for \"7fd7ff1420481441f381abb53279634763c3f6c17094edf3c79e473b4e929a44\" returns successfully" Sep 9 00:15:09.863572 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3726983193.mount: Deactivated successfully. Sep 9 00:15:10.218909 kubelet[2924]: I0909 00:15:10.211452 2924 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-pnv8f" podStartSLOduration=42.179547553 podStartE2EDuration="42.179547553s" podCreationTimestamp="2025-09-09 00:14:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 00:15:10.176885603 +0000 UTC m=+47.455056357" watchObservedRunningTime="2025-09-09 00:15:10.179547553 +0000 UTC m=+47.457718305" Sep 9 00:15:10.219963 kubelet[2924]: I0909 00:15:10.219412 2924 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-5dd699984b-dm7v6" podStartSLOduration=1.921190947 podStartE2EDuration="6.219400504s" podCreationTimestamp="2025-09-09 00:15:04 +0000 UTC" firstStartedPulling="2025-09-09 00:15:05.298537782 +0000 UTC m=+42.576708530" lastFinishedPulling="2025-09-09 00:15:09.596747337 +0000 UTC m=+46.874918087" observedRunningTime="2025-09-09 00:15:10.219256674 +0000 UTC m=+47.497427425" watchObservedRunningTime="2025-09-09 00:15:10.219400504 +0000 UTC m=+47.497571255" Sep 9 00:15:10.815908 containerd[1638]: time="2025-09-09T00:15:10.815380268Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-bgldd,Uid:dab4fb84-134e-402d-a321-1001c52517c9,Namespace:calico-system,Attempt:0,}" Sep 9 00:15:10.815727 systemd-networkd[1517]: calif9ed7c8a7c3: Gained IPv6LL Sep 9 00:15:10.907014 systemd-networkd[1517]: cali25ff11c671f: Link UP Sep 9 00:15:10.908428 systemd-networkd[1517]: cali25ff11c671f: Gained carrier Sep 9 00:15:10.920206 
containerd[1638]: 2025-09-09 00:15:10.863 [INFO][4592] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--54d579b49d--bgldd-eth0 goldmane-54d579b49d- calico-system dab4fb84-134e-402d-a321-1001c52517c9 805 0 2025-09-09 00:14:40 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-54d579b49d-bgldd eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali25ff11c671f [] [] }} ContainerID="19fa667de9e658f8cca6d337ef1dcf9162d63eca7ffe44552fe8c619308244ef" Namespace="calico-system" Pod="goldmane-54d579b49d-bgldd" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--bgldd-" Sep 9 00:15:10.920206 containerd[1638]: 2025-09-09 00:15:10.864 [INFO][4592] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="19fa667de9e658f8cca6d337ef1dcf9162d63eca7ffe44552fe8c619308244ef" Namespace="calico-system" Pod="goldmane-54d579b49d-bgldd" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--bgldd-eth0" Sep 9 00:15:10.920206 containerd[1638]: 2025-09-09 00:15:10.879 [INFO][4604] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="19fa667de9e658f8cca6d337ef1dcf9162d63eca7ffe44552fe8c619308244ef" HandleID="k8s-pod-network.19fa667de9e658f8cca6d337ef1dcf9162d63eca7ffe44552fe8c619308244ef" Workload="localhost-k8s-goldmane--54d579b49d--bgldd-eth0" Sep 9 00:15:10.920206 containerd[1638]: 2025-09-09 00:15:10.879 [INFO][4604] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="19fa667de9e658f8cca6d337ef1dcf9162d63eca7ffe44552fe8c619308244ef" HandleID="k8s-pod-network.19fa667de9e658f8cca6d337ef1dcf9162d63eca7ffe44552fe8c619308244ef" Workload="localhost-k8s-goldmane--54d579b49d--bgldd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, 
HandleID:(*string)(0xc00024f1f0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-54d579b49d-bgldd", "timestamp":"2025-09-09 00:15:10.87931891 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 00:15:10.920206 containerd[1638]: 2025-09-09 00:15:10.879 [INFO][4604] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 00:15:10.920206 containerd[1638]: 2025-09-09 00:15:10.879 [INFO][4604] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 00:15:10.920206 containerd[1638]: 2025-09-09 00:15:10.879 [INFO][4604] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 00:15:10.920206 containerd[1638]: 2025-09-09 00:15:10.885 [INFO][4604] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.19fa667de9e658f8cca6d337ef1dcf9162d63eca7ffe44552fe8c619308244ef" host="localhost" Sep 9 00:15:10.920206 containerd[1638]: 2025-09-09 00:15:10.888 [INFO][4604] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 00:15:10.920206 containerd[1638]: 2025-09-09 00:15:10.891 [INFO][4604] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 00:15:10.920206 containerd[1638]: 2025-09-09 00:15:10.892 [INFO][4604] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 00:15:10.920206 containerd[1638]: 2025-09-09 00:15:10.893 [INFO][4604] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 00:15:10.920206 containerd[1638]: 2025-09-09 00:15:10.893 [INFO][4604] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.19fa667de9e658f8cca6d337ef1dcf9162d63eca7ffe44552fe8c619308244ef" host="localhost" 
Sep 9 00:15:10.920206 containerd[1638]: 2025-09-09 00:15:10.894 [INFO][4604] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.19fa667de9e658f8cca6d337ef1dcf9162d63eca7ffe44552fe8c619308244ef Sep 9 00:15:10.920206 containerd[1638]: 2025-09-09 00:15:10.896 [INFO][4604] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.19fa667de9e658f8cca6d337ef1dcf9162d63eca7ffe44552fe8c619308244ef" host="localhost" Sep 9 00:15:10.920206 containerd[1638]: 2025-09-09 00:15:10.899 [INFO][4604] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.19fa667de9e658f8cca6d337ef1dcf9162d63eca7ffe44552fe8c619308244ef" host="localhost" Sep 9 00:15:10.920206 containerd[1638]: 2025-09-09 00:15:10.900 [INFO][4604] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.19fa667de9e658f8cca6d337ef1dcf9162d63eca7ffe44552fe8c619308244ef" host="localhost" Sep 9 00:15:10.920206 containerd[1638]: 2025-09-09 00:15:10.900 [INFO][4604] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 00:15:10.920206 containerd[1638]: 2025-09-09 00:15:10.900 [INFO][4604] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="19fa667de9e658f8cca6d337ef1dcf9162d63eca7ffe44552fe8c619308244ef" HandleID="k8s-pod-network.19fa667de9e658f8cca6d337ef1dcf9162d63eca7ffe44552fe8c619308244ef" Workload="localhost-k8s-goldmane--54d579b49d--bgldd-eth0" Sep 9 00:15:10.922187 containerd[1638]: 2025-09-09 00:15:10.903 [INFO][4592] cni-plugin/k8s.go 418: Populated endpoint ContainerID="19fa667de9e658f8cca6d337ef1dcf9162d63eca7ffe44552fe8c619308244ef" Namespace="calico-system" Pod="goldmane-54d579b49d-bgldd" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--bgldd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--bgldd-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"dab4fb84-134e-402d-a321-1001c52517c9", ResourceVersion:"805", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 0, 14, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-54d579b49d-bgldd", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali25ff11c671f", 
MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 00:15:10.922187 containerd[1638]: 2025-09-09 00:15:10.903 [INFO][4592] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="19fa667de9e658f8cca6d337ef1dcf9162d63eca7ffe44552fe8c619308244ef" Namespace="calico-system" Pod="goldmane-54d579b49d-bgldd" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--bgldd-eth0" Sep 9 00:15:10.922187 containerd[1638]: 2025-09-09 00:15:10.903 [INFO][4592] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali25ff11c671f ContainerID="19fa667de9e658f8cca6d337ef1dcf9162d63eca7ffe44552fe8c619308244ef" Namespace="calico-system" Pod="goldmane-54d579b49d-bgldd" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--bgldd-eth0" Sep 9 00:15:10.922187 containerd[1638]: 2025-09-09 00:15:10.908 [INFO][4592] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="19fa667de9e658f8cca6d337ef1dcf9162d63eca7ffe44552fe8c619308244ef" Namespace="calico-system" Pod="goldmane-54d579b49d-bgldd" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--bgldd-eth0" Sep 9 00:15:10.922187 containerd[1638]: 2025-09-09 00:15:10.909 [INFO][4592] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="19fa667de9e658f8cca6d337ef1dcf9162d63eca7ffe44552fe8c619308244ef" Namespace="calico-system" Pod="goldmane-54d579b49d-bgldd" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--bgldd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--bgldd-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"dab4fb84-134e-402d-a321-1001c52517c9", ResourceVersion:"805", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 0, 14, 40, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"19fa667de9e658f8cca6d337ef1dcf9162d63eca7ffe44552fe8c619308244ef", Pod:"goldmane-54d579b49d-bgldd", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali25ff11c671f", MAC:"c2:af:04:44:42:30", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 00:15:10.922187 containerd[1638]: 2025-09-09 00:15:10.916 [INFO][4592] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="19fa667de9e658f8cca6d337ef1dcf9162d63eca7ffe44552fe8c619308244ef" Namespace="calico-system" Pod="goldmane-54d579b49d-bgldd" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--bgldd-eth0" Sep 9 00:15:10.951619 containerd[1638]: time="2025-09-09T00:15:10.951563229Z" level=info msg="connecting to shim 19fa667de9e658f8cca6d337ef1dcf9162d63eca7ffe44552fe8c619308244ef" address="unix:///run/containerd/s/85103475c6e2931cd1353b9aea8c55d412a2f07292730bf40ee95abd3c3feab4" namespace=k8s.io protocol=ttrpc version=3 Sep 9 00:15:10.978254 systemd[1]: Started cri-containerd-19fa667de9e658f8cca6d337ef1dcf9162d63eca7ffe44552fe8c619308244ef.scope - libcontainer container 19fa667de9e658f8cca6d337ef1dcf9162d63eca7ffe44552fe8c619308244ef. 
Sep 9 00:15:10.994860 systemd-resolved[1518]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 00:15:11.037811 containerd[1638]: time="2025-09-09T00:15:11.037751558Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-bgldd,Uid:dab4fb84-134e-402d-a321-1001c52517c9,Namespace:calico-system,Attempt:0,} returns sandbox id \"19fa667de9e658f8cca6d337ef1dcf9162d63eca7ffe44552fe8c619308244ef\"" Sep 9 00:15:11.071240 systemd-networkd[1517]: cali17962073b75: Gained IPv6LL Sep 9 00:15:11.815048 containerd[1638]: time="2025-09-09T00:15:11.815009677Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5d75874cd-5q9th,Uid:466bc793-f266-46d4-b563-49ff42f7b1ae,Namespace:calico-system,Attempt:0,}" Sep 9 00:15:12.159209 systemd-networkd[1517]: cali25ff11c671f: Gained IPv6LL Sep 9 00:15:12.898415 containerd[1638]: time="2025-09-09T00:15:12.898251344Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-r6qzt,Uid:37daf8f3-9d85-49c3-ba4d-11feef4e10ab,Namespace:kube-system,Attempt:0,}" Sep 9 00:15:12.898923 containerd[1638]: time="2025-09-09T00:15:12.898749612Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-j6h4t,Uid:2778c8f3-9140-419e-8f7e-95ecd474a55a,Namespace:calico-system,Attempt:0,}" Sep 9 00:15:12.898998 containerd[1638]: time="2025-09-09T00:15:12.898982386Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7ff777577d-c77vc,Uid:9319c330-61e7-4f2f-810d-926a85cebaee,Namespace:calico-apiserver,Attempt:0,}" Sep 9 00:15:13.122554 systemd-networkd[1517]: cali221b94ff8f6: Link UP Sep 9 00:15:13.123687 systemd-networkd[1517]: cali221b94ff8f6: Gained carrier Sep 9 00:15:13.150954 containerd[1638]: 2025-09-09 00:15:13.005 [INFO][4680] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--r6qzt-eth0 
coredns-668d6bf9bc- kube-system 37daf8f3-9d85-49c3-ba4d-11feef4e10ab 796 0 2025-09-09 00:14:28 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-r6qzt eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali221b94ff8f6 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="82936021c77e4f236a93b782d162a4b8e0c59cf0a6c85ab7ea7ee8990c744223" Namespace="kube-system" Pod="coredns-668d6bf9bc-r6qzt" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--r6qzt-" Sep 9 00:15:13.150954 containerd[1638]: 2025-09-09 00:15:13.005 [INFO][4680] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="82936021c77e4f236a93b782d162a4b8e0c59cf0a6c85ab7ea7ee8990c744223" Namespace="kube-system" Pod="coredns-668d6bf9bc-r6qzt" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--r6qzt-eth0" Sep 9 00:15:13.150954 containerd[1638]: 2025-09-09 00:15:13.077 [INFO][4704] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="82936021c77e4f236a93b782d162a4b8e0c59cf0a6c85ab7ea7ee8990c744223" HandleID="k8s-pod-network.82936021c77e4f236a93b782d162a4b8e0c59cf0a6c85ab7ea7ee8990c744223" Workload="localhost-k8s-coredns--668d6bf9bc--r6qzt-eth0" Sep 9 00:15:13.150954 containerd[1638]: 2025-09-09 00:15:13.077 [INFO][4704] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="82936021c77e4f236a93b782d162a4b8e0c59cf0a6c85ab7ea7ee8990c744223" HandleID="k8s-pod-network.82936021c77e4f236a93b782d162a4b8e0c59cf0a6c85ab7ea7ee8990c744223" Workload="localhost-k8s-coredns--668d6bf9bc--r6qzt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f9a0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-r6qzt", "timestamp":"2025-09-09 00:15:13.077522563 +0000 UTC"}, Hostname:"localhost", 
IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 00:15:13.150954 containerd[1638]: 2025-09-09 00:15:13.077 [INFO][4704] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 00:15:13.150954 containerd[1638]: 2025-09-09 00:15:13.077 [INFO][4704] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 00:15:13.150954 containerd[1638]: 2025-09-09 00:15:13.077 [INFO][4704] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 00:15:13.150954 containerd[1638]: 2025-09-09 00:15:13.087 [INFO][4704] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.82936021c77e4f236a93b782d162a4b8e0c59cf0a6c85ab7ea7ee8990c744223" host="localhost" Sep 9 00:15:13.150954 containerd[1638]: 2025-09-09 00:15:13.089 [INFO][4704] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 00:15:13.150954 containerd[1638]: 2025-09-09 00:15:13.091 [INFO][4704] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 00:15:13.150954 containerd[1638]: 2025-09-09 00:15:13.092 [INFO][4704] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 00:15:13.150954 containerd[1638]: 2025-09-09 00:15:13.093 [INFO][4704] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 00:15:13.150954 containerd[1638]: 2025-09-09 00:15:13.093 [INFO][4704] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.82936021c77e4f236a93b782d162a4b8e0c59cf0a6c85ab7ea7ee8990c744223" host="localhost" Sep 9 00:15:13.150954 containerd[1638]: 2025-09-09 00:15:13.094 [INFO][4704] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.82936021c77e4f236a93b782d162a4b8e0c59cf0a6c85ab7ea7ee8990c744223 Sep 9 00:15:13.150954 
containerd[1638]: 2025-09-09 00:15:13.100 [INFO][4704] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.82936021c77e4f236a93b782d162a4b8e0c59cf0a6c85ab7ea7ee8990c744223" host="localhost" Sep 9 00:15:13.150954 containerd[1638]: 2025-09-09 00:15:13.117 [INFO][4704] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.82936021c77e4f236a93b782d162a4b8e0c59cf0a6c85ab7ea7ee8990c744223" host="localhost" Sep 9 00:15:13.150954 containerd[1638]: 2025-09-09 00:15:13.117 [INFO][4704] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.82936021c77e4f236a93b782d162a4b8e0c59cf0a6c85ab7ea7ee8990c744223" host="localhost" Sep 9 00:15:13.150954 containerd[1638]: 2025-09-09 00:15:13.117 [INFO][4704] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 00:15:13.150954 containerd[1638]: 2025-09-09 00:15:13.117 [INFO][4704] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="82936021c77e4f236a93b782d162a4b8e0c59cf0a6c85ab7ea7ee8990c744223" HandleID="k8s-pod-network.82936021c77e4f236a93b782d162a4b8e0c59cf0a6c85ab7ea7ee8990c744223" Workload="localhost-k8s-coredns--668d6bf9bc--r6qzt-eth0" Sep 9 00:15:13.428391 containerd[1638]: 2025-09-09 00:15:13.120 [INFO][4680] cni-plugin/k8s.go 418: Populated endpoint ContainerID="82936021c77e4f236a93b782d162a4b8e0c59cf0a6c85ab7ea7ee8990c744223" Namespace="kube-system" Pod="coredns-668d6bf9bc-r6qzt" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--r6qzt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--r6qzt-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"37daf8f3-9d85-49c3-ba4d-11feef4e10ab", ResourceVersion:"796", Generation:0, CreationTimestamp:time.Date(2025, 
time.September, 9, 0, 14, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-r6qzt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali221b94ff8f6", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 00:15:13.428391 containerd[1638]: 2025-09-09 00:15:13.120 [INFO][4680] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="82936021c77e4f236a93b782d162a4b8e0c59cf0a6c85ab7ea7ee8990c744223" Namespace="kube-system" Pod="coredns-668d6bf9bc-r6qzt" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--r6qzt-eth0" Sep 9 00:15:13.428391 containerd[1638]: 2025-09-09 00:15:13.120 [INFO][4680] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali221b94ff8f6 ContainerID="82936021c77e4f236a93b782d162a4b8e0c59cf0a6c85ab7ea7ee8990c744223" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-r6qzt" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--r6qzt-eth0" Sep 9 00:15:13.428391 containerd[1638]: 2025-09-09 00:15:13.124 [INFO][4680] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="82936021c77e4f236a93b782d162a4b8e0c59cf0a6c85ab7ea7ee8990c744223" Namespace="kube-system" Pod="coredns-668d6bf9bc-r6qzt" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--r6qzt-eth0" Sep 9 00:15:13.428391 containerd[1638]: 2025-09-09 00:15:13.124 [INFO][4680] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="82936021c77e4f236a93b782d162a4b8e0c59cf0a6c85ab7ea7ee8990c744223" Namespace="kube-system" Pod="coredns-668d6bf9bc-r6qzt" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--r6qzt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--r6qzt-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"37daf8f3-9d85-49c3-ba4d-11feef4e10ab", ResourceVersion:"796", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 0, 14, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"82936021c77e4f236a93b782d162a4b8e0c59cf0a6c85ab7ea7ee8990c744223", Pod:"coredns-668d6bf9bc-r6qzt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali221b94ff8f6", MAC:"f2:49:7e:b4:97:fb", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 00:15:13.428391 containerd[1638]: 2025-09-09 00:15:13.146 [INFO][4680] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="82936021c77e4f236a93b782d162a4b8e0c59cf0a6c85ab7ea7ee8990c744223" Namespace="kube-system" Pod="coredns-668d6bf9bc-r6qzt" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--r6qzt-eth0" Sep 9 00:15:13.303158 systemd-networkd[1517]: calic46e4b5653a: Link UP Sep 9 00:15:13.451462 containerd[1638]: 2025-09-09 00:15:13.006 [INFO][4670] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--5d75874cd--5q9th-eth0 calico-kube-controllers-5d75874cd- calico-system 466bc793-f266-46d4-b563-49ff42f7b1ae 806 0 2025-09-09 00:14:41 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5d75874cd projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-5d75874cd-5q9th eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calic46e4b5653a [] [] }} ContainerID="4068fa249cd6e2d42c533d3765ab9df4f57b31ee60e507f17da21bd5b1e33833" Namespace="calico-system" 
Pod="calico-kube-controllers-5d75874cd-5q9th" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5d75874cd--5q9th-" Sep 9 00:15:13.451462 containerd[1638]: 2025-09-09 00:15:13.006 [INFO][4670] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4068fa249cd6e2d42c533d3765ab9df4f57b31ee60e507f17da21bd5b1e33833" Namespace="calico-system" Pod="calico-kube-controllers-5d75874cd-5q9th" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5d75874cd--5q9th-eth0" Sep 9 00:15:13.451462 containerd[1638]: 2025-09-09 00:15:13.078 [INFO][4699] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4068fa249cd6e2d42c533d3765ab9df4f57b31ee60e507f17da21bd5b1e33833" HandleID="k8s-pod-network.4068fa249cd6e2d42c533d3765ab9df4f57b31ee60e507f17da21bd5b1e33833" Workload="localhost-k8s-calico--kube--controllers--5d75874cd--5q9th-eth0" Sep 9 00:15:13.451462 containerd[1638]: 2025-09-09 00:15:13.078 [INFO][4699] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4068fa249cd6e2d42c533d3765ab9df4f57b31ee60e507f17da21bd5b1e33833" HandleID="k8s-pod-network.4068fa249cd6e2d42c533d3765ab9df4f57b31ee60e507f17da21bd5b1e33833" Workload="localhost-k8s-calico--kube--controllers--5d75874cd--5q9th-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003539e0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-5d75874cd-5q9th", "timestamp":"2025-09-09 00:15:13.078327583 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 00:15:13.451462 containerd[1638]: 2025-09-09 00:15:13.078 [INFO][4699] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 9 00:15:13.451462 containerd[1638]: 2025-09-09 00:15:13.117 [INFO][4699] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 00:15:13.451462 containerd[1638]: 2025-09-09 00:15:13.117 [INFO][4699] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 00:15:13.451462 containerd[1638]: 2025-09-09 00:15:13.187 [INFO][4699] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4068fa249cd6e2d42c533d3765ab9df4f57b31ee60e507f17da21bd5b1e33833" host="localhost" Sep 9 00:15:13.451462 containerd[1638]: 2025-09-09 00:15:13.190 [INFO][4699] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 00:15:13.451462 containerd[1638]: 2025-09-09 00:15:13.238 [INFO][4699] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 00:15:13.451462 containerd[1638]: 2025-09-09 00:15:13.239 [INFO][4699] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 00:15:13.451462 containerd[1638]: 2025-09-09 00:15:13.240 [INFO][4699] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 00:15:13.451462 containerd[1638]: 2025-09-09 00:15:13.240 [INFO][4699] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.4068fa249cd6e2d42c533d3765ab9df4f57b31ee60e507f17da21bd5b1e33833" host="localhost" Sep 9 00:15:13.451462 containerd[1638]: 2025-09-09 00:15:13.241 [INFO][4699] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.4068fa249cd6e2d42c533d3765ab9df4f57b31ee60e507f17da21bd5b1e33833 Sep 9 00:15:13.451462 containerd[1638]: 2025-09-09 00:15:13.264 [INFO][4699] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.4068fa249cd6e2d42c533d3765ab9df4f57b31ee60e507f17da21bd5b1e33833" host="localhost" Sep 9 00:15:13.451462 containerd[1638]: 2025-09-09 00:15:13.291 [INFO][4699] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.4068fa249cd6e2d42c533d3765ab9df4f57b31ee60e507f17da21bd5b1e33833" host="localhost" Sep 9 00:15:13.451462 containerd[1638]: 2025-09-09 00:15:13.296 [INFO][4699] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.4068fa249cd6e2d42c533d3765ab9df4f57b31ee60e507f17da21bd5b1e33833" host="localhost" Sep 9 00:15:13.451462 containerd[1638]: 2025-09-09 00:15:13.297 [INFO][4699] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 00:15:13.451462 containerd[1638]: 2025-09-09 00:15:13.297 [INFO][4699] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="4068fa249cd6e2d42c533d3765ab9df4f57b31ee60e507f17da21bd5b1e33833" HandleID="k8s-pod-network.4068fa249cd6e2d42c533d3765ab9df4f57b31ee60e507f17da21bd5b1e33833" Workload="localhost-k8s-calico--kube--controllers--5d75874cd--5q9th-eth0" Sep 9 00:15:13.303702 systemd-networkd[1517]: calic46e4b5653a: Gained carrier Sep 9 00:15:13.473957 containerd[1638]: 2025-09-09 00:15:13.299 [INFO][4670] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4068fa249cd6e2d42c533d3765ab9df4f57b31ee60e507f17da21bd5b1e33833" Namespace="calico-system" Pod="calico-kube-controllers-5d75874cd-5q9th" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5d75874cd--5q9th-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--5d75874cd--5q9th-eth0", GenerateName:"calico-kube-controllers-5d75874cd-", Namespace:"calico-system", SelfLink:"", UID:"466bc793-f266-46d4-b563-49ff42f7b1ae", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 0, 14, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5d75874cd", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-5d75874cd-5q9th", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic46e4b5653a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 00:15:13.473957 containerd[1638]: 2025-09-09 00:15:13.299 [INFO][4670] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="4068fa249cd6e2d42c533d3765ab9df4f57b31ee60e507f17da21bd5b1e33833" Namespace="calico-system" Pod="calico-kube-controllers-5d75874cd-5q9th" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5d75874cd--5q9th-eth0" Sep 9 00:15:13.473957 containerd[1638]: 2025-09-09 00:15:13.299 [INFO][4670] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic46e4b5653a ContainerID="4068fa249cd6e2d42c533d3765ab9df4f57b31ee60e507f17da21bd5b1e33833" Namespace="calico-system" Pod="calico-kube-controllers-5d75874cd-5q9th" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5d75874cd--5q9th-eth0" Sep 9 00:15:13.473957 containerd[1638]: 2025-09-09 00:15:13.303 [INFO][4670] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4068fa249cd6e2d42c533d3765ab9df4f57b31ee60e507f17da21bd5b1e33833" 
Namespace="calico-system" Pod="calico-kube-controllers-5d75874cd-5q9th" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5d75874cd--5q9th-eth0" Sep 9 00:15:13.473957 containerd[1638]: 2025-09-09 00:15:13.304 [INFO][4670] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4068fa249cd6e2d42c533d3765ab9df4f57b31ee60e507f17da21bd5b1e33833" Namespace="calico-system" Pod="calico-kube-controllers-5d75874cd-5q9th" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5d75874cd--5q9th-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--5d75874cd--5q9th-eth0", GenerateName:"calico-kube-controllers-5d75874cd-", Namespace:"calico-system", SelfLink:"", UID:"466bc793-f266-46d4-b563-49ff42f7b1ae", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 0, 14, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5d75874cd", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"4068fa249cd6e2d42c533d3765ab9df4f57b31ee60e507f17da21bd5b1e33833", Pod:"calico-kube-controllers-5d75874cd-5q9th", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, 
InterfaceName:"calic46e4b5653a", MAC:"62:53:33:5a:6b:fe", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 00:15:13.473957 containerd[1638]: 2025-09-09 00:15:13.355 [INFO][4670] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4068fa249cd6e2d42c533d3765ab9df4f57b31ee60e507f17da21bd5b1e33833" Namespace="calico-system" Pod="calico-kube-controllers-5d75874cd-5q9th" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5d75874cd--5q9th-eth0" Sep 9 00:15:13.551202 systemd-networkd[1517]: cali086d8c37bdc: Link UP Sep 9 00:15:13.552417 systemd-networkd[1517]: cali086d8c37bdc: Gained carrier Sep 9 00:15:13.571360 containerd[1638]: 2025-09-09 00:15:13.451 [INFO][4737] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--j6h4t-eth0 csi-node-driver- calico-system 2778c8f3-9140-419e-8f7e-95ecd474a55a 671 0 2025-09-09 00:14:41 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-j6h4t eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali086d8c37bdc [] [] }} ContainerID="c50562b2ee1717606cec84033b00ba8d5aff3bfde1ce2189c10b0e33f90cb88e" Namespace="calico-system" Pod="csi-node-driver-j6h4t" WorkloadEndpoint="localhost-k8s-csi--node--driver--j6h4t-" Sep 9 00:15:13.571360 containerd[1638]: 2025-09-09 00:15:13.451 [INFO][4737] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c50562b2ee1717606cec84033b00ba8d5aff3bfde1ce2189c10b0e33f90cb88e" Namespace="calico-system" Pod="csi-node-driver-j6h4t" WorkloadEndpoint="localhost-k8s-csi--node--driver--j6h4t-eth0" Sep 9 
00:15:13.571360 containerd[1638]: 2025-09-09 00:15:13.468 [INFO][4750] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c50562b2ee1717606cec84033b00ba8d5aff3bfde1ce2189c10b0e33f90cb88e" HandleID="k8s-pod-network.c50562b2ee1717606cec84033b00ba8d5aff3bfde1ce2189c10b0e33f90cb88e" Workload="localhost-k8s-csi--node--driver--j6h4t-eth0" Sep 9 00:15:13.571360 containerd[1638]: 2025-09-09 00:15:13.468 [INFO][4750] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c50562b2ee1717606cec84033b00ba8d5aff3bfde1ce2189c10b0e33f90cb88e" HandleID="k8s-pod-network.c50562b2ee1717606cec84033b00ba8d5aff3bfde1ce2189c10b0e33f90cb88e" Workload="localhost-k8s-csi--node--driver--j6h4t-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f640), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-j6h4t", "timestamp":"2025-09-09 00:15:13.468122843 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 00:15:13.571360 containerd[1638]: 2025-09-09 00:15:13.468 [INFO][4750] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 00:15:13.571360 containerd[1638]: 2025-09-09 00:15:13.468 [INFO][4750] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 00:15:13.571360 containerd[1638]: 2025-09-09 00:15:13.468 [INFO][4750] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 00:15:13.571360 containerd[1638]: 2025-09-09 00:15:13.485 [INFO][4750] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c50562b2ee1717606cec84033b00ba8d5aff3bfde1ce2189c10b0e33f90cb88e" host="localhost" Sep 9 00:15:13.571360 containerd[1638]: 2025-09-09 00:15:13.488 [INFO][4750] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 00:15:13.571360 containerd[1638]: 2025-09-09 00:15:13.490 [INFO][4750] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 00:15:13.571360 containerd[1638]: 2025-09-09 00:15:13.494 [INFO][4750] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 00:15:13.571360 containerd[1638]: 2025-09-09 00:15:13.496 [INFO][4750] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 00:15:13.571360 containerd[1638]: 2025-09-09 00:15:13.496 [INFO][4750] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.c50562b2ee1717606cec84033b00ba8d5aff3bfde1ce2189c10b0e33f90cb88e" host="localhost" Sep 9 00:15:13.571360 containerd[1638]: 2025-09-09 00:15:13.498 [INFO][4750] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c50562b2ee1717606cec84033b00ba8d5aff3bfde1ce2189c10b0e33f90cb88e Sep 9 00:15:13.571360 containerd[1638]: 2025-09-09 00:15:13.518 [INFO][4750] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.c50562b2ee1717606cec84033b00ba8d5aff3bfde1ce2189c10b0e33f90cb88e" host="localhost" Sep 9 00:15:13.571360 containerd[1638]: 2025-09-09 00:15:13.542 [INFO][4750] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 
handle="k8s-pod-network.c50562b2ee1717606cec84033b00ba8d5aff3bfde1ce2189c10b0e33f90cb88e" host="localhost" Sep 9 00:15:13.571360 containerd[1638]: 2025-09-09 00:15:13.542 [INFO][4750] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.c50562b2ee1717606cec84033b00ba8d5aff3bfde1ce2189c10b0e33f90cb88e" host="localhost" Sep 9 00:15:13.571360 containerd[1638]: 2025-09-09 00:15:13.542 [INFO][4750] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 00:15:13.571360 containerd[1638]: 2025-09-09 00:15:13.542 [INFO][4750] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="c50562b2ee1717606cec84033b00ba8d5aff3bfde1ce2189c10b0e33f90cb88e" HandleID="k8s-pod-network.c50562b2ee1717606cec84033b00ba8d5aff3bfde1ce2189c10b0e33f90cb88e" Workload="localhost-k8s-csi--node--driver--j6h4t-eth0" Sep 9 00:15:13.572105 containerd[1638]: 2025-09-09 00:15:13.544 [INFO][4737] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c50562b2ee1717606cec84033b00ba8d5aff3bfde1ce2189c10b0e33f90cb88e" Namespace="calico-system" Pod="csi-node-driver-j6h4t" WorkloadEndpoint="localhost-k8s-csi--node--driver--j6h4t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--j6h4t-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"2778c8f3-9140-419e-8f7e-95ecd474a55a", ResourceVersion:"671", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 0, 14, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-j6h4t", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali086d8c37bdc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 00:15:13.572105 containerd[1638]: 2025-09-09 00:15:13.544 [INFO][4737] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="c50562b2ee1717606cec84033b00ba8d5aff3bfde1ce2189c10b0e33f90cb88e" Namespace="calico-system" Pod="csi-node-driver-j6h4t" WorkloadEndpoint="localhost-k8s-csi--node--driver--j6h4t-eth0" Sep 9 00:15:13.572105 containerd[1638]: 2025-09-09 00:15:13.544 [INFO][4737] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali086d8c37bdc ContainerID="c50562b2ee1717606cec84033b00ba8d5aff3bfde1ce2189c10b0e33f90cb88e" Namespace="calico-system" Pod="csi-node-driver-j6h4t" WorkloadEndpoint="localhost-k8s-csi--node--driver--j6h4t-eth0" Sep 9 00:15:13.572105 containerd[1638]: 2025-09-09 00:15:13.552 [INFO][4737] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c50562b2ee1717606cec84033b00ba8d5aff3bfde1ce2189c10b0e33f90cb88e" Namespace="calico-system" Pod="csi-node-driver-j6h4t" WorkloadEndpoint="localhost-k8s-csi--node--driver--j6h4t-eth0" Sep 9 00:15:13.572105 containerd[1638]: 2025-09-09 00:15:13.553 [INFO][4737] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c50562b2ee1717606cec84033b00ba8d5aff3bfde1ce2189c10b0e33f90cb88e" 
Namespace="calico-system" Pod="csi-node-driver-j6h4t" WorkloadEndpoint="localhost-k8s-csi--node--driver--j6h4t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--j6h4t-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"2778c8f3-9140-419e-8f7e-95ecd474a55a", ResourceVersion:"671", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 0, 14, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"c50562b2ee1717606cec84033b00ba8d5aff3bfde1ce2189c10b0e33f90cb88e", Pod:"csi-node-driver-j6h4t", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali086d8c37bdc", MAC:"7e:38:66:17:89:a0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 00:15:13.572105 containerd[1638]: 2025-09-09 00:15:13.569 [INFO][4737] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c50562b2ee1717606cec84033b00ba8d5aff3bfde1ce2189c10b0e33f90cb88e" Namespace="calico-system" Pod="csi-node-driver-j6h4t" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--j6h4t-eth0" Sep 9 00:15:13.657518 systemd-networkd[1517]: cali998dbd384e2: Link UP Sep 9 00:15:13.658146 systemd-networkd[1517]: cali998dbd384e2: Gained carrier Sep 9 00:15:13.702232 containerd[1638]: 2025-09-09 00:15:13.520 [INFO][4757] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--7ff777577d--c77vc-eth0 calico-apiserver-7ff777577d- calico-apiserver 9319c330-61e7-4f2f-810d-926a85cebaee 804 0 2025-09-09 00:14:38 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7ff777577d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-7ff777577d-c77vc eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali998dbd384e2 [] [] }} ContainerID="548e42e92fb4243bee17ef97d5ca9c90c02de7b18e22fd7881f4d86b78d6dd03" Namespace="calico-apiserver" Pod="calico-apiserver-7ff777577d-c77vc" WorkloadEndpoint="localhost-k8s-calico--apiserver--7ff777577d--c77vc-" Sep 9 00:15:13.702232 containerd[1638]: 2025-09-09 00:15:13.521 [INFO][4757] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="548e42e92fb4243bee17ef97d5ca9c90c02de7b18e22fd7881f4d86b78d6dd03" Namespace="calico-apiserver" Pod="calico-apiserver-7ff777577d-c77vc" WorkloadEndpoint="localhost-k8s-calico--apiserver--7ff777577d--c77vc-eth0" Sep 9 00:15:13.702232 containerd[1638]: 2025-09-09 00:15:13.591 [INFO][4770] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="548e42e92fb4243bee17ef97d5ca9c90c02de7b18e22fd7881f4d86b78d6dd03" HandleID="k8s-pod-network.548e42e92fb4243bee17ef97d5ca9c90c02de7b18e22fd7881f4d86b78d6dd03" Workload="localhost-k8s-calico--apiserver--7ff777577d--c77vc-eth0" Sep 9 00:15:13.702232 containerd[1638]: 
2025-09-09 00:15:13.591 [INFO][4770] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="548e42e92fb4243bee17ef97d5ca9c90c02de7b18e22fd7881f4d86b78d6dd03" HandleID="k8s-pod-network.548e42e92fb4243bee17ef97d5ca9c90c02de7b18e22fd7881f4d86b78d6dd03" Workload="localhost-k8s-calico--apiserver--7ff777577d--c77vc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d4fe0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-7ff777577d-c77vc", "timestamp":"2025-09-09 00:15:13.591358208 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 00:15:13.702232 containerd[1638]: 2025-09-09 00:15:13.591 [INFO][4770] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 00:15:13.702232 containerd[1638]: 2025-09-09 00:15:13.591 [INFO][4770] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 00:15:13.702232 containerd[1638]: 2025-09-09 00:15:13.591 [INFO][4770] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 00:15:13.702232 containerd[1638]: 2025-09-09 00:15:13.603 [INFO][4770] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.548e42e92fb4243bee17ef97d5ca9c90c02de7b18e22fd7881f4d86b78d6dd03" host="localhost" Sep 9 00:15:13.702232 containerd[1638]: 2025-09-09 00:15:13.606 [INFO][4770] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 00:15:13.702232 containerd[1638]: 2025-09-09 00:15:13.609 [INFO][4770] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 00:15:13.702232 containerd[1638]: 2025-09-09 00:15:13.610 [INFO][4770] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 00:15:13.702232 containerd[1638]: 2025-09-09 00:15:13.611 [INFO][4770] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 00:15:13.702232 containerd[1638]: 2025-09-09 00:15:13.611 [INFO][4770] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.548e42e92fb4243bee17ef97d5ca9c90c02de7b18e22fd7881f4d86b78d6dd03" host="localhost" Sep 9 00:15:13.702232 containerd[1638]: 2025-09-09 00:15:13.612 [INFO][4770] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.548e42e92fb4243bee17ef97d5ca9c90c02de7b18e22fd7881f4d86b78d6dd03 Sep 9 00:15:13.702232 containerd[1638]: 2025-09-09 00:15:13.631 [INFO][4770] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.548e42e92fb4243bee17ef97d5ca9c90c02de7b18e22fd7881f4d86b78d6dd03" host="localhost" Sep 9 00:15:13.702232 containerd[1638]: 2025-09-09 00:15:13.653 [INFO][4770] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 
handle="k8s-pod-network.548e42e92fb4243bee17ef97d5ca9c90c02de7b18e22fd7881f4d86b78d6dd03" host="localhost" Sep 9 00:15:13.702232 containerd[1638]: 2025-09-09 00:15:13.653 [INFO][4770] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.548e42e92fb4243bee17ef97d5ca9c90c02de7b18e22fd7881f4d86b78d6dd03" host="localhost" Sep 9 00:15:13.702232 containerd[1638]: 2025-09-09 00:15:13.653 [INFO][4770] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 00:15:13.702232 containerd[1638]: 2025-09-09 00:15:13.653 [INFO][4770] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="548e42e92fb4243bee17ef97d5ca9c90c02de7b18e22fd7881f4d86b78d6dd03" HandleID="k8s-pod-network.548e42e92fb4243bee17ef97d5ca9c90c02de7b18e22fd7881f4d86b78d6dd03" Workload="localhost-k8s-calico--apiserver--7ff777577d--c77vc-eth0" Sep 9 00:15:13.725366 containerd[1638]: 2025-09-09 00:15:13.655 [INFO][4757] cni-plugin/k8s.go 418: Populated endpoint ContainerID="548e42e92fb4243bee17ef97d5ca9c90c02de7b18e22fd7881f4d86b78d6dd03" Namespace="calico-apiserver" Pod="calico-apiserver-7ff777577d-c77vc" WorkloadEndpoint="localhost-k8s-calico--apiserver--7ff777577d--c77vc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7ff777577d--c77vc-eth0", GenerateName:"calico-apiserver-7ff777577d-", Namespace:"calico-apiserver", SelfLink:"", UID:"9319c330-61e7-4f2f-810d-926a85cebaee", ResourceVersion:"804", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 0, 14, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7ff777577d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-7ff777577d-c77vc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali998dbd384e2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 00:15:13.725366 containerd[1638]: 2025-09-09 00:15:13.655 [INFO][4757] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="548e42e92fb4243bee17ef97d5ca9c90c02de7b18e22fd7881f4d86b78d6dd03" Namespace="calico-apiserver" Pod="calico-apiserver-7ff777577d-c77vc" WorkloadEndpoint="localhost-k8s-calico--apiserver--7ff777577d--c77vc-eth0" Sep 9 00:15:13.725366 containerd[1638]: 2025-09-09 00:15:13.655 [INFO][4757] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali998dbd384e2 ContainerID="548e42e92fb4243bee17ef97d5ca9c90c02de7b18e22fd7881f4d86b78d6dd03" Namespace="calico-apiserver" Pod="calico-apiserver-7ff777577d-c77vc" WorkloadEndpoint="localhost-k8s-calico--apiserver--7ff777577d--c77vc-eth0" Sep 9 00:15:13.725366 containerd[1638]: 2025-09-09 00:15:13.658 [INFO][4757] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="548e42e92fb4243bee17ef97d5ca9c90c02de7b18e22fd7881f4d86b78d6dd03" Namespace="calico-apiserver" Pod="calico-apiserver-7ff777577d-c77vc" WorkloadEndpoint="localhost-k8s-calico--apiserver--7ff777577d--c77vc-eth0" Sep 9 00:15:13.725366 containerd[1638]: 2025-09-09 00:15:13.658 [INFO][4757] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID 
to endpoint ContainerID="548e42e92fb4243bee17ef97d5ca9c90c02de7b18e22fd7881f4d86b78d6dd03" Namespace="calico-apiserver" Pod="calico-apiserver-7ff777577d-c77vc" WorkloadEndpoint="localhost-k8s-calico--apiserver--7ff777577d--c77vc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7ff777577d--c77vc-eth0", GenerateName:"calico-apiserver-7ff777577d-", Namespace:"calico-apiserver", SelfLink:"", UID:"9319c330-61e7-4f2f-810d-926a85cebaee", ResourceVersion:"804", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 0, 14, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7ff777577d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"548e42e92fb4243bee17ef97d5ca9c90c02de7b18e22fd7881f4d86b78d6dd03", Pod:"calico-apiserver-7ff777577d-c77vc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali998dbd384e2", MAC:"a2:23:97:3b:24:e3", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 9 00:15:13.725366 containerd[1638]: 2025-09-09 00:15:13.700 [INFO][4757] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="548e42e92fb4243bee17ef97d5ca9c90c02de7b18e22fd7881f4d86b78d6dd03" Namespace="calico-apiserver" Pod="calico-apiserver-7ff777577d-c77vc" WorkloadEndpoint="localhost-k8s-calico--apiserver--7ff777577d--c77vc-eth0"
Sep 9 00:15:14.076655 containerd[1638]: time="2025-09-09T00:15:14.076624499Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 00:15:14.143357 systemd-networkd[1517]: cali221b94ff8f6: Gained IPv6LL
Sep 9 00:15:14.286007 containerd[1638]: time="2025-09-09T00:15:14.285945841Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864"
Sep 9 00:15:14.401297 containerd[1638]: time="2025-09-09T00:15:14.401175000Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 00:15:14.497453 containerd[1638]: time="2025-09-09T00:15:14.497276466Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 00:15:14.502805 containerd[1638]: time="2025-09-09T00:15:14.502767374Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 4.905435683s"
Sep 9 00:15:14.502912 containerd[1638]: time="2025-09-09T00:15:14.502901974Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\""
Sep 9 00:15:14.507516 containerd[1638]: time="2025-09-09T00:15:14.507492477Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\""
Sep 9 00:15:14.509380 containerd[1638]: time="2025-09-09T00:15:14.509362302Z" level=info msg="CreateContainer within sandbox \"92323a44d18bad275a0ff0e72a00b6bf64e7f2bf5fd18c647d7ee2a473bffa51\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Sep 9 00:15:14.517274 containerd[1638]: time="2025-09-09T00:15:14.516484354Z" level=info msg="connecting to shim 4068fa249cd6e2d42c533d3765ab9df4f57b31ee60e507f17da21bd5b1e33833" address="unix:///run/containerd/s/f82bbea83d044523bcf6261a896a43ca1006665db474e5088833b21836c0a058" namespace=k8s.io protocol=ttrpc version=3
Sep 9 00:15:14.518841 containerd[1638]: time="2025-09-09T00:15:14.518821328Z" level=info msg="connecting to shim 82936021c77e4f236a93b782d162a4b8e0c59cf0a6c85ab7ea7ee8990c744223" address="unix:///run/containerd/s/568a51d80d5a5ee67d346f30414beea6270e604262b32149d05f177bd827b625" namespace=k8s.io protocol=ttrpc version=3
Sep 9 00:15:14.525366 containerd[1638]: time="2025-09-09T00:15:14.525340785Z" level=info msg="connecting to shim c50562b2ee1717606cec84033b00ba8d5aff3bfde1ce2189c10b0e33f90cb88e" address="unix:///run/containerd/s/16724de8f8a7bf76bb400d1f708b044651da94dbd149504f8261ffdf50986e3e" namespace=k8s.io protocol=ttrpc version=3
Sep 9 00:15:14.525901 containerd[1638]: time="2025-09-09T00:15:14.525880808Z" level=info msg="Container 50f40ddb56b4c8eecac1e09c391ffd1191e99ae0f8faff592cd2f1d59ed8a4ce: CDI devices from CRI Config.CDIDevices: []"
Sep 9 00:15:14.532996 containerd[1638]: time="2025-09-09T00:15:14.532963961Z" level=info msg="connecting to shim 548e42e92fb4243bee17ef97d5ca9c90c02de7b18e22fd7881f4d86b78d6dd03" address="unix:///run/containerd/s/847417d5e2be904329f31dd9bbe56614ba43aa4cbb4d3d8b4a977381137a824b" namespace=k8s.io protocol=ttrpc version=3
Sep 9 00:15:14.538207 containerd[1638]: time="2025-09-09T00:15:14.538181699Z" level=info msg="CreateContainer within sandbox \"92323a44d18bad275a0ff0e72a00b6bf64e7f2bf5fd18c647d7ee2a473bffa51\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"50f40ddb56b4c8eecac1e09c391ffd1191e99ae0f8faff592cd2f1d59ed8a4ce\""
Sep 9 00:15:14.540725 containerd[1638]: time="2025-09-09T00:15:14.540239920Z" level=info msg="StartContainer for \"50f40ddb56b4c8eecac1e09c391ffd1191e99ae0f8faff592cd2f1d59ed8a4ce\""
Sep 9 00:15:14.541764 containerd[1638]: time="2025-09-09T00:15:14.541745585Z" level=info msg="connecting to shim 50f40ddb56b4c8eecac1e09c391ffd1191e99ae0f8faff592cd2f1d59ed8a4ce" address="unix:///run/containerd/s/d0328b682fa1e1cdd4cc2880f320444667677e4f61c7693554360126f1b3a7ee" protocol=ttrpc version=3
Sep 9 00:15:14.561335 systemd[1]: Started cri-containerd-4068fa249cd6e2d42c533d3765ab9df4f57b31ee60e507f17da21bd5b1e33833.scope - libcontainer container 4068fa249cd6e2d42c533d3765ab9df4f57b31ee60e507f17da21bd5b1e33833.
Sep 9 00:15:14.580529 systemd[1]: Started cri-containerd-82936021c77e4f236a93b782d162a4b8e0c59cf0a6c85ab7ea7ee8990c744223.scope - libcontainer container 82936021c77e4f236a93b782d162a4b8e0c59cf0a6c85ab7ea7ee8990c744223.
Sep 9 00:15:14.599298 systemd[1]: Started cri-containerd-50f40ddb56b4c8eecac1e09c391ffd1191e99ae0f8faff592cd2f1d59ed8a4ce.scope - libcontainer container 50f40ddb56b4c8eecac1e09c391ffd1191e99ae0f8faff592cd2f1d59ed8a4ce.
Sep 9 00:15:14.603999 systemd-resolved[1518]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address
Sep 9 00:15:14.609332 systemd[1]: Started cri-containerd-c50562b2ee1717606cec84033b00ba8d5aff3bfde1ce2189c10b0e33f90cb88e.scope - libcontainer container c50562b2ee1717606cec84033b00ba8d5aff3bfde1ce2189c10b0e33f90cb88e.
Sep 9 00:15:14.622897 systemd[1]: Started cri-containerd-548e42e92fb4243bee17ef97d5ca9c90c02de7b18e22fd7881f4d86b78d6dd03.scope - libcontainer container 548e42e92fb4243bee17ef97d5ca9c90c02de7b18e22fd7881f4d86b78d6dd03.
Sep 9 00:15:14.628960 systemd-resolved[1518]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address
Sep 9 00:15:14.679315 containerd[1638]: time="2025-09-09T00:15:14.678764480Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-r6qzt,Uid:37daf8f3-9d85-49c3-ba4d-11feef4e10ab,Namespace:kube-system,Attempt:0,} returns sandbox id \"82936021c77e4f236a93b782d162a4b8e0c59cf0a6c85ab7ea7ee8990c744223\""
Sep 9 00:15:14.679001 systemd-resolved[1518]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address
Sep 9 00:15:14.707777 systemd-resolved[1518]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address
Sep 9 00:15:14.719291 systemd-networkd[1517]: cali086d8c37bdc: Gained IPv6LL
Sep 9 00:15:14.738103 containerd[1638]: time="2025-09-09T00:15:14.737999233Z" level=info msg="CreateContainer within sandbox \"82936021c77e4f236a93b782d162a4b8e0c59cf0a6c85ab7ea7ee8990c744223\" for container &ContainerMetadata{Name:coredns,Attempt:0,}"
Sep 9 00:15:14.803792 containerd[1638]: time="2025-09-09T00:15:14.803212469Z" level=info msg="Container 31fedb7ea298d7d69d2ff0dc182d86824f53f465f1bb9b6e29947ffffc08ac36: CDI devices from CRI Config.CDIDevices: []"
Sep 9 00:15:14.829759 containerd[1638]: time="2025-09-09T00:15:14.829694208Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-j6h4t,Uid:2778c8f3-9140-419e-8f7e-95ecd474a55a,Namespace:calico-system,Attempt:0,} returns sandbox id \"c50562b2ee1717606cec84033b00ba8d5aff3bfde1ce2189c10b0e33f90cb88e\""
Sep 9 00:15:14.829862 containerd[1638]: time="2025-09-09T00:15:14.829787098Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5d75874cd-5q9th,Uid:466bc793-f266-46d4-b563-49ff42f7b1ae,Namespace:calico-system,Attempt:0,} returns sandbox id \"4068fa249cd6e2d42c533d3765ab9df4f57b31ee60e507f17da21bd5b1e33833\""
Sep 9 00:15:14.842808 containerd[1638]: time="2025-09-09T00:15:14.842752056Z" level=info msg="StartContainer for \"50f40ddb56b4c8eecac1e09c391ffd1191e99ae0f8faff592cd2f1d59ed8a4ce\" returns successfully"
Sep 9 00:15:14.877560 containerd[1638]: time="2025-09-09T00:15:14.877517189Z" level=info msg="CreateContainer within sandbox \"82936021c77e4f236a93b782d162a4b8e0c59cf0a6c85ab7ea7ee8990c744223\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"31fedb7ea298d7d69d2ff0dc182d86824f53f465f1bb9b6e29947ffffc08ac36\""
Sep 9 00:15:14.879509 containerd[1638]: time="2025-09-09T00:15:14.879480608Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7ff777577d-c77vc,Uid:9319c330-61e7-4f2f-810d-926a85cebaee,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"548e42e92fb4243bee17ef97d5ca9c90c02de7b18e22fd7881f4d86b78d6dd03\""
Sep 9 00:15:14.880088 containerd[1638]: time="2025-09-09T00:15:14.880071616Z" level=info msg="StartContainer for \"31fedb7ea298d7d69d2ff0dc182d86824f53f465f1bb9b6e29947ffffc08ac36\""
Sep 9 00:15:14.881952 containerd[1638]: time="2025-09-09T00:15:14.881930663Z" level=info msg="connecting to shim 31fedb7ea298d7d69d2ff0dc182d86824f53f465f1bb9b6e29947ffffc08ac36" address="unix:///run/containerd/s/568a51d80d5a5ee67d346f30414beea6270e604262b32149d05f177bd827b625" protocol=ttrpc version=3
Sep 9 00:15:14.885935 containerd[1638]: time="2025-09-09T00:15:14.885757610Z" level=info msg="CreateContainer within sandbox \"548e42e92fb4243bee17ef97d5ca9c90c02de7b18e22fd7881f4d86b78d6dd03\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Sep 9 00:15:14.903620 containerd[1638]: time="2025-09-09T00:15:14.903598563Z" level=info msg="Container 62e0968879c0d771823072420313b8dbb9575ede3c182ec037e8dadc91c25e76: CDI devices from CRI Config.CDIDevices: []"
Sep 9 00:15:14.919568 systemd[1]: Started cri-containerd-31fedb7ea298d7d69d2ff0dc182d86824f53f465f1bb9b6e29947ffffc08ac36.scope - libcontainer container 31fedb7ea298d7d69d2ff0dc182d86824f53f465f1bb9b6e29947ffffc08ac36.
Sep 9 00:15:14.945509 containerd[1638]: time="2025-09-09T00:15:14.945442969Z" level=info msg="CreateContainer within sandbox \"548e42e92fb4243bee17ef97d5ca9c90c02de7b18e22fd7881f4d86b78d6dd03\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"62e0968879c0d771823072420313b8dbb9575ede3c182ec037e8dadc91c25e76\""
Sep 9 00:15:14.947070 containerd[1638]: time="2025-09-09T00:15:14.946370275Z" level=info msg="StartContainer for \"62e0968879c0d771823072420313b8dbb9575ede3c182ec037e8dadc91c25e76\""
Sep 9 00:15:14.947070 containerd[1638]: time="2025-09-09T00:15:14.946974324Z" level=info msg="connecting to shim 62e0968879c0d771823072420313b8dbb9575ede3c182ec037e8dadc91c25e76" address="unix:///run/containerd/s/847417d5e2be904329f31dd9bbe56614ba43aa4cbb4d3d8b4a977381137a824b" protocol=ttrpc version=3
Sep 9 00:15:14.961998 containerd[1638]: time="2025-09-09T00:15:14.961976735Z" level=info msg="StartContainer for \"31fedb7ea298d7d69d2ff0dc182d86824f53f465f1bb9b6e29947ffffc08ac36\" returns successfully"
Sep 9 00:15:14.977670 systemd[1]: Started cri-containerd-62e0968879c0d771823072420313b8dbb9575ede3c182ec037e8dadc91c25e76.scope - libcontainer container 62e0968879c0d771823072420313b8dbb9575ede3c182ec037e8dadc91c25e76.
Sep 9 00:15:15.039298 systemd-networkd[1517]: cali998dbd384e2: Gained IPv6LL
Sep 9 00:15:15.044476 containerd[1638]: time="2025-09-09T00:15:15.044447049Z" level=info msg="StartContainer for \"62e0968879c0d771823072420313b8dbb9575ede3c182ec037e8dadc91c25e76\" returns successfully"
Sep 9 00:15:15.103877 systemd-networkd[1517]: calic46e4b5653a: Gained IPv6LL
Sep 9 00:15:15.129931 kubelet[2924]: I0909 00:15:15.129880 2924 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7ff777577d-6mmgm" podStartSLOduration=32.015266925 podStartE2EDuration="37.121308478s" podCreationTimestamp="2025-09-09 00:14:38 +0000 UTC" firstStartedPulling="2025-09-09 00:15:09.398132639 +0000 UTC m=+46.676303387" lastFinishedPulling="2025-09-09 00:15:14.50417419 +0000 UTC m=+51.782344940" observedRunningTime="2025-09-09 00:15:15.120556251 +0000 UTC m=+52.398727003" watchObservedRunningTime="2025-09-09 00:15:15.121308478 +0000 UTC m=+52.399479235"
Sep 9 00:15:15.138409 kubelet[2924]: I0909 00:15:15.138372 2924 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-r6qzt" podStartSLOduration=47.138360812 podStartE2EDuration="47.138360812s" podCreationTimestamp="2025-09-09 00:14:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 00:15:15.137350201 +0000 UTC m=+52.415520953" watchObservedRunningTime="2025-09-09 00:15:15.138360812 +0000 UTC m=+52.416531564"
Sep 9 00:15:15.169012 kubelet[2924]: I0909 00:15:15.168968 2924 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7ff777577d-c77vc" podStartSLOduration=37.168956885 podStartE2EDuration="37.168956885s" podCreationTimestamp="2025-09-09 00:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 00:15:15.149619659 +0000 UTC m=+52.427790422" watchObservedRunningTime="2025-09-09 00:15:15.168956885 +0000 UTC m=+52.447127637"
Sep 9 00:15:16.126792 kubelet[2924]: I0909 00:15:16.119153 2924 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 9 00:15:18.601305 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount154896535.mount: Deactivated successfully.
Sep 9 00:15:21.466044 containerd[1638]: time="2025-09-09T00:15:21.466005786Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 00:15:21.513553 containerd[1638]: time="2025-09-09T00:15:21.498387694Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526"
Sep 9 00:15:21.570264 containerd[1638]: time="2025-09-09T00:15:21.570203670Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 00:15:21.651874 containerd[1638]: time="2025-09-09T00:15:21.651827054Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 00:15:21.661283 containerd[1638]: time="2025-09-09T00:15:21.653777415Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 7.144574551s"
Sep 9 00:15:21.661283 containerd[1638]: time="2025-09-09T00:15:21.653799257Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\""
Sep 9 00:15:22.007954 containerd[1638]: time="2025-09-09T00:15:22.007919803Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\""
Sep 9 00:15:22.736033 containerd[1638]: time="2025-09-09T00:15:22.735620207Z" level=info msg="CreateContainer within sandbox \"19fa667de9e658f8cca6d337ef1dcf9162d63eca7ffe44552fe8c619308244ef\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}"
Sep 9 00:15:22.746814 containerd[1638]: time="2025-09-09T00:15:22.746780597Z" level=info msg="Container 257f3cc7ef5bf930ca84e2c4921dfb42f0457540dbd3d6e0c6aab76d2bd4ed01: CDI devices from CRI Config.CDIDevices: []"
Sep 9 00:15:22.777398 containerd[1638]: time="2025-09-09T00:15:22.777292863Z" level=info msg="CreateContainer within sandbox \"19fa667de9e658f8cca6d337ef1dcf9162d63eca7ffe44552fe8c619308244ef\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"257f3cc7ef5bf930ca84e2c4921dfb42f0457540dbd3d6e0c6aab76d2bd4ed01\""
Sep 9 00:15:22.783059 containerd[1638]: time="2025-09-09T00:15:22.782907081Z" level=info msg="StartContainer for \"257f3cc7ef5bf930ca84e2c4921dfb42f0457540dbd3d6e0c6aab76d2bd4ed01\""
Sep 9 00:15:22.787713 containerd[1638]: time="2025-09-09T00:15:22.787003213Z" level=info msg="connecting to shim 257f3cc7ef5bf930ca84e2c4921dfb42f0457540dbd3d6e0c6aab76d2bd4ed01" address="unix:///run/containerd/s/85103475c6e2931cd1353b9aea8c55d412a2f07292730bf40ee95abd3c3feab4" protocol=ttrpc version=3
Sep 9 00:15:23.002223 systemd[1]: Started cri-containerd-257f3cc7ef5bf930ca84e2c4921dfb42f0457540dbd3d6e0c6aab76d2bd4ed01.scope - libcontainer container 257f3cc7ef5bf930ca84e2c4921dfb42f0457540dbd3d6e0c6aab76d2bd4ed01.
Sep 9 00:15:23.095757 containerd[1638]: time="2025-09-09T00:15:23.095721718Z" level=info msg="StartContainer for \"257f3cc7ef5bf930ca84e2c4921dfb42f0457540dbd3d6e0c6aab76d2bd4ed01\" returns successfully"
Sep 9 00:15:23.885545 containerd[1638]: time="2025-09-09T00:15:23.885490386Z" level=info msg="TaskExit event in podsandbox handler container_id:\"257f3cc7ef5bf930ca84e2c4921dfb42f0457540dbd3d6e0c6aab76d2bd4ed01\" id:\"a6800391a01687c1d0d34bdc7d95bf33506aaf55aedf5351d9540a8e6e9dcfff\" pid:5157 exit_status:1 exited_at:{seconds:1757376923 nanos:798892169}"
Sep 9 00:15:24.752528 containerd[1638]: time="2025-09-09T00:15:24.752497377Z" level=info msg="TaskExit event in podsandbox handler container_id:\"257f3cc7ef5bf930ca84e2c4921dfb42f0457540dbd3d6e0c6aab76d2bd4ed01\" id:\"de5a384dc183d92f53ff12938fc215a3a5706a3c65bd7b221d168fbdfcda95d3\" pid:5189 exit_status:1 exited_at:{seconds:1757376924 nanos:752314325}"
Sep 9 00:15:24.927855 containerd[1638]: time="2025-09-09T00:15:24.927312618Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 00:15:24.929434 containerd[1638]: time="2025-09-09T00:15:24.929374559Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527"
Sep 9 00:15:24.932032 containerd[1638]: time="2025-09-09T00:15:24.931891945Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 00:15:24.940242 containerd[1638]: time="2025-09-09T00:15:24.939720863Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 00:15:24.940242 containerd[1638]: time="2025-09-09T00:15:24.940162406Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 2.932211194s"
Sep 9 00:15:24.940242 containerd[1638]: time="2025-09-09T00:15:24.940184372Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\""
Sep 9 00:15:24.949247 containerd[1638]: time="2025-09-09T00:15:24.940978881Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\""
Sep 9 00:15:24.975831 containerd[1638]: time="2025-09-09T00:15:24.975800223Z" level=info msg="CreateContainer within sandbox \"c50562b2ee1717606cec84033b00ba8d5aff3bfde1ce2189c10b0e33f90cb88e\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
Sep 9 00:15:25.088130 containerd[1638]: time="2025-09-09T00:15:25.087499074Z" level=info msg="Container 7c17006bf30e7524b2bdb83f27bbec39b97cb33e5fa67b36e2c2dbab32f04275: CDI devices from CRI Config.CDIDevices: []"
Sep 9 00:15:25.139230 containerd[1638]: time="2025-09-09T00:15:25.139192535Z" level=info msg="CreateContainer within sandbox \"c50562b2ee1717606cec84033b00ba8d5aff3bfde1ce2189c10b0e33f90cb88e\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"7c17006bf30e7524b2bdb83f27bbec39b97cb33e5fa67b36e2c2dbab32f04275\""
Sep 9 00:15:25.140282 containerd[1638]: time="2025-09-09T00:15:25.140229963Z" level=info msg="StartContainer for \"7c17006bf30e7524b2bdb83f27bbec39b97cb33e5fa67b36e2c2dbab32f04275\""
Sep 9 00:15:25.150615 containerd[1638]: time="2025-09-09T00:15:25.141755138Z" level=info msg="connecting to shim 7c17006bf30e7524b2bdb83f27bbec39b97cb33e5fa67b36e2c2dbab32f04275" address="unix:///run/containerd/s/16724de8f8a7bf76bb400d1f708b044651da94dbd149504f8261ffdf50986e3e" protocol=ttrpc version=3
Sep 9 00:15:25.163342 systemd[1]: Started cri-containerd-7c17006bf30e7524b2bdb83f27bbec39b97cb33e5fa67b36e2c2dbab32f04275.scope - libcontainer container 7c17006bf30e7524b2bdb83f27bbec39b97cb33e5fa67b36e2c2dbab32f04275.
Sep 9 00:15:25.198030 containerd[1638]: time="2025-09-09T00:15:25.198008638Z" level=info msg="StartContainer for \"7c17006bf30e7524b2bdb83f27bbec39b97cb33e5fa67b36e2c2dbab32f04275\" returns successfully"
Sep 9 00:15:25.696642 containerd[1638]: time="2025-09-09T00:15:25.696618733Z" level=info msg="TaskExit event in podsandbox handler container_id:\"257f3cc7ef5bf930ca84e2c4921dfb42f0457540dbd3d6e0c6aab76d2bd4ed01\" id:\"79a7e0f8792e774b00ae62fea75cce53f27eeefc173e466eea285bb788df363b\" pid:5240 exit_status:1 exited_at:{seconds:1757376925 nanos:696332241}"
Sep 9 00:15:28.823141 containerd[1638]: time="2025-09-09T00:15:28.823026981Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 00:15:28.903934 containerd[1638]: time="2025-09-09T00:15:28.903880719Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746"
Sep 9 00:15:29.037627 containerd[1638]: time="2025-09-09T00:15:29.037569425Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 00:15:29.215171 containerd[1638]: time="2025-09-09T00:15:29.215023590Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 00:15:29.216874 containerd[1638]: time="2025-09-09T00:15:29.216022051Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 4.275027889s"
Sep 9 00:15:29.216874 containerd[1638]: time="2025-09-09T00:15:29.216041198Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\""
Sep 9 00:15:30.368133 containerd[1638]: time="2025-09-09T00:15:30.366971140Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\""
Sep 9 00:15:30.751912 kubelet[2924]: E0909 00:15:30.751788 2924 kubelet.go:2573] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.565s"
Sep 9 00:15:31.405267 containerd[1638]: time="2025-09-09T00:15:31.405178861Z" level=info msg="CreateContainer within sandbox \"4068fa249cd6e2d42c533d3765ab9df4f57b31ee60e507f17da21bd5b1e33833\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Sep 9 00:15:31.413819 containerd[1638]: time="2025-09-09T00:15:31.413775072Z" level=info msg="Container f81404de208a4184f8195d9aa6fb30d8f420bb9fa9471d8b4c0a03c1492183c4: CDI devices from CRI Config.CDIDevices: []"
Sep 9 00:15:31.421783 containerd[1638]: time="2025-09-09T00:15:31.421715929Z" level=info msg="CreateContainer within sandbox \"4068fa249cd6e2d42c533d3765ab9df4f57b31ee60e507f17da21bd5b1e33833\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"f81404de208a4184f8195d9aa6fb30d8f420bb9fa9471d8b4c0a03c1492183c4\""
Sep 9 00:15:31.422795 containerd[1638]: time="2025-09-09T00:15:31.422767679Z" level=info msg="StartContainer for \"f81404de208a4184f8195d9aa6fb30d8f420bb9fa9471d8b4c0a03c1492183c4\""
Sep 9 00:15:31.424925 containerd[1638]: time="2025-09-09T00:15:31.424888507Z" level=info msg="connecting to shim f81404de208a4184f8195d9aa6fb30d8f420bb9fa9471d8b4c0a03c1492183c4" address="unix:///run/containerd/s/f82bbea83d044523bcf6261a896a43ca1006665db474e5088833b21836c0a058" protocol=ttrpc version=3
Sep 9 00:15:31.449287 systemd[1]: Started cri-containerd-f81404de208a4184f8195d9aa6fb30d8f420bb9fa9471d8b4c0a03c1492183c4.scope - libcontainer container f81404de208a4184f8195d9aa6fb30d8f420bb9fa9471d8b4c0a03c1492183c4.
Sep 9 00:15:31.511531 containerd[1638]: time="2025-09-09T00:15:31.511502425Z" level=info msg="StartContainer for \"f81404de208a4184f8195d9aa6fb30d8f420bb9fa9471d8b4c0a03c1492183c4\" returns successfully"
Sep 9 00:15:31.770424 containerd[1638]: time="2025-09-09T00:15:31.770092195Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f81404de208a4184f8195d9aa6fb30d8f420bb9fa9471d8b4c0a03c1492183c4\" id:\"c740ab673f0ff0e340930437f0bc5303b451fe228bb54cbbeab4980977bdcba4\" pid:5326 exited_at:{seconds:1757376931 nanos:740987460}"
Sep 9 00:15:31.874954 kubelet[2924]: I0909 00:15:31.847645 2924 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-bgldd" podStartSLOduration=41.069276819 podStartE2EDuration="51.845876736s" podCreationTimestamp="2025-09-09 00:14:40 +0000 UTC" firstStartedPulling="2025-09-09 00:15:11.039047736 +0000 UTC m=+48.317218486" lastFinishedPulling="2025-09-09 00:15:21.815647652 +0000 UTC m=+59.093818403" observedRunningTime="2025-09-09 00:15:23.701148584 +0000 UTC m=+60.979319336" watchObservedRunningTime="2025-09-09 00:15:31.845876736 +0000 UTC m=+69.124047484"
Sep 9 00:15:31.879954 kubelet[2924]: I0909 00:15:31.879903 2924 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-5d75874cd-5q9th" podStartSLOduration=35.371543013 podStartE2EDuration="50.879888143s" podCreationTimestamp="2025-09-09 00:14:41 +0000 UTC" firstStartedPulling="2025-09-09 00:15:14.855494314 +0000 UTC m=+52.133665064" lastFinishedPulling="2025-09-09 00:15:30.363839441 +0000 UTC m=+67.642010194" observedRunningTime="2025-09-09 00:15:31.842566642 +0000 UTC m=+69.120737399" watchObservedRunningTime="2025-09-09 00:15:31.879888143 +0000 UTC m=+69.158058903"
Sep 9 00:15:33.228660 containerd[1638]: time="2025-09-09T00:15:33.228180286Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 00:15:33.230109 containerd[1638]: time="2025-09-09T00:15:33.230078811Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542"
Sep 9 00:15:33.230841 containerd[1638]: time="2025-09-09T00:15:33.230802697Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 00:15:33.232977 containerd[1638]: time="2025-09-09T00:15:33.232617762Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 00:15:33.232977 containerd[1638]: time="2025-09-09T00:15:33.232808863Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 2.86580697s"
Sep 9 00:15:33.232977 containerd[1638]: time="2025-09-09T00:15:33.232822776Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\""
Sep 9 00:15:33.250993 containerd[1638]: time="2025-09-09T00:15:33.250482131Z" level=info msg="CreateContainer within sandbox \"c50562b2ee1717606cec84033b00ba8d5aff3bfde1ce2189c10b0e33f90cb88e\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Sep 9 00:15:33.323889 containerd[1638]: time="2025-09-09T00:15:33.323861023Z" level=info msg="Container d50af8cd8f88f9b5f98451f3cdbeadef2933bfe8264666eb4de35f21d7e94ccb: CDI devices from CRI Config.CDIDevices: []"
Sep 9 00:15:33.327548 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4020170307.mount: Deactivated successfully.
Sep 9 00:15:33.402128 containerd[1638]: time="2025-09-09T00:15:33.401382595Z" level=info msg="CreateContainer within sandbox \"c50562b2ee1717606cec84033b00ba8d5aff3bfde1ce2189c10b0e33f90cb88e\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"d50af8cd8f88f9b5f98451f3cdbeadef2933bfe8264666eb4de35f21d7e94ccb\""
Sep 9 00:15:33.411493 containerd[1638]: time="2025-09-09T00:15:33.404405753Z" level=info msg="StartContainer for \"d50af8cd8f88f9b5f98451f3cdbeadef2933bfe8264666eb4de35f21d7e94ccb\""
Sep 9 00:15:33.411493 containerd[1638]: time="2025-09-09T00:15:33.405585492Z" level=info msg="connecting to shim d50af8cd8f88f9b5f98451f3cdbeadef2933bfe8264666eb4de35f21d7e94ccb" address="unix:///run/containerd/s/16724de8f8a7bf76bb400d1f708b044651da94dbd149504f8261ffdf50986e3e" protocol=ttrpc version=3
Sep 9 00:15:33.423274 systemd[1]: Started cri-containerd-d50af8cd8f88f9b5f98451f3cdbeadef2933bfe8264666eb4de35f21d7e94ccb.scope - libcontainer container d50af8cd8f88f9b5f98451f3cdbeadef2933bfe8264666eb4de35f21d7e94ccb.
Sep 9 00:15:33.475935 containerd[1638]: time="2025-09-09T00:15:33.475907288Z" level=info msg="StartContainer for \"d50af8cd8f88f9b5f98451f3cdbeadef2933bfe8264666eb4de35f21d7e94ccb\" returns successfully"
Sep 9 00:15:34.813149 kubelet[2924]: I0909 00:15:34.811871 2924 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Sep 9 00:15:34.813149 kubelet[2924]: I0909 00:15:34.813100 2924 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Sep 9 00:15:37.616705 containerd[1638]: time="2025-09-09T00:15:37.616662183Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed856c5278975b91878aa1a3763113b293eff4aeae425249cee0bb1fe45ef8eb\" id:\"a70a338445c1da4df554e382924e987570980da3ab232efa6918b6449022b95b\" pid:5385 exited_at:{seconds:1757376937 nanos:616076208}"
Sep 9 00:15:38.050487 kubelet[2924]: I0909 00:15:38.050431 2924 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-j6h4t" podStartSLOduration=38.628008904 podStartE2EDuration="57.0212637s" podCreationTimestamp="2025-09-09 00:14:41 +0000 UTC" firstStartedPulling="2025-09-09 00:15:14.855045364 +0000 UTC m=+52.133216120" lastFinishedPulling="2025-09-09 00:15:33.248300168 +0000 UTC m=+70.526470916" observedRunningTime="2025-09-09 00:15:33.784065931 +0000 UTC m=+71.062236683" watchObservedRunningTime="2025-09-09 00:15:38.0212637 +0000 UTC m=+75.299434453"
Sep 9 00:15:49.071371 containerd[1638]: time="2025-09-09T00:15:49.071299296Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f81404de208a4184f8195d9aa6fb30d8f420bb9fa9471d8b4c0a03c1492183c4\" id:\"1b955ec773fa2d67c547baa698c25473cbcdc4c3e1aec991568005fc1198a7f7\" pid:5433 exited_at:{seconds:1757376949 nanos:70950844}"
Sep 9 00:15:51.214137 systemd[1]: Started sshd@7-139.178.70.104:22-139.178.68.195:42032.service - OpenSSH per-connection server daemon (139.178.68.195:42032).
Sep 9 00:15:51.383755 sshd[5449]: Accepted publickey for core from 139.178.68.195 port 42032 ssh2: RSA SHA256:VfV4DbcB1YJ5ML+Hb+wSNrAGdGs+bVUt3FrVVQ/IlNk
Sep 9 00:15:51.403639 sshd-session[5449]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 00:15:51.428779 systemd-logind[1603]: New session 10 of user core.
Sep 9 00:15:51.434289 systemd[1]: Started session-10.scope - Session 10 of User core.
Sep 9 00:15:52.876765 sshd[5451]: Connection closed by 139.178.68.195 port 42032
Sep 9 00:15:52.877416 sshd-session[5449]: pam_unix(sshd:session): session closed for user core
Sep 9 00:15:52.903891 systemd[1]: sshd@7-139.178.70.104:22-139.178.68.195:42032.service: Deactivated successfully.
Sep 9 00:15:52.908072 systemd[1]: session-10.scope: Deactivated successfully.
Sep 9 00:15:52.909324 systemd-logind[1603]: Session 10 logged out. Waiting for processes to exit.
Sep 9 00:15:52.913777 systemd-logind[1603]: Removed session 10.
Sep 9 00:15:55.158901 kubelet[2924]: I0909 00:15:55.158855 2924 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 9 00:15:56.836147 containerd[1638]: time="2025-09-09T00:15:56.834445611Z" level=info msg="TaskExit event in podsandbox handler container_id:\"257f3cc7ef5bf930ca84e2c4921dfb42f0457540dbd3d6e0c6aab76d2bd4ed01\" id:\"5cbb273e2d9cf3d2969f65cb9280a613fc7242f4549732e2427e2b1ddf855bad\" pid:5483 exited_at:{seconds:1757376956 nanos:782995335}"
Sep 9 00:15:58.002650 systemd[1]: Started sshd@8-139.178.70.104:22-139.178.68.195:42040.service - OpenSSH per-connection server daemon (139.178.68.195:42040).
Sep 9 00:15:58.453054 sshd[5499]: Accepted publickey for core from 139.178.68.195 port 42040 ssh2: RSA SHA256:VfV4DbcB1YJ5ML+Hb+wSNrAGdGs+bVUt3FrVVQ/IlNk
Sep 9 00:15:58.455994 sshd-session[5499]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 00:15:58.463197 systemd-logind[1603]: New session 11 of user core.
Sep 9 00:15:58.481315 systemd[1]: Started session-11.scope - Session 11 of User core.
Sep 9 00:15:59.398046 sshd[5501]: Connection closed by 139.178.68.195 port 42040
Sep 9 00:15:59.398779 sshd-session[5499]: pam_unix(sshd:session): session closed for user core
Sep 9 00:15:59.402993 systemd-logind[1603]: Session 11 logged out. Waiting for processes to exit.
Sep 9 00:15:59.403293 systemd[1]: sshd@8-139.178.70.104:22-139.178.68.195:42040.service: Deactivated successfully.
Sep 9 00:15:59.405297 systemd[1]: session-11.scope: Deactivated successfully.
Sep 9 00:15:59.407922 systemd-logind[1603]: Removed session 11.
Sep 9 00:16:01.758299 containerd[1638]: time="2025-09-09T00:16:01.758159018Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f81404de208a4184f8195d9aa6fb30d8f420bb9fa9471d8b4c0a03c1492183c4\" id:\"fd3410e0eb927a69f8a57a47a14c56c6f080fef7f9c2a8d74b3041972722b75b\" pid:5556 exited_at:{seconds:1757376961 nanos:756669207}"
Sep 9 00:16:01.837610 containerd[1638]: time="2025-09-09T00:16:01.837568802Z" level=info msg="TaskExit event in podsandbox handler container_id:\"257f3cc7ef5bf930ca84e2c4921dfb42f0457540dbd3d6e0c6aab76d2bd4ed01\" id:\"f4239cbf4de5bc6daa33bb56ec91a54252f024fe582a0e02a1be9caf876b3fd4\" pid:5534 exited_at:{seconds:1757376961 nanos:837317418}"
Sep 9 00:16:04.409404 systemd[1]: Started sshd@9-139.178.70.104:22-139.178.68.195:56480.service - OpenSSH per-connection server daemon (139.178.68.195:56480).
Sep 9 00:16:04.799332 sshd[5568]: Accepted publickey for core from 139.178.68.195 port 56480 ssh2: RSA SHA256:VfV4DbcB1YJ5ML+Hb+wSNrAGdGs+bVUt3FrVVQ/IlNk
Sep 9 00:16:04.818449 sshd-session[5568]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 00:16:04.823071 systemd-logind[1603]: New session 12 of user core.
Sep 9 00:16:04.829422 systemd[1]: Started session-12.scope - Session 12 of User core.
Sep 9 00:16:07.887403 containerd[1638]: time="2025-09-09T00:16:07.887354045Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed856c5278975b91878aa1a3763113b293eff4aeae425249cee0bb1fe45ef8eb\" id:\"71b64940d455912faca01a576c75a9ec9b4e34711490e3680f41a0c99920719c\" pid:5588 exited_at:{seconds:1757376967 nanos:887143946}"
Sep 9 00:16:08.101480 sshd[5570]: Connection closed by 139.178.68.195 port 56480
Sep 9 00:16:08.106503 sshd-session[5568]: pam_unix(sshd:session): session closed for user core
Sep 9 00:16:08.118561 systemd[1]: Started sshd@10-139.178.70.104:22-139.178.68.195:56488.service - OpenSSH per-connection server daemon (139.178.68.195:56488).
Sep 9 00:16:08.118846 systemd[1]: sshd@9-139.178.70.104:22-139.178.68.195:56480.service: Deactivated successfully.
Sep 9 00:16:08.121695 systemd[1]: session-12.scope: Deactivated successfully.
Sep 9 00:16:08.131975 systemd-logind[1603]: Session 12 logged out. Waiting for processes to exit.
Sep 9 00:16:08.134839 systemd-logind[1603]: Removed session 12.
Sep 9 00:16:08.202359 sshd[5604]: Accepted publickey for core from 139.178.68.195 port 56488 ssh2: RSA SHA256:VfV4DbcB1YJ5ML+Hb+wSNrAGdGs+bVUt3FrVVQ/IlNk
Sep 9 00:16:08.203647 sshd-session[5604]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 00:16:08.207000 systemd-logind[1603]: New session 13 of user core.
Sep 9 00:16:08.212289 systemd[1]: Started session-13.scope - Session 13 of User core.
Sep 9 00:16:08.395811 sshd[5609]: Connection closed by 139.178.68.195 port 56488
Sep 9 00:16:08.395200 sshd-session[5604]: pam_unix(sshd:session): session closed for user core
Sep 9 00:16:08.404111 systemd[1]: sshd@10-139.178.70.104:22-139.178.68.195:56488.service: Deactivated successfully.
Sep 9 00:16:08.406200 systemd[1]: session-13.scope: Deactivated successfully.
Sep 9 00:16:08.408202 systemd-logind[1603]: Session 13 logged out. Waiting for processes to exit.
Sep 9 00:16:08.412019 systemd[1]: Started sshd@11-139.178.70.104:22-139.178.68.195:56490.service - OpenSSH per-connection server daemon (139.178.68.195:56490).
Sep 9 00:16:08.413114 systemd-logind[1603]: Removed session 13.
Sep 9 00:16:08.473425 sshd[5618]: Accepted publickey for core from 139.178.68.195 port 56490 ssh2: RSA SHA256:VfV4DbcB1YJ5ML+Hb+wSNrAGdGs+bVUt3FrVVQ/IlNk
Sep 9 00:16:08.474347 sshd-session[5618]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 00:16:08.478933 systemd-logind[1603]: New session 14 of user core.
Sep 9 00:16:08.484522 systemd[1]: Started session-14.scope - Session 14 of User core.
Sep 9 00:16:08.616743 sshd[5620]: Connection closed by 139.178.68.195 port 56490
Sep 9 00:16:08.617881 sshd-session[5618]: pam_unix(sshd:session): session closed for user core
Sep 9 00:16:08.619922 systemd-logind[1603]: Session 14 logged out. Waiting for processes to exit.
Sep 9 00:16:08.620099 systemd[1]: sshd@11-139.178.70.104:22-139.178.68.195:56490.service: Deactivated successfully.
Sep 9 00:16:08.621576 systemd[1]: session-14.scope: Deactivated successfully.
Sep 9 00:16:08.624769 systemd-logind[1603]: Removed session 14.
Sep 9 00:16:13.628164 systemd[1]: Started sshd@12-139.178.70.104:22-139.178.68.195:50256.service - OpenSSH per-connection server daemon (139.178.68.195:50256).
Sep 9 00:16:14.144964 sshd[5631]: Accepted publickey for core from 139.178.68.195 port 50256 ssh2: RSA SHA256:VfV4DbcB1YJ5ML+Hb+wSNrAGdGs+bVUt3FrVVQ/IlNk
Sep 9 00:16:14.146930 sshd-session[5631]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 00:16:14.154020 systemd-logind[1603]: New session 15 of user core.
Sep 9 00:16:14.162223 systemd[1]: Started session-15.scope - Session 15 of User core.
Sep 9 00:16:14.774364 sshd[5633]: Connection closed by 139.178.68.195 port 50256
Sep 9 00:16:14.773444 sshd-session[5631]: pam_unix(sshd:session): session closed for user core
Sep 9 00:16:14.810384 systemd[1]: sshd@12-139.178.70.104:22-139.178.68.195:50256.service: Deactivated successfully.
Sep 9 00:16:14.812473 systemd[1]: session-15.scope: Deactivated successfully.
Sep 9 00:16:14.814317 systemd-logind[1603]: Session 15 logged out. Waiting for processes to exit.
Sep 9 00:16:14.815095 systemd-logind[1603]: Removed session 15.
Sep 9 00:16:19.784813 systemd[1]: Started sshd@13-139.178.70.104:22-139.178.68.195:50270.service - OpenSSH per-connection server daemon (139.178.68.195:50270).
Sep 9 00:16:20.131334 sshd[5651]: Accepted publickey for core from 139.178.68.195 port 50270 ssh2: RSA SHA256:VfV4DbcB1YJ5ML+Hb+wSNrAGdGs+bVUt3FrVVQ/IlNk
Sep 9 00:16:20.132488 sshd-session[5651]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 00:16:20.147991 systemd-logind[1603]: New session 16 of user core.
Sep 9 00:16:20.152452 systemd[1]: Started session-16.scope - Session 16 of User core.
Sep 9 00:16:21.213966 sshd[5653]: Connection closed by 139.178.68.195 port 50270
Sep 9 00:16:21.214245 sshd-session[5651]: pam_unix(sshd:session): session closed for user core
Sep 9 00:16:21.221799 systemd[1]: sshd@13-139.178.70.104:22-139.178.68.195:50270.service: Deactivated successfully.
Sep 9 00:16:21.223846 systemd[1]: session-16.scope: Deactivated successfully.
Sep 9 00:16:21.224731 systemd-logind[1603]: Session 16 logged out. Waiting for processes to exit.
Sep 9 00:16:21.226373 systemd-logind[1603]: Removed session 16.
Sep 9 00:16:26.227366 systemd[1]: Started sshd@14-139.178.70.104:22-139.178.68.195:34142.service - OpenSSH per-connection server daemon (139.178.68.195:34142).
Sep 9 00:16:26.572710 sshd[5696]: Accepted publickey for core from 139.178.68.195 port 34142 ssh2: RSA SHA256:VfV4DbcB1YJ5ML+Hb+wSNrAGdGs+bVUt3FrVVQ/IlNk
Sep 9 00:16:26.573721 sshd-session[5696]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 00:16:26.577031 systemd-logind[1603]: New session 17 of user core.
Sep 9 00:16:26.582275 systemd[1]: Started session-17.scope - Session 17 of User core.
Sep 9 00:16:28.075628 containerd[1638]: time="2025-09-09T00:16:28.053210296Z" level=info msg="TaskExit event in podsandbox handler container_id:\"257f3cc7ef5bf930ca84e2c4921dfb42f0457540dbd3d6e0c6aab76d2bd4ed01\" id:\"75e8e7df5f172cdd20a5c08177a4cdc5b2ae0b5ccf2e5a5d2627331fa38154c7\" pid:5681 exited_at:{seconds:1757376987 nanos:956170351}"
Sep 9 00:16:31.975615 containerd[1638]: time="2025-09-09T00:16:31.975581837Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f81404de208a4184f8195d9aa6fb30d8f420bb9fa9471d8b4c0a03c1492183c4\" id:\"8e4a1a8ef09353a904cf82a3f9f6ee34d3e23b3fe8151757c988a620ff9bd296\" pid:5745 exited_at:{seconds:1757376991 nanos:975306496}"
Sep 9 00:16:32.144293 sshd[5698]: Connection closed by 139.178.68.195 port 34142
Sep 9 00:16:32.149889 sshd-session[5696]: pam_unix(sshd:session): session closed for user core
Sep 9 00:16:32.157314 systemd[1]: sshd@14-139.178.70.104:22-139.178.68.195:34142.service: Deactivated successfully.
Sep 9 00:16:32.159400 systemd[1]: session-17.scope: Deactivated successfully.
Sep 9 00:16:32.164245 systemd-logind[1603]: Session 17 logged out. Waiting for processes to exit.
Sep 9 00:16:32.166557 systemd[1]: Started sshd@15-139.178.70.104:22-139.178.68.195:36908.service - OpenSSH per-connection server daemon (139.178.68.195:36908).
Sep 9 00:16:32.168142 systemd-logind[1603]: Removed session 17.
Sep 9 00:16:32.449748 sshd[5759]: Accepted publickey for core from 139.178.68.195 port 36908 ssh2: RSA SHA256:VfV4DbcB1YJ5ML+Hb+wSNrAGdGs+bVUt3FrVVQ/IlNk
Sep 9 00:16:32.451025 sshd-session[5759]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 00:16:32.457171 systemd-logind[1603]: New session 18 of user core.
Sep 9 00:16:32.464290 systemd[1]: Started session-18.scope - Session 18 of User core.
Sep 9 00:16:34.453947 sshd[5761]: Connection closed by 139.178.68.195 port 36908
Sep 9 00:16:34.454907 sshd-session[5759]: pam_unix(sshd:session): session closed for user core
Sep 9 00:16:34.463558 systemd[1]: sshd@15-139.178.70.104:22-139.178.68.195:36908.service: Deactivated successfully.
Sep 9 00:16:34.465267 systemd[1]: session-18.scope: Deactivated successfully.
Sep 9 00:16:34.466094 systemd-logind[1603]: Session 18 logged out. Waiting for processes to exit.
Sep 9 00:16:34.469043 systemd[1]: Started sshd@16-139.178.70.104:22-139.178.68.195:36912.service - OpenSSH per-connection server daemon (139.178.68.195:36912).
Sep 9 00:16:34.469985 systemd-logind[1603]: Removed session 18.
Sep 9 00:16:34.593529 sshd[5774]: Accepted publickey for core from 139.178.68.195 port 36912 ssh2: RSA SHA256:VfV4DbcB1YJ5ML+Hb+wSNrAGdGs+bVUt3FrVVQ/IlNk
Sep 9 00:16:34.595211 sshd-session[5774]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 00:16:34.599460 systemd-logind[1603]: New session 19 of user core.
Sep 9 00:16:34.604349 systemd[1]: Started session-19.scope - Session 19 of User core.
Sep 9 00:16:35.638995 sshd[5776]: Connection closed by 139.178.68.195 port 36912
Sep 9 00:16:35.647424 systemd[1]: Started sshd@17-139.178.70.104:22-139.178.68.195:36916.service - OpenSSH per-connection server daemon (139.178.68.195:36916).
Sep 9 00:16:35.709824 sshd-session[5774]: pam_unix(sshd:session): session closed for user core
Sep 9 00:16:35.725632 systemd[1]: sshd@16-139.178.70.104:22-139.178.68.195:36912.service: Deactivated successfully.
Sep 9 00:16:35.730723 systemd[1]: session-19.scope: Deactivated successfully.
Sep 9 00:16:35.737982 systemd-logind[1603]: Session 19 logged out. Waiting for processes to exit.
Sep 9 00:16:35.740539 systemd-logind[1603]: Removed session 19.
Sep 9 00:16:35.843669 sshd[5810]: Accepted publickey for core from 139.178.68.195 port 36916 ssh2: RSA SHA256:VfV4DbcB1YJ5ML+Hb+wSNrAGdGs+bVUt3FrVVQ/IlNk
Sep 9 00:16:35.845880 sshd-session[5810]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 00:16:35.856901 systemd-logind[1603]: New session 20 of user core.
Sep 9 00:16:35.864394 systemd[1]: Started session-20.scope - Session 20 of User core.
Sep 9 00:16:37.084342 containerd[1638]: time="2025-09-09T00:16:37.084054681Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed856c5278975b91878aa1a3763113b293eff4aeae425249cee0bb1fe45ef8eb\" id:\"5afe3208b3b7615f4350b3ff61f36df3bbfd2b8cf36d1b6dd01d2d6e91b5696c\" pid:5798 exited_at:{seconds:1757376996 nanos:989855941}"
Sep 9 00:16:40.482929 sshd[5818]: Connection closed by 139.178.68.195 port 36916
Sep 9 00:16:40.507473 sshd-session[5810]: pam_unix(sshd:session): session closed for user core
Sep 9 00:16:40.551720 systemd[1]: sshd@17-139.178.70.104:22-139.178.68.195:36916.service: Deactivated successfully.
Sep 9 00:16:40.553525 systemd[1]: session-20.scope: Deactivated successfully.
Sep 9 00:16:40.553798 systemd[1]: session-20.scope: Consumed 605ms CPU time, 64.9M memory peak.
Sep 9 00:16:40.554983 systemd-logind[1603]: Session 20 logged out. Waiting for processes to exit.
Sep 9 00:16:40.571554 systemd[1]: Started sshd@18-139.178.70.104:22-139.178.68.195:42322.service - OpenSSH per-connection server daemon (139.178.68.195:42322).
Sep 9 00:16:40.585983 systemd-logind[1603]: Removed session 20.
Sep 9 00:16:41.036271 sshd[5856]: Accepted publickey for core from 139.178.68.195 port 42322 ssh2: RSA SHA256:VfV4DbcB1YJ5ML+Hb+wSNrAGdGs+bVUt3FrVVQ/IlNk
Sep 9 00:16:41.037740 sshd-session[5856]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 00:16:41.042457 systemd-logind[1603]: New session 21 of user core.
Sep 9 00:16:41.046906 systemd[1]: Started session-21.scope - Session 21 of User core.
Sep 9 00:16:41.472203 sshd[5865]: Connection closed by 139.178.68.195 port 42322
Sep 9 00:16:41.472493 sshd-session[5856]: pam_unix(sshd:session): session closed for user core
Sep 9 00:16:41.477919 systemd-logind[1603]: Session 21 logged out. Waiting for processes to exit.
Sep 9 00:16:41.478959 systemd[1]: sshd@18-139.178.70.104:22-139.178.68.195:42322.service: Deactivated successfully.
Sep 9 00:16:41.480537 systemd[1]: session-21.scope: Deactivated successfully.
Sep 9 00:16:41.482528 systemd-logind[1603]: Removed session 21.
Sep 9 00:16:46.486578 systemd[1]: Started sshd@19-139.178.70.104:22-139.178.68.195:42326.service - OpenSSH per-connection server daemon (139.178.68.195:42326).
Sep 9 00:16:46.630375 sshd[5879]: Accepted publickey for core from 139.178.68.195 port 42326 ssh2: RSA SHA256:VfV4DbcB1YJ5ML+Hb+wSNrAGdGs+bVUt3FrVVQ/IlNk
Sep 9 00:16:46.631981 sshd-session[5879]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 00:16:46.635325 systemd-logind[1603]: New session 22 of user core.
Sep 9 00:16:46.646394 systemd[1]: Started session-22.scope - Session 22 of User core.
Sep 9 00:16:46.989470 sshd[5881]: Connection closed by 139.178.68.195 port 42326
Sep 9 00:16:46.988729 sshd-session[5879]: pam_unix(sshd:session): session closed for user core
Sep 9 00:16:46.991311 systemd[1]: sshd@19-139.178.70.104:22-139.178.68.195:42326.service: Deactivated successfully.
Sep 9 00:16:46.992557 systemd[1]: session-22.scope: Deactivated successfully.
Sep 9 00:16:46.994100 systemd-logind[1603]: Session 22 logged out. Waiting for processes to exit.
Sep 9 00:16:46.997409 systemd-logind[1603]: Removed session 22.
Sep 9 00:16:49.087210 containerd[1638]: time="2025-09-09T00:16:49.085146126Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f81404de208a4184f8195d9aa6fb30d8f420bb9fa9471d8b4c0a03c1492183c4\" id:\"41188c136b56d5987084724dc5f9532dd59306d1c670d78785204529ffa72de5\" pid:5906 exited_at:{seconds:1757377009 nanos:69939262}"
Sep 9 00:16:52.000592 systemd[1]: Started sshd@20-139.178.70.104:22-139.178.68.195:39166.service - OpenSSH per-connection server daemon (139.178.68.195:39166).
Sep 9 00:16:52.101144 sshd[5916]: Accepted publickey for core from 139.178.68.195 port 39166 ssh2: RSA SHA256:VfV4DbcB1YJ5ML+Hb+wSNrAGdGs+bVUt3FrVVQ/IlNk
Sep 9 00:16:52.103472 sshd-session[5916]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 00:16:52.108612 systemd-logind[1603]: New session 23 of user core.
Sep 9 00:16:52.114511 systemd[1]: Started session-23.scope - Session 23 of User core.
Sep 9 00:16:52.754058 sshd[5918]: Connection closed by 139.178.68.195 port 39166
Sep 9 00:16:52.754290 sshd-session[5916]: pam_unix(sshd:session): session closed for user core
Sep 9 00:16:52.759606 systemd[1]: sshd@20-139.178.70.104:22-139.178.68.195:39166.service: Deactivated successfully.
Sep 9 00:16:52.761723 systemd[1]: session-23.scope: Deactivated successfully.
Sep 9 00:16:52.763871 systemd-logind[1603]: Session 23 logged out. Waiting for processes to exit.
Sep 9 00:16:52.765835 systemd-logind[1603]: Removed session 23.
Sep 9 00:16:56.442001 containerd[1638]: time="2025-09-09T00:16:56.441960187Z" level=info msg="TaskExit event in podsandbox handler container_id:\"257f3cc7ef5bf930ca84e2c4921dfb42f0457540dbd3d6e0c6aab76d2bd4ed01\" id:\"20655fb4a642ea8ec3adb642afefc09089d368c96e0d2c05217be62b53609fb2\" pid:5941 exited_at:{seconds:1757377016 nanos:441490503}"
Sep 9 00:16:57.767659 systemd[1]: Started sshd@21-139.178.70.104:22-139.178.68.195:39180.service - OpenSSH per-connection server daemon (139.178.68.195:39180).
Sep 9 00:16:58.010838 sshd[5955]: Accepted publickey for core from 139.178.68.195 port 39180 ssh2: RSA SHA256:VfV4DbcB1YJ5ML+Hb+wSNrAGdGs+bVUt3FrVVQ/IlNk
Sep 9 00:16:58.012271 sshd-session[5955]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 00:16:58.016803 systemd-logind[1603]: New session 24 of user core.
Sep 9 00:16:58.023297 systemd[1]: Started session-24.scope - Session 24 of User core.
Sep 9 00:16:59.065526 sshd[5957]: Connection closed by 139.178.68.195 port 39180
Sep 9 00:16:59.065727 sshd-session[5955]: pam_unix(sshd:session): session closed for user core
Sep 9 00:16:59.068509 systemd-logind[1603]: Session 24 logged out. Waiting for processes to exit.
Sep 9 00:16:59.068886 systemd[1]: sshd@21-139.178.70.104:22-139.178.68.195:39180.service: Deactivated successfully.
Sep 9 00:16:59.070885 systemd[1]: session-24.scope: Deactivated successfully.
Sep 9 00:16:59.072417 systemd-logind[1603]: Removed session 24.
Sep 9 00:17:01.366283 containerd[1638]: time="2025-09-09T00:17:01.366222988Z" level=info msg="TaskExit event in podsandbox handler container_id:\"257f3cc7ef5bf930ca84e2c4921dfb42f0457540dbd3d6e0c6aab76d2bd4ed01\" id:\"631ce3df39eeed6ff1bdfc079226574cd66d1dd6fb70f77eb448d94e053c3497\" pid:5982 exited_at:{seconds:1757377021 nanos:365812051}"