Jul 9 12:59:27.703464 kernel: Linux version 6.12.36-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Wed Jul 9 08:38:39 -00 2025 Jul 9 12:59:27.703480 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=f85d3be94c634d7d72fbcd0e670073ce56ae2e0cc763f83b329300b7cea5203d Jul 9 12:59:27.703487 kernel: Disabled fast string operations Jul 9 12:59:27.703491 kernel: BIOS-provided physical RAM map: Jul 9 12:59:27.703495 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable Jul 9 12:59:27.703499 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved Jul 9 12:59:27.703505 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved Jul 9 12:59:27.703510 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable Jul 9 12:59:27.703514 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data Jul 9 12:59:27.703518 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS Jul 9 12:59:27.703523 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable Jul 9 12:59:27.703527 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved Jul 9 12:59:27.703531 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved Jul 9 12:59:27.703535 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved Jul 9 12:59:27.703542 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved Jul 9 12:59:27.703546 kernel: NX (Execute Disable) protection: active Jul 9 12:59:27.703551 kernel: APIC: Static calls initialized Jul 9 12:59:27.703556 kernel: SMBIOS 2.7 present. Jul 9 12:59:27.703561 kernel: DMI: VMware, Inc. 
VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020 Jul 9 12:59:27.703566 kernel: DMI: Memory slots populated: 1/128 Jul 9 12:59:27.703572 kernel: vmware: hypercall mode: 0x00 Jul 9 12:59:27.703577 kernel: Hypervisor detected: VMware Jul 9 12:59:27.703581 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz Jul 9 12:59:27.703586 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz Jul 9 12:59:27.703591 kernel: vmware: using clock offset of 3895951237 ns Jul 9 12:59:27.703596 kernel: tsc: Detected 3408.000 MHz processor Jul 9 12:59:27.703601 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Jul 9 12:59:27.703611 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Jul 9 12:59:27.703616 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000 Jul 9 12:59:27.703621 kernel: total RAM covered: 3072M Jul 9 12:59:27.703627 kernel: Found optimal setting for mtrr clean up Jul 9 12:59:27.703633 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G Jul 9 12:59:27.703638 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs Jul 9 12:59:27.703643 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Jul 9 12:59:27.703648 kernel: Using GB pages for direct mapping Jul 9 12:59:27.703653 kernel: ACPI: Early table checksum verification disabled Jul 9 12:59:27.703657 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD ) Jul 9 12:59:27.703662 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272) Jul 9 12:59:27.703667 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240) Jul 9 12:59:27.703673 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001) Jul 9 12:59:27.703680 kernel: ACPI: FACS 0x000000007FEFFFC0 000040 Jul 9 12:59:27.703685 kernel: ACPI: FACS 0x000000007FEFFFC0 000040 Jul 9 12:59:27.703690 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001) Jul 9 12:59:27.703695 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? 
APIC 06040000 LTP 00000000) Jul 9 12:59:27.703702 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001) Jul 9 12:59:27.703707 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001) Jul 9 12:59:27.703712 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001) Jul 9 12:59:27.703717 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001) Jul 9 12:59:27.703723 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66] Jul 9 12:59:27.703728 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72] Jul 9 12:59:27.703733 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff] Jul 9 12:59:27.703738 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff] Jul 9 12:59:27.703743 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54] Jul 9 12:59:27.703748 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c] Jul 9 12:59:27.703755 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea] Jul 9 12:59:27.703760 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe] Jul 9 12:59:27.703765 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756] Jul 9 12:59:27.703770 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e] Jul 9 12:59:27.703775 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff] Jul 9 12:59:27.703780 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff] Jul 9 12:59:27.703785 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug Jul 9 12:59:27.703791 kernel: NUMA: Node 0 [mem 0x00001000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00001000-0x7fffffff] Jul 9 12:59:27.703796 kernel: NODE_DATA(0) allocated [mem 0x7fff8dc0-0x7fffffff] Jul 9 12:59:27.703802 kernel: Zone ranges: Jul 9 12:59:27.703808 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Jul 9 12:59:27.703813 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff] Jul 9 12:59:27.703818 kernel: Normal empty Jul 9 12:59:27.703823 kernel: Device empty Jul 9 12:59:27.703828 kernel: Movable zone start for each node Jul 9 12:59:27.703833 kernel: Early memory node ranges Jul 9 12:59:27.703838 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff] Jul 9 12:59:27.703843 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff] Jul 9 12:59:27.703848 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff] Jul 9 12:59:27.703854 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff] Jul 9 12:59:27.703860 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Jul 9 12:59:27.703865 kernel: On node 0, zone DMA: 98 pages in unavailable ranges Jul 9 12:59:27.703870 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges Jul 9 12:59:27.703875 kernel: ACPI: PM-Timer IO Port: 0x1008 Jul 9 12:59:27.703880 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1]) Jul 9 12:59:27.703885 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1]) Jul 9 12:59:27.703890 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1]) Jul 9 12:59:27.703895 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1]) Jul 9 12:59:27.703902 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1]) Jul 9 12:59:27.703907 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1]) Jul 9 12:59:27.703912 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1]) Jul 9 12:59:27.703917 kernel: ACPI: 
LAPIC_NMI (acpi_id[0x07] high edge lint[0x1]) Jul 9 12:59:27.703922 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1]) Jul 9 12:59:27.703927 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1]) Jul 9 12:59:27.703932 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1]) Jul 9 12:59:27.703937 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1]) Jul 9 12:59:27.703942 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1]) Jul 9 12:59:27.703947 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1]) Jul 9 12:59:27.703953 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1]) Jul 9 12:59:27.703958 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1]) Jul 9 12:59:27.703963 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1]) Jul 9 12:59:27.703968 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1]) Jul 9 12:59:27.703973 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1]) Jul 9 12:59:27.703978 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1]) Jul 9 12:59:27.703983 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1]) Jul 9 12:59:27.703988 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1]) Jul 9 12:59:27.703993 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1]) Jul 9 12:59:27.703998 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1]) Jul 9 12:59:27.704004 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1]) Jul 9 12:59:27.704009 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1]) Jul 9 12:59:27.704014 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1]) Jul 9 12:59:27.704019 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1]) Jul 9 12:59:27.704024 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1]) Jul 9 12:59:27.704030 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1]) Jul 9 12:59:27.704035 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1]) Jul 9 12:59:27.704040 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1]) Jul 9 12:59:27.704045 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1]) Jul 9 12:59:27.704050 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1]) Jul 9 12:59:27.704056 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1]) Jul 9 12:59:27.704061 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1]) Jul 9 12:59:27.704066 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1]) Jul 9 12:59:27.704071 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1]) Jul 9 12:59:27.704076 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1]) Jul 9 12:59:27.704082 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1]) Jul 9 12:59:27.704090 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1]) Jul 9 12:59:27.704097 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1]) Jul 9 12:59:27.704102 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1]) Jul 9 12:59:27.704108 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1]) Jul 9 12:59:27.704114 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1]) Jul 9 12:59:27.704119 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1]) Jul 9 12:59:27.704125 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1]) Jul 9 12:59:27.704130 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1]) Jul 9 12:59:27.704136 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1]) Jul 9 12:59:27.704141 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1]) Jul 9 12:59:27.704146 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] 
high edge lint[0x1]) Jul 9 12:59:27.704153 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1]) Jul 9 12:59:27.704158 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1]) Jul 9 12:59:27.704164 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1]) Jul 9 12:59:27.704169 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1]) Jul 9 12:59:27.704175 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1]) Jul 9 12:59:27.704180 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1]) Jul 9 12:59:27.704185 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1]) Jul 9 12:59:27.704190 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1]) Jul 9 12:59:27.704196 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1]) Jul 9 12:59:27.704201 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1]) Jul 9 12:59:27.704208 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1]) Jul 9 12:59:27.704213 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1]) Jul 9 12:59:27.704219 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1]) Jul 9 12:59:27.704224 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1]) Jul 9 12:59:27.704230 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1]) Jul 9 12:59:27.704235 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1]) Jul 9 12:59:27.704240 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1]) Jul 9 12:59:27.704246 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1]) Jul 9 12:59:27.704251 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1]) Jul 9 12:59:27.704257 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1]) Jul 9 12:59:27.704263 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1]) Jul 9 12:59:27.704271 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1]) Jul 9 12:59:27.704278 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1]) Jul 9 12:59:27.704283 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1]) Jul 9 12:59:27.704288 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1]) Jul 9 12:59:27.704294 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1]) Jul 9 12:59:27.704299 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1]) Jul 9 12:59:27.704382 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1]) Jul 9 12:59:27.704388 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1]) Jul 9 12:59:27.704394 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1]) Jul 9 12:59:27.704401 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1]) Jul 9 12:59:27.704440 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1]) Jul 9 12:59:27.704445 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1]) Jul 9 12:59:27.704456 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1]) Jul 9 12:59:27.704462 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1]) Jul 9 12:59:27.704468 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1]) Jul 9 12:59:27.704473 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1]) Jul 9 12:59:27.704478 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1]) Jul 9 12:59:27.704489 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1]) Jul 9 12:59:27.704495 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1]) Jul 9 12:59:27.704503 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1]) Jul 9 12:59:27.704508 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1]) Jul 9 12:59:27.704513 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1]) Jul 9 
12:59:27.704522 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1]) Jul 9 12:59:27.704528 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1]) Jul 9 12:59:27.704533 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1]) Jul 9 12:59:27.704538 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1]) Jul 9 12:59:27.704544 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1]) Jul 9 12:59:27.704552 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1]) Jul 9 12:59:27.704562 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1]) Jul 9 12:59:27.704567 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1]) Jul 9 12:59:27.704572 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1]) Jul 9 12:59:27.704578 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1]) Jul 9 12:59:27.704588 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1]) Jul 9 12:59:27.704594 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1]) Jul 9 12:59:27.704600 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1]) Jul 9 12:59:27.704605 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1]) Jul 9 12:59:27.704610 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1]) Jul 9 12:59:27.704621 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1]) Jul 9 12:59:27.704629 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1]) Jul 9 12:59:27.704634 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1]) Jul 9 12:59:27.704639 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1]) Jul 9 12:59:27.704645 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1]) Jul 9 12:59:27.704650 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1]) Jul 9 12:59:27.704656 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1]) Jul 9 12:59:27.704661 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1]) Jul 9 12:59:27.704666 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1]) Jul 9 12:59:27.704672 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1]) Jul 9 12:59:27.704677 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1]) Jul 9 12:59:27.704683 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1]) Jul 9 12:59:27.704689 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1]) Jul 9 12:59:27.704694 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1]) Jul 9 12:59:27.704700 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1]) Jul 9 12:59:27.704705 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1]) Jul 9 12:59:27.704711 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1]) Jul 9 12:59:27.704716 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1]) Jul 9 12:59:27.704721 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1]) Jul 9 12:59:27.704727 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23 Jul 9 12:59:27.704732 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge) Jul 9 12:59:27.704739 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Jul 9 12:59:27.704744 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000 Jul 9 12:59:27.704750 kernel: TSC deadline timer available Jul 9 12:59:27.704756 kernel: CPU topo: Max. logical packages: 128 Jul 9 12:59:27.704761 kernel: CPU topo: Max. logical dies: 128 Jul 9 12:59:27.704766 kernel: CPU topo: Max. dies per package: 1 Jul 9 12:59:27.704772 kernel: CPU topo: Max. threads per core: 1 Jul 9 12:59:27.704777 kernel: CPU topo: Num. cores per package: 1 Jul 9 12:59:27.704783 kernel: CPU topo: Num. 
threads per package: 1 Jul 9 12:59:27.704789 kernel: CPU topo: Allowing 2 present CPUs plus 126 hotplug CPUs Jul 9 12:59:27.704794 kernel: [mem 0x80000000-0xefffffff] available for PCI devices Jul 9 12:59:27.704800 kernel: Booting paravirtualized kernel on VMware hypervisor Jul 9 12:59:27.704805 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Jul 9 12:59:27.704811 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1 Jul 9 12:59:27.704817 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144 Jul 9 12:59:27.704822 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152 Jul 9 12:59:27.704828 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007 Jul 9 12:59:27.704833 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015 Jul 9 12:59:27.704839 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023 Jul 9 12:59:27.704845 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031 Jul 9 12:59:27.704850 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039 Jul 9 12:59:27.704856 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047 Jul 9 12:59:27.704861 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055 Jul 9 12:59:27.704866 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063 Jul 9 12:59:27.704872 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071 Jul 9 12:59:27.704877 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079 Jul 9 12:59:27.704883 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087 Jul 9 12:59:27.704889 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095 Jul 9 12:59:27.704895 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103 Jul 9 12:59:27.704900 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111 Jul 9 12:59:27.704906 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119 Jul 9 12:59:27.704914 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127 Jul 9 12:59:27.704921 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=f85d3be94c634d7d72fbcd0e670073ce56ae2e0cc763f83b329300b7cea5203d Jul 9 12:59:27.704927 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Jul 9 12:59:27.704932 kernel: random: crng init done Jul 9 12:59:27.704939 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes Jul 9 12:59:27.704945 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes Jul 9 12:59:27.704950 kernel: printk: log_buf_len min size: 262144 bytes Jul 9 12:59:27.704956 kernel: printk: log_buf_len: 1048576 bytes Jul 9 12:59:27.704961 kernel: printk: early log buf free: 245592(93%) Jul 9 12:59:27.704967 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Jul 9 12:59:27.704972 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear) Jul 9 12:59:27.704978 kernel: Fallback order for Node 0: 0 Jul 9 12:59:27.704983 kernel: Built 1 zonelists, mobility grouping on. 
Total pages: 524157 Jul 9 12:59:27.704990 kernel: Policy zone: DMA32 Jul 9 12:59:27.704995 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Jul 9 12:59:27.705001 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1 Jul 9 12:59:27.705006 kernel: ftrace: allocating 40097 entries in 157 pages Jul 9 12:59:27.705012 kernel: ftrace: allocated 157 pages with 5 groups Jul 9 12:59:27.705017 kernel: Dynamic Preempt: voluntary Jul 9 12:59:27.705023 kernel: rcu: Preemptible hierarchical RCU implementation. Jul 9 12:59:27.705028 kernel: rcu: RCU event tracing is enabled. Jul 9 12:59:27.705034 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128. Jul 9 12:59:27.705045 kernel: Trampoline variant of Tasks RCU enabled. Jul 9 12:59:27.705050 kernel: Rude variant of Tasks RCU enabled. Jul 9 12:59:27.705056 kernel: Tracing variant of Tasks RCU enabled. Jul 9 12:59:27.705061 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Jul 9 12:59:27.705067 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128 Jul 9 12:59:27.705072 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Jul 9 12:59:27.705078 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Jul 9 12:59:27.705083 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Jul 9 12:59:27.705089 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16 Jul 9 12:59:27.705096 kernel: rcu: srcu_init: Setting srcu_struct sizes to big. Jul 9 12:59:27.705102 kernel: Console: colour VGA+ 80x25 Jul 9 12:59:27.705107 kernel: printk: legacy console [tty0] enabled Jul 9 12:59:27.705113 kernel: printk: legacy console [ttyS0] enabled Jul 9 12:59:27.705118 kernel: ACPI: Core revision 20240827 Jul 9 12:59:27.705124 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns Jul 9 12:59:27.705129 kernel: APIC: Switch to symmetric I/O mode setup Jul 9 12:59:27.705135 kernel: x2apic enabled Jul 9 12:59:27.705140 kernel: APIC: Switched APIC routing to: physical x2apic Jul 9 12:59:27.705146 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Jul 9 12:59:27.705153 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Jul 9 12:59:27.705158 kernel: Calibrating delay loop (skipped) preset value.. 
6816.00 BogoMIPS (lpj=3408000) Jul 9 12:59:27.705164 kernel: Disabled fast string operations Jul 9 12:59:27.705169 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8 Jul 9 12:59:27.705175 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4 Jul 9 12:59:27.705180 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Jul 9 12:59:27.705192 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit Jul 9 12:59:27.705200 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Jul 9 12:59:27.705205 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT Jul 9 12:59:27.705213 kernel: RETBleed: Mitigation: Enhanced IBRS Jul 9 12:59:27.705218 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Jul 9 12:59:27.705229 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Jul 9 12:59:27.705235 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode Jul 9 12:59:27.705241 kernel: SRBDS: Unknown: Dependent on hypervisor status Jul 9 12:59:27.705246 kernel: GDS: Unknown: Dependent on hypervisor status Jul 9 12:59:27.705252 kernel: ITS: Mitigation: Aligned branch/return thunks Jul 9 12:59:27.705257 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Jul 9 12:59:27.705264 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Jul 9 12:59:27.705270 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Jul 9 12:59:27.705276 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Jul 9 12:59:27.705281 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format. Jul 9 12:59:27.705287 kernel: Freeing SMP alternatives memory: 32K Jul 9 12:59:27.705292 kernel: pid_max: default: 131072 minimum: 1024 Jul 9 12:59:27.705298 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Jul 9 12:59:27.705316 kernel: landlock: Up and running. Jul 9 12:59:27.705321 kernel: SELinux: Initializing. Jul 9 12:59:27.705329 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Jul 9 12:59:27.705334 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Jul 9 12:59:27.705340 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd) Jul 9 12:59:27.705345 kernel: Performance Events: Skylake events, core PMU driver. Jul 9 12:59:27.705351 kernel: core: CPUID marked event: 'cpu cycles' unavailable Jul 9 12:59:27.705356 kernel: core: CPUID marked event: 'instructions' unavailable Jul 9 12:59:27.705362 kernel: core: CPUID marked event: 'bus cycles' unavailable Jul 9 12:59:27.705367 kernel: core: CPUID marked event: 'cache references' unavailable Jul 9 12:59:27.705372 kernel: core: CPUID marked event: 'cache misses' unavailable Jul 9 12:59:27.705379 kernel: core: CPUID marked event: 'branch instructions' unavailable Jul 9 12:59:27.705384 kernel: core: CPUID marked event: 'branch misses' unavailable Jul 9 12:59:27.705390 kernel: ... version: 1 Jul 9 12:59:27.705395 kernel: ... bit width: 48 Jul 9 12:59:27.705401 kernel: ... generic registers: 4 Jul 9 12:59:27.705406 kernel: ... value mask: 0000ffffffffffff Jul 9 12:59:27.705412 kernel: ... max period: 000000007fffffff Jul 9 12:59:27.705417 kernel: ... fixed-purpose events: 0 Jul 9 12:59:27.705423 kernel: ... 
event mask: 000000000000000f Jul 9 12:59:27.705429 kernel: signal: max sigframe size: 1776 Jul 9 12:59:27.705435 kernel: rcu: Hierarchical SRCU implementation. Jul 9 12:59:27.705441 kernel: rcu: Max phase no-delay instances is 400. Jul 9 12:59:27.705446 kernel: Timer migration: 3 hierarchy levels; 8 children per group; 3 crossnode level Jul 9 12:59:27.705452 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Jul 9 12:59:27.705457 kernel: smp: Bringing up secondary CPUs ... Jul 9 12:59:27.705463 kernel: smpboot: x86: Booting SMP configuration: Jul 9 12:59:27.705468 kernel: .... node #0, CPUs: #1 Jul 9 12:59:27.705474 kernel: Disabled fast string operations Jul 9 12:59:27.705480 kernel: smp: Brought up 1 node, 2 CPUs Jul 9 12:59:27.705485 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS) Jul 9 12:59:27.705491 kernel: Memory: 1924256K/2096628K available (14336K kernel code, 2430K rwdata, 9956K rodata, 54568K init, 2400K bss, 160988K reserved, 0K cma-reserved) Jul 9 12:59:27.705497 kernel: devtmpfs: initialized Jul 9 12:59:27.705502 kernel: x86/mm: Memory block size: 128MB Jul 9 12:59:27.705508 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes) Jul 9 12:59:27.705514 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Jul 9 12:59:27.705520 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear) Jul 9 12:59:27.705525 kernel: pinctrl core: initialized pinctrl subsystem Jul 9 12:59:27.705532 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Jul 9 12:59:27.705538 kernel: audit: initializing netlink subsys (disabled) Jul 9 12:59:27.705543 kernel: audit: type=2000 audit(1752065964.288:1): state=initialized audit_enabled=0 res=1 Jul 9 12:59:27.705549 kernel: thermal_sys: Registered thermal governor 'step_wise' Jul 9 12:59:27.705554 kernel: thermal_sys: Registered thermal governor 'user_space' Jul 9 12:59:27.705560 kernel: cpuidle: using governor menu Jul 9 12:59:27.705565 kernel: Simple Boot Flag at 0x36 set to 0x80 Jul 9 12:59:27.705571 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Jul 9 12:59:27.705576 kernel: dca service started, version 1.12.1 Jul 9 12:59:27.705583 kernel: PCI: ECAM [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) for domain 0000 [bus 00-7f] Jul 9 12:59:27.705595 kernel: PCI: Using configuration type 1 for base access Jul 9 12:59:27.705602 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
Jul 9 12:59:27.705608 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Jul 9 12:59:27.705614 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Jul 9 12:59:27.705620 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Jul 9 12:59:27.705625 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Jul 9 12:59:27.705631 kernel: ACPI: Added _OSI(Module Device) Jul 9 12:59:27.705637 kernel: ACPI: Added _OSI(Processor Device) Jul 9 12:59:27.705644 kernel: ACPI: Added _OSI(Processor Aggregator Device) Jul 9 12:59:27.705649 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Jul 9 12:59:27.705655 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored Jul 9 12:59:27.705661 kernel: ACPI: Interpreter enabled Jul 9 12:59:27.705667 kernel: ACPI: PM: (supports S0 S1 S5) Jul 9 12:59:27.705672 kernel: ACPI: Using IOAPIC for interrupt routing Jul 9 12:59:27.705678 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Jul 9 12:59:27.705684 kernel: PCI: Using E820 reservations for host bridge windows Jul 9 12:59:27.705690 kernel: ACPI: Enabled 4 GPEs in block 00 to 0F Jul 9 12:59:27.705697 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f]) Jul 9 12:59:27.706596 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Jul 9 12:59:27.706663 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR] Jul 9 12:59:27.706714 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] Jul 9 12:59:27.706723 kernel: PCI host bridge to bus 0000:00 Jul 9 12:59:27.706776 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Jul 9 12:59:27.706821 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window] Jul 9 12:59:27.706868 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Jul 9 12:59:27.706911 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Jul 9 12:59:27.706954 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window] Jul 9 12:59:27.706997 kernel: pci_bus 0000:00: root bus resource [bus 00-7f] Jul 9 12:59:27.709187 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 conventional PCI endpoint Jul 9 12:59:27.709257 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 conventional PCI bridge Jul 9 12:59:27.709322 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jul 9 12:59:27.709380 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 conventional PCI endpoint Jul 9 12:59:27.709435 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a conventional PCI endpoint Jul 9 12:59:27.709487 kernel: pci 0000:00:07.1: BAR 4 [io 0x1060-0x106f] Jul 9 12:59:27.709538 kernel: pci 0000:00:07.1: BAR 0 [io 0x01f0-0x01f7]: legacy IDE quirk Jul 9 12:59:27.709587 kernel: pci 0000:00:07.1: BAR 1 [io 0x03f6]: legacy IDE quirk Jul 9 12:59:27.709636 kernel: pci 0000:00:07.1: BAR 2 [io 0x0170-0x0177]: legacy IDE quirk Jul 9 12:59:27.709684 kernel: pci 0000:00:07.1: BAR 3 [io 0x0376]: legacy IDE quirk Jul 9 12:59:27.709753 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint Jul 9 12:59:27.709815 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed by PIIX4 ACPI Jul 9 12:59:27.709867 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB Jul 9 12:59:27.709938 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000 conventional PCI endpoint Jul 9 
12:59:27.710424 kernel: pci 0000:00:07.7: BAR 0 [io 0x1080-0x10bf] Jul 9 12:59:27.710489 kernel: pci 0000:00:07.7: BAR 1 [mem 0xfebfe000-0xfebfffff 64bit] Jul 9 12:59:27.710548 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 conventional PCI endpoint Jul 9 12:59:27.710599 kernel: pci 0000:00:0f.0: BAR 0 [io 0x1070-0x107f] Jul 9 12:59:27.710649 kernel: pci 0000:00:0f.0: BAR 1 [mem 0xe8000000-0xefffffff pref] Jul 9 12:59:27.710702 kernel: pci 0000:00:0f.0: BAR 2 [mem 0xfe000000-0xfe7fffff] Jul 9 12:59:27.710751 kernel: pci 0000:00:0f.0: ROM [mem 0x00000000-0x00007fff pref] Jul 9 12:59:27.710798 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Jul 9 12:59:27.710851 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 conventional PCI bridge Jul 9 12:59:27.710900 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Jul 9 12:59:27.710953 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Jul 9 12:59:27.711008 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Jul 9 12:59:27.711060 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Jul 9 12:59:27.713329 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 9 12:59:27.713403 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Jul 9 12:59:27.713459 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Jul 9 12:59:27.713511 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Jul 9 12:59:27.713561 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold Jul 9 12:59:27.713616 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 9 12:59:27.713671 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Jul 9 12:59:27.713721 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Jul 9 12:59:27.713770 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Jul 9 12:59:27.713819 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Jul 9 12:59:27.713868 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold Jul 9 12:59:27.713924 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 9 12:59:27.713974 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Jul 9 12:59:27.714027 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Jul 9 12:59:27.714076 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Jul 9 12:59:27.714124 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Jul 9 12:59:27.714173 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold Jul 9 12:59:27.714227 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 9 12:59:27.714277 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Jul 9 12:59:27.714344 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Jul 9 12:59:27.714395 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Jul 9 12:59:27.714445 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold Jul 9 12:59:27.714499 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 9 12:59:27.714550 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Jul 9 12:59:27.714599 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Jul 9 12:59:27.714648 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Jul 9 12:59:27.714700 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold Jul 9 
12:59:27.714754 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 9 12:59:27.714803 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Jul 9 12:59:27.714852 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Jul 9 12:59:27.714902 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Jul 9 12:59:27.714951 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot D3cold Jul 9 12:59:27.715004 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 9 12:59:27.715058 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Jul 9 12:59:27.715107 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Jul 9 12:59:27.715156 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Jul 9 12:59:27.715205 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold Jul 9 12:59:27.715259 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 9 12:59:27.716104 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Jul 9 12:59:27.716167 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Jul 9 12:59:27.716220 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Jul 9 12:59:27.716274 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold Jul 9 12:59:27.716341 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 9 12:59:27.716402 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Jul 9 12:59:27.716482 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Jul 9 12:59:27.716535 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Jul 9 12:59:27.716584 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold Jul 9 12:59:27.716638 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 9 12:59:27.716692 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Jul 9 12:59:27.716741 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Jul 9 12:59:27.716790 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Jul 9 12:59:27.716839 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Jul 9 12:59:27.716889 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold Jul 9 12:59:27.716944 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 9 12:59:27.716993 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Jul 9 12:59:27.717045 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Jul 9 12:59:27.717094 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Jul 9 12:59:27.717144 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Jul 9 12:59:27.717192 kernel: pci 0000:00:16.2: PME# supported from D0 D3hot D3cold Jul 9 12:59:27.717250 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 9 12:59:27.719322 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Jul 9 12:59:27.719387 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Jul 9 12:59:27.719443 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Jul 9 12:59:27.719494 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold Jul 9 12:59:27.719549 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 9 12:59:27.719600 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Jul 9 12:59:27.719649 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Jul 9 12:59:27.719698 kernel: pci 0000:00:16.4: 
bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Jul 9 12:59:27.719747 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold Jul 9 12:59:27.719803 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 9 12:59:27.719853 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Jul 9 12:59:27.719902 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Jul 9 12:59:27.719951 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Jul 9 12:59:27.720000 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold Jul 9 12:59:27.720054 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 9 12:59:27.720105 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Jul 9 12:59:27.720157 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Jul 9 12:59:27.720206 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Jul 9 12:59:27.720255 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold Jul 9 12:59:27.722328 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 9 12:59:27.722385 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Jul 9 12:59:27.722435 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Jul 9 12:59:27.722484 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Jul 9 12:59:27.722533 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold Jul 9 12:59:27.722590 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 9 12:59:27.722639 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Jul 9 12:59:27.722688 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Jul 9 12:59:27.722737 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Jul 9 12:59:27.722787 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Jul 9 12:59:27.722842 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold Jul 9 12:59:27.722895 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 9 12:59:27.722954 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Jul 9 12:59:27.723003 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Jul 9 12:59:27.723052 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Jul 9 12:59:27.723103 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Jul 9 12:59:27.723151 kernel: pci 0000:00:17.1: PME# supported from D0 D3hot D3cold Jul 9 12:59:27.723205 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 9 12:59:27.723256 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Jul 9 12:59:27.723313 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Jul 9 12:59:27.723368 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Jul 9 12:59:27.723417 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Jul 9 12:59:27.723465 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold Jul 9 12:59:27.723522 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 9 12:59:27.723573 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Jul 9 12:59:27.723629 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Jul 9 12:59:27.723687 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Jul 9 12:59:27.723753 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold Jul 9 12:59:27.723819 kernel: pci 0000:00:17.4: [15ad:07a0] type 
01 class 0x060400 PCIe Root Port Jul 9 12:59:27.723882 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Jul 9 12:59:27.723946 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Jul 9 12:59:27.724013 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Jul 9 12:59:27.724078 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold Jul 9 12:59:27.724150 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 9 12:59:27.724201 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Jul 9 12:59:27.724251 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Jul 9 12:59:27.726290 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Jul 9 12:59:27.726361 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold Jul 9 12:59:27.726417 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 9 12:59:27.726469 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Jul 9 12:59:27.726519 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Jul 9 12:59:27.726568 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Jul 9 12:59:27.726616 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold Jul 9 12:59:27.726670 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 9 12:59:27.726724 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Jul 9 12:59:27.726774 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Jul 9 12:59:27.726822 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Jul 9 12:59:27.726876 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold Jul 9 12:59:27.726932 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 9 12:59:27.726982 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Jul 9 12:59:27.727031 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Jul 9 12:59:27.727083 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Jul 9 12:59:27.727131 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Jul 9 12:59:27.727180 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold Jul 9 12:59:27.727233 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 9 12:59:27.727284 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Jul 9 12:59:27.727400 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Jul 9 12:59:27.727654 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Jul 9 12:59:27.727727 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Jul 9 12:59:27.728037 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold Jul 9 12:59:27.728105 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 9 12:59:27.728157 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Jul 9 12:59:27.728207 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Jul 9 12:59:27.728256 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Jul 9 12:59:27.728313 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold Jul 9 12:59:27.728371 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 9 12:59:27.728422 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Jul 9 12:59:27.728471 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Jul 9 12:59:27.728520 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 
64bit pref] Jul 9 12:59:27.728569 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold Jul 9 12:59:27.728624 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 9 12:59:27.728675 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Jul 9 12:59:27.728728 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Jul 9 12:59:27.728777 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Jul 9 12:59:27.728826 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold Jul 9 12:59:27.728879 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 9 12:59:27.728928 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Jul 9 12:59:27.728977 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Jul 9 12:59:27.729026 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Jul 9 12:59:27.729078 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold Jul 9 12:59:27.729132 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 9 12:59:27.729182 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Jul 9 12:59:27.729231 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Jul 9 12:59:27.729280 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Jul 9 12:59:27.729342 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold Jul 9 12:59:27.729397 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 9 12:59:27.729710 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Jul 9 12:59:27.729770 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Jul 9 12:59:27.729827 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Jul 9 12:59:27.729877 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold Jul 9 12:59:27.729935 kernel: pci_bus 0000:01: extended config space not accessible Jul 9 12:59:27.729989 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jul 9 12:59:27.730044 kernel: pci_bus 0000:02: extended config space not accessible Jul 9 12:59:27.730053 kernel: acpiphp: Slot [32] registered Jul 9 12:59:27.730062 kernel: acpiphp: Slot [33] registered Jul 9 12:59:27.732338 kernel: acpiphp: Slot [34] registered Jul 9 12:59:27.732349 kernel: acpiphp: Slot [35] registered Jul 9 12:59:27.732356 kernel: acpiphp: Slot [36] registered Jul 9 12:59:27.732362 kernel: acpiphp: Slot [37] registered Jul 9 12:59:27.732368 kernel: acpiphp: Slot [38] registered Jul 9 12:59:27.732374 kernel: acpiphp: Slot [39] registered Jul 9 12:59:27.732379 kernel: acpiphp: Slot [40] registered Jul 9 12:59:27.732385 kernel: acpiphp: Slot [41] registered Jul 9 12:59:27.732394 kernel: acpiphp: Slot [42] registered Jul 9 12:59:27.732399 kernel: acpiphp: Slot [43] registered Jul 9 12:59:27.732406 kernel: acpiphp: Slot [44] registered Jul 9 12:59:27.732411 kernel: acpiphp: Slot [45] registered Jul 9 12:59:27.732417 kernel: acpiphp: Slot [46] registered Jul 9 12:59:27.732423 kernel: acpiphp: Slot [47] registered Jul 9 12:59:27.732429 kernel: acpiphp: Slot [48] registered Jul 9 12:59:27.732435 kernel: acpiphp: Slot [49] registered Jul 9 12:59:27.732440 kernel: acpiphp: Slot [50] registered Jul 9 12:59:27.732446 kernel: acpiphp: Slot [51] registered Jul 9 12:59:27.732453 kernel: acpiphp: Slot [52] registered Jul 9 12:59:27.732459 kernel: acpiphp: Slot [53] registered Jul 9 12:59:27.732465 kernel: acpiphp: Slot [54] registered Jul 9 12:59:27.732471 kernel: acpiphp: Slot [55] registered Jul 
9 12:59:27.732477 kernel: acpiphp: Slot [56] registered Jul 9 12:59:27.732483 kernel: acpiphp: Slot [57] registered Jul 9 12:59:27.732489 kernel: acpiphp: Slot [58] registered Jul 9 12:59:27.732495 kernel: acpiphp: Slot [59] registered Jul 9 12:59:27.732501 kernel: acpiphp: Slot [60] registered Jul 9 12:59:27.732508 kernel: acpiphp: Slot [61] registered Jul 9 12:59:27.732513 kernel: acpiphp: Slot [62] registered Jul 9 12:59:27.732519 kernel: acpiphp: Slot [63] registered Jul 9 12:59:27.732591 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Jul 9 12:59:27.732646 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode) Jul 9 12:59:27.732698 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode) Jul 9 12:59:27.732748 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode) Jul 9 12:59:27.732798 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode) Jul 9 12:59:27.732850 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode) Jul 9 12:59:27.732917 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 PCIe Endpoint Jul 9 12:59:27.732970 kernel: pci 0000:03:00.0: BAR 0 [io 0x4000-0x4007] Jul 9 12:59:27.733021 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfd5f8000-0xfd5fffff 64bit] Jul 9 12:59:27.733089 kernel: pci 0000:03:00.0: ROM [mem 0x00000000-0x0000ffff pref] Jul 9 12:59:27.733147 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Jul 9 12:59:27.733203 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force' Jul 9 12:59:27.733261 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Jul 9 12:59:27.734361 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Jul 9 12:59:27.734430 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Jul 9 12:59:27.734488 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Jul 9 12:59:27.734542 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Jul 9 12:59:27.734596 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Jul 9 12:59:27.734649 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Jul 9 12:59:27.734701 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Jul 9 12:59:27.734763 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 PCIe Endpoint Jul 9 12:59:27.734816 kernel: pci 0000:0b:00.0: BAR 0 [mem 0xfd4fc000-0xfd4fcfff] Jul 9 12:59:27.734869 kernel: pci 0000:0b:00.0: BAR 1 [mem 0xfd4fd000-0xfd4fdfff] Jul 9 12:59:27.734920 kernel: pci 0000:0b:00.0: BAR 2 [mem 0xfd4fe000-0xfd4fffff] Jul 9 12:59:27.734970 kernel: pci 0000:0b:00.0: BAR 3 [io 0x5000-0x500f] Jul 9 12:59:27.735021 kernel: pci 0000:0b:00.0: ROM [mem 0x00000000-0x0000ffff pref] Jul 9 12:59:27.735072 kernel: pci 0000:0b:00.0: supports D1 D2 Jul 9 12:59:27.735126 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold Jul 9 12:59:27.735177 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Jul 9 12:59:27.736391 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Jul 9 12:59:27.736450 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Jul 9 12:59:27.736505 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Jul 9 12:59:27.736558 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Jul 9 12:59:27.736610 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Jul 9 12:59:27.736667 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Jul 9 12:59:27.736719 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Jul 9 12:59:27.736772 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Jul 9 12:59:27.736824 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Jul 9 12:59:27.736876 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Jul 9 12:59:27.736927 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Jul 9 12:59:27.736979 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Jul 9 12:59:27.737031 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Jul 9 12:59:27.737086 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Jul 9 12:59:27.737139 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Jul 9 12:59:27.737191 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Jul 9 12:59:27.737244 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Jul 9 12:59:27.737296 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Jul 9 12:59:27.737356 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Jul 9 12:59:27.737409 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Jul 9 12:59:27.737462 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Jul 9 12:59:27.737515 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Jul 9 12:59:27.737567 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Jul 9 12:59:27.737618 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Jul 9 12:59:27.737627 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9 Jul 9 12:59:27.737634 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0 Jul 9 12:59:27.737640 kernel: ACPI: PCI: Interrupt link LNKB disabled Jul 9 12:59:27.737646 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Jul 9 12:59:27.737654 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10 Jul 9 12:59:27.737660 kernel: iommu: Default domain type: Translated Jul 9 12:59:27.737666 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Jul 9 12:59:27.737671 kernel: PCI: Using ACPI for IRQ routing Jul 9 12:59:27.737677 kernel: PCI: pci_cache_line_size set to 64 bytes Jul 9 12:59:27.737684 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff] Jul 9 12:59:27.737689 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff] Jul 9 12:59:27.737739 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device Jul 9 12:59:27.737788 kernel: pci 0000:00:0f.0: vgaarb: bridge control possible Jul 9 12:59:27.737838 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Jul 9 12:59:27.737847 kernel: vgaarb: loaded Jul 9 12:59:27.737853 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 Jul 9 12:59:27.737859 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter Jul 9 12:59:27.737865 kernel: clocksource: Switched to clocksource tsc-early Jul 9 12:59:27.737871 kernel: VFS: Disk quotas dquot_6.6.0 Jul 9 12:59:27.737877 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jul 9 12:59:27.737883 kernel: pnp: PnP ACPI init Jul 9 12:59:27.737937 kernel: system 00:00: [io 0x1000-0x103f] has been reserved Jul 9 12:59:27.737987 kernel: system 00:00: [io 0x1040-0x104f] has been reserved Jul 9 
12:59:27.738032 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved Jul 9 12:59:27.738081 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved Jul 9 12:59:27.738130 kernel: pnp 00:06: [dma 2] Jul 9 12:59:27.738181 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved Jul 9 12:59:27.738227 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved Jul 9 12:59:27.738275 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved Jul 9 12:59:27.738283 kernel: pnp: PnP ACPI: found 8 devices Jul 9 12:59:27.738289 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Jul 9 12:59:27.738295 kernel: NET: Registered PF_INET protocol family Jul 9 12:59:27.738872 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Jul 9 12:59:27.738882 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Jul 9 12:59:27.738888 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jul 9 12:59:27.738894 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Jul 9 12:59:27.738906 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Jul 9 12:59:27.738912 kernel: TCP: Hash tables configured (established 16384 bind 16384) Jul 9 12:59:27.738918 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Jul 9 12:59:27.738924 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Jul 9 12:59:27.738930 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jul 9 12:59:27.738936 kernel: NET: Registered PF_XDP protocol family Jul 9 12:59:27.738999 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 Jul 9 12:59:27.739055 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Jul 9 12:59:27.739110 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Jul 9 12:59:27.739165 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Jul 9 12:59:27.739217 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Jul 9 12:59:27.739269 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Jul 9 12:59:27.739335 kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Jul 9 12:59:27.739388 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Jul 9 12:59:27.739439 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Jul 9 12:59:27.739490 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Jul 9 12:59:27.739543 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Jul 9 12:59:27.739594 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Jul 9 12:59:27.739646 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Jul 9 12:59:27.739698 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Jul 9 12:59:27.739749 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Jul 9 12:59:27.739801 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Jul 9 12:59:27.739853 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 
1000 Jul 9 12:59:27.739904 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Jul 9 12:59:27.739958 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Jul 9 12:59:27.740009 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Jul 9 12:59:27.740061 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Jul 9 12:59:27.740113 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Jul 9 12:59:27.740165 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000 Jul 9 12:59:27.740215 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref]: assigned Jul 9 12:59:27.740264 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref]: assigned Jul 9 12:59:27.740650 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.740711 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.740764 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.740815 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.740867 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.740917 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.740969 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.741020 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.741072 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.741124 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.741174 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.741224 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.741275 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.741340 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.741393 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.741452 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.741625 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.741683 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.741735 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.741785 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.741837 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.741888 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.741944 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.741996 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.742049 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.742099 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.742151 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: can't 
assign; no space Jul 9 12:59:27.742201 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.742252 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.742317 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.742377 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.742427 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.742482 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.742532 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.742583 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.742633 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.742686 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.742736 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.742787 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.742837 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.742891 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.742948 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.742996 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.743046 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.743095 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.743144 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.743193 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.743243 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.743292 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.743354 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.743404 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.743453 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.743501 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.743550 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.743600 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.743648 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.743697 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.743746 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.743798 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.743847 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.743897 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.743947 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.743997 kernel: pci 
0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.744046 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.744097 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.744146 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.744196 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.744245 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.744299 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.744372 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.744422 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.744472 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.744522 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.744572 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.744625 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.744675 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.744726 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.744775 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.744825 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.744875 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.744927 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.744977 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.745027 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space Jul 9 12:59:27.745080 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign Jul 9 12:59:27.745130 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jul 9 12:59:27.745181 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] Jul 9 12:59:27.745230 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Jul 9 12:59:27.745279 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Jul 9 12:59:27.745407 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Jul 9 12:59:27.745465 kernel: pci 0000:03:00.0: ROM [mem 0xfd500000-0xfd50ffff pref]: assigned Jul 9 12:59:27.745517 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Jul 9 12:59:27.747033 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Jul 9 12:59:27.747097 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Jul 9 12:59:27.747163 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref] Jul 9 12:59:27.747217 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Jul 9 12:59:27.747270 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Jul 9 12:59:27.747348 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Jul 9 12:59:27.747427 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Jul 9 12:59:27.747480 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Jul 9 12:59:27.747531 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Jul 9 12:59:27.747581 kernel: pci 
0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Jul 9 12:59:27.747634 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Jul 9 12:59:27.747685 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Jul 9 12:59:27.747735 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Jul 9 12:59:27.747785 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Jul 9 12:59:27.747836 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Jul 9 12:59:27.747886 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Jul 9 12:59:27.747942 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Jul 9 12:59:27.747994 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Jul 9 12:59:27.748046 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Jul 9 12:59:27.748096 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Jul 9 12:59:27.748147 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Jul 9 12:59:27.748196 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Jul 9 12:59:27.748245 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Jul 9 12:59:27.748297 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Jul 9 12:59:27.748400 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Jul 9 12:59:27.748451 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Jul 9 12:59:27.748509 kernel: pci 0000:0b:00.0: ROM [mem 0xfd400000-0xfd40ffff pref]: assigned Jul 9 12:59:27.748561 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Jul 9 12:59:27.748611 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Jul 9 12:59:27.748660 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Jul 9 12:59:27.748710 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref] Jul 9 12:59:27.748760 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Jul 9 12:59:27.748811 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Jul 9 12:59:27.748860 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Jul 9 12:59:27.748912 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Jul 9 12:59:27.748966 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Jul 9 12:59:27.749015 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Jul 9 12:59:27.749065 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Jul 9 12:59:27.749115 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Jul 9 12:59:27.749168 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Jul 9 12:59:27.749218 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Jul 9 12:59:27.749267 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Jul 9 12:59:27.750214 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Jul 9 12:59:27.750275 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Jul 9 12:59:27.750339 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Jul 9 12:59:27.750393 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Jul 9 12:59:27.750445 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Jul 9 12:59:27.750495 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Jul 9 12:59:27.750547 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Jul 9 12:59:27.750596 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Jul 9 
12:59:27.750650 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Jul 9 12:59:27.750702 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Jul 9 12:59:27.750752 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Jul 9 12:59:27.750801 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Jul 9 12:59:27.750853 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Jul 9 12:59:27.750907 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Jul 9 12:59:27.750959 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Jul 9 12:59:27.751009 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Jul 9 12:59:27.751062 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Jul 9 12:59:27.751113 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Jul 9 12:59:27.751162 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Jul 9 12:59:27.751211 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Jul 9 12:59:27.751262 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Jul 9 12:59:27.753329 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Jul 9 12:59:27.753410 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Jul 9 12:59:27.753466 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Jul 9 12:59:27.753521 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Jul 9 12:59:27.753573 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Jul 9 12:59:27.753628 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Jul 9 12:59:27.753680 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Jul 9 12:59:27.754087 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Jul 9 12:59:27.754143 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Jul 9 12:59:27.754198 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Jul 9 12:59:27.754249 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Jul 9 12:59:27.754312 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Jul 9 12:59:27.754381 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Jul 9 12:59:27.754432 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Jul 9 12:59:27.754482 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Jul 9 12:59:27.754869 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Jul 9 12:59:27.754925 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Jul 9 12:59:27.754978 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Jul 9 12:59:27.755031 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Jul 9 12:59:27.755086 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Jul 9 12:59:27.755137 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Jul 9 12:59:27.755186 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Jul 9 12:59:27.755238 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Jul 9 12:59:27.755288 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Jul 9 12:59:27.755391 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Jul 9 12:59:27.755443 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Jul 9 12:59:27.755495 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Jul 9 12:59:27.755544 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Jul 9 
12:59:27.755595 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Jul 9 12:59:27.755650 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Jul 9 12:59:27.755700 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Jul 9 12:59:27.755750 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Jul 9 12:59:27.756010 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Jul 9 12:59:27.756066 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Jul 9 12:59:27.756117 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Jul 9 12:59:27.756173 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Jul 9 12:59:27.756224 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Jul 9 12:59:27.756275 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Jul 9 12:59:27.756351 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Jul 9 12:59:27.756404 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Jul 9 12:59:27.756454 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Jul 9 12:59:27.756507 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Jul 9 12:59:27.756911 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Jul 9 12:59:27.756970 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Jul 9 12:59:27.757022 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] Jul 9 12:59:27.757068 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] Jul 9 12:59:27.757112 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] Jul 9 12:59:27.757155 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] Jul 9 12:59:27.757198 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] Jul 9 12:59:27.757247 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] Jul 9 12:59:27.757296 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] Jul 9 12:59:27.757705 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] Jul 9 12:59:27.757755 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] Jul 9 12:59:27.757801 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] Jul 9 12:59:27.757848 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] Jul 9 12:59:27.757896 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] Jul 9 12:59:27.757941 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] Jul 9 12:59:27.757994 kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff] Jul 9 12:59:27.758041 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] Jul 9 12:59:27.758086 kernel: pci_bus 0000:03: resource 2 [mem 0xc0000000-0xc01fffff 64bit pref] Jul 9 12:59:27.758136 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] Jul 9 12:59:27.758182 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] Jul 9 12:59:27.758227 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] Jul 9 12:59:27.758280 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] Jul 9 12:59:27.758355 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] Jul 9 12:59:27.758407 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] Jul 9 12:59:27.758458 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] Jul 9 12:59:27.758504 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] Jul 9 12:59:27.758553 kernel: 
pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] Jul 9 12:59:27.758599 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] Jul 9 12:59:27.758651 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] Jul 9 12:59:27.758697 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] Jul 9 12:59:27.758746 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] Jul 9 12:59:27.758791 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] Jul 9 12:59:27.758843 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] Jul 9 12:59:27.758888 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] Jul 9 12:59:27.758942 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] Jul 9 12:59:27.758988 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] Jul 9 12:59:27.759033 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] Jul 9 12:59:27.759082 kernel: pci_bus 0000:0c: resource 0 [io 0x9000-0x9fff] Jul 9 12:59:27.759127 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] Jul 9 12:59:27.759172 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 64bit pref] Jul 9 12:59:27.759223 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] Jul 9 12:59:27.759269 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] Jul 9 12:59:27.759540 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] Jul 9 12:59:27.759595 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] Jul 9 12:59:27.759642 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] Jul 9 12:59:27.759694 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] Jul 9 12:59:27.759740 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] Jul 9 12:59:27.759793 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] Jul 9 12:59:27.759839 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] Jul 9 12:59:27.759888 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] Jul 9 12:59:27.759934 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] Jul 9 12:59:27.759983 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] Jul 9 12:59:27.760029 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] Jul 9 12:59:27.760082 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] Jul 9 12:59:27.760128 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] Jul 9 12:59:27.760173 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] Jul 9 12:59:27.760222 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] Jul 9 12:59:27.760390 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] Jul 9 12:59:27.760437 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] Jul 9 12:59:27.760486 kernel: pci_bus 0000:15: resource 0 [io 0xe000-0xefff] Jul 9 12:59:27.760535 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] Jul 9 12:59:27.760580 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] Jul 9 12:59:27.760629 kernel: pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] Jul 9 12:59:27.760674 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] Jul 9 12:59:27.760726 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] Jul 9 12:59:27.760771 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] Jul 9 
12:59:27.760824 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] Jul 9 12:59:27.760869 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] Jul 9 12:59:27.760918 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] Jul 9 12:59:27.760962 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] Jul 9 12:59:27.761012 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] Jul 9 12:59:27.761057 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] Jul 9 12:59:27.761108 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Jul 9 12:59:27.761153 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] Jul 9 12:59:27.761198 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] Jul 9 12:59:27.761249 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] Jul 9 12:59:27.761293 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] Jul 9 12:59:27.761349 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] Jul 9 12:59:27.761398 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] Jul 9 12:59:27.761447 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] Jul 9 12:59:27.761498 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] Jul 9 12:59:27.761543 kernel: pci_bus 0000:1e: resource 2 [mem 0xe6d00000-0xe6dfffff 64bit pref] Jul 9 12:59:27.761593 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] Jul 9 12:59:27.761639 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] Jul 9 12:59:27.761690 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] Jul 9 12:59:27.761738 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] Jul 9 12:59:27.761788 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] Jul 9 12:59:27.761833 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] Jul 9 12:59:27.761882 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] Jul 9 12:59:27.761927 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] Jul 9 12:59:27.761984 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Jul 9 12:59:27.761994 kernel: PCI: CLS 32 bytes, default 64 Jul 9 12:59:27.762003 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Jul 9 12:59:27.762010 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Jul 9 12:59:27.762016 kernel: clocksource: Switched to clocksource tsc Jul 9 12:59:27.762022 kernel: Initialise system trusted keyrings Jul 9 12:59:27.762028 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Jul 9 12:59:27.762034 kernel: Key type asymmetric registered Jul 9 12:59:27.762040 kernel: Asymmetric key parser 'x509' registered Jul 9 12:59:27.762045 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Jul 9 12:59:27.762051 kernel: io scheduler mq-deadline registered Jul 9 12:59:27.762059 kernel: io scheduler kyber registered Jul 9 12:59:27.762064 kernel: io scheduler bfq registered Jul 9 12:59:27.762119 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 Jul 9 12:59:27.762172 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 9 12:59:27.762224 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 Jul 9 12:59:27.762274 kernel: pcieport 
0000:00:15.1: pciehp: Slot #161 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 9 12:59:27.762347 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 Jul 9 12:59:27.762403 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 9 12:59:27.762455 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 Jul 9 12:59:27.762505 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 9 12:59:27.762556 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 Jul 9 12:59:27.762607 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 9 12:59:27.762659 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 Jul 9 12:59:27.762709 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 9 12:59:27.762763 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 Jul 9 12:59:27.762814 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 9 12:59:27.762867 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 Jul 9 12:59:27.762929 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 9 12:59:27.762984 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 Jul 9 12:59:27.763034 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 9 12:59:27.763086 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 Jul 9 12:59:27.763139 kernel: pcieport 0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 9 12:59:27.763190 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 Jul 9 12:59:27.763240 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 9 12:59:27.763291 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 Jul 9 12:59:27.763365 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 9 12:59:27.763429 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 Jul 9 12:59:27.763482 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 9 12:59:27.763535 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 Jul 9 12:59:27.763588 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 9 12:59:27.763640 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 Jul 9 12:59:27.763691 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 9 12:59:27.763746 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 Jul 9 12:59:27.763798 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ 
PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 9 12:59:27.763852 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 Jul 9 12:59:27.763903 kernel: pcieport 0000:00:17.0: pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 9 12:59:27.763957 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 Jul 9 12:59:27.764007 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 9 12:59:27.764059 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42 Jul 9 12:59:27.764110 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 9 12:59:27.764161 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43 Jul 9 12:59:27.764211 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 9 12:59:27.764263 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44 Jul 9 12:59:27.764334 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 9 12:59:27.764392 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45 Jul 9 12:59:27.764442 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 9 12:59:27.764496 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46 Jul 9 12:59:27.764546 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 9 12:59:27.764599 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47 Jul 9 12:59:27.764649 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 9 12:59:27.764701 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48 Jul 9 12:59:27.764754 kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 9 12:59:27.764806 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49 Jul 9 12:59:27.764856 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 9 12:59:27.764908 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50 Jul 9 12:59:27.764959 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 9 12:59:27.765011 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51 Jul 9 12:59:27.765061 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 9 12:59:27.765112 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52 Jul 9 12:59:27.765164 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 9 12:59:27.765216 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53 Jul 9 12:59:27.765266 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ 
Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 9 12:59:27.765331 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54 Jul 9 12:59:27.765391 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 9 12:59:27.765443 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55 Jul 9 12:59:27.765494 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 9 12:59:27.765505 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Jul 9 12:59:27.765513 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jul 9 12:59:27.765520 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Jul 9 12:59:27.765526 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 Jul 9 12:59:27.765533 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Jul 9 12:59:27.765539 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Jul 9 12:59:27.765592 kernel: rtc_cmos 00:01: registered as rtc0 Jul 9 12:59:27.765642 kernel: rtc_cmos 00:01: setting system clock to 2025-07-09T12:59:27 UTC (1752065967) Jul 9 12:59:27.765688 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram Jul 9 12:59:27.765697 kernel: intel_pstate: CPU model not supported Jul 9 12:59:27.765703 kernel: NET: Registered PF_INET6 protocol family Jul 9 12:59:27.765710 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1 Jul 9 12:59:27.765716 kernel: Segment Routing with IPv6 Jul 9 12:59:27.765723 kernel: In-situ OAM (IOAM) with IPv6 Jul 9 12:59:27.765729 kernel: NET: Registered PF_PACKET protocol family Jul 9 12:59:27.765735 kernel: Key type dns_resolver registered Jul 9 12:59:27.765743 kernel: IPI shorthand broadcast: enabled Jul 9 12:59:27.765749 kernel: sched_clock: Marking stable (2676003330, 171476718)->(2862191815, -14711767) Jul 9 12:59:27.765756 kernel: registered taskstats version 1 Jul 9 12:59:27.765762 kernel: Loading compiled-in X.509 certificates Jul 9 12:59:27.765768 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.36-flatcar: 8ba3d283fde4a005aa35ab9394afe8122b8a3878' Jul 9 12:59:27.765774 kernel: Demotion targets for Node 0: null Jul 9 12:59:27.765780 kernel: Key type .fscrypt registered Jul 9 12:59:27.765787 kernel: Key type fscrypt-provisioning registered Jul 9 12:59:27.765793 kernel: ima: No TPM chip found, activating TPM-bypass! Jul 9 12:59:27.765800 kernel: ima: Allocated hash algorithm: sha1 Jul 9 12:59:27.765807 kernel: ima: No architecture policies found Jul 9 12:59:27.765814 kernel: clk: Disabling unused clocks Jul 9 12:59:27.765820 kernel: Warning: unable to open an initial console. Jul 9 12:59:27.765827 kernel: Freeing unused kernel image (initmem) memory: 54568K Jul 9 12:59:27.765833 kernel: Write protecting the kernel read-only data: 24576k Jul 9 12:59:27.765839 kernel: Freeing unused kernel image (rodata/data gap) memory: 284K Jul 9 12:59:27.765846 kernel: Run /init as init process Jul 9 12:59:27.765852 kernel: with arguments: Jul 9 12:59:27.765860 kernel: /init Jul 9 12:59:27.765866 kernel: with environment: Jul 9 12:59:27.765872 kernel: HOME=/ Jul 9 12:59:27.765878 kernel: TERM=linux Jul 9 12:59:27.765884 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Jul 9 12:59:27.765892 systemd[1]: Successfully made /usr/ read-only. 
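(Annotation.) The PCI lines earlier in this log assign each bridge a window such as [mem 0xfd500000-0xfd5fffff]; the window size is just end - start + 1. A small Python sketch, purely illustrative and not part of the boot flow, that decodes one such line copied from the log above:

import re

# A bridge-window line copied verbatim from the kernel log above.
line = "pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff]"

m = re.search(r"\[(mem|io) 0x([0-9a-f]+)-0x([0-9a-f]+)", line)
kind = m.group(1)
start = int(m.group(2), 16)
end = int(m.group(3), 16)

# 0xfd500000-0xfd5fffff spans exactly 1 MiB.
print(kind, hex(start), "-", hex(end), "size:", (end - start + 1) // 1024, "KiB")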
Jul 9 12:59:27.765900 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jul 9 12:59:27.765910 systemd[1]: Detected virtualization vmware. Jul 9 12:59:27.765919 systemd[1]: Detected architecture x86-64. Jul 9 12:59:27.765925 systemd[1]: Running in initrd. Jul 9 12:59:27.765931 systemd[1]: No hostname configured, using default hostname. Jul 9 12:59:27.765938 systemd[1]: Hostname set to . Jul 9 12:59:27.765944 systemd[1]: Initializing machine ID from random generator. Jul 9 12:59:27.765950 systemd[1]: Queued start job for default target initrd.target. Jul 9 12:59:27.765956 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jul 9 12:59:27.765963 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jul 9 12:59:27.765971 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jul 9 12:59:27.765978 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jul 9 12:59:27.765984 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jul 9 12:59:27.765991 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jul 9 12:59:27.765998 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Jul 9 12:59:27.766005 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Jul 9 12:59:27.766012 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jul 9 12:59:27.766019 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jul 9 12:59:27.766025 systemd[1]: Reached target paths.target - Path Units. Jul 9 12:59:27.766032 systemd[1]: Reached target slices.target - Slice Units. Jul 9 12:59:27.766039 systemd[1]: Reached target swap.target - Swaps. Jul 9 12:59:27.766045 systemd[1]: Reached target timers.target - Timer Units. Jul 9 12:59:27.766051 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jul 9 12:59:27.766058 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jul 9 12:59:27.766064 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jul 9 12:59:27.766073 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jul 9 12:59:27.766079 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jul 9 12:59:27.766086 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jul 9 12:59:27.766092 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jul 9 12:59:27.766099 systemd[1]: Reached target sockets.target - Socket Units. Jul 9 12:59:27.766105 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jul 9 12:59:27.766111 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jul 9 12:59:27.766118 systemd[1]: Finished network-cleanup.service - Network Cleanup. 
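(Annotation.) The device units above, e.g. dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device, are systemd's escaped form of paths like /dev/disk/by-label/EFI-SYSTEM: '/' maps to '-' and other special characters to \xNN. A simplified Python sketch of that escaping, covering only the kinds of paths seen in this log (the real systemd-escape rules handle additional cases such as leading dots and repeated slashes):

def escape_path(path):
    """Roughly mimic systemd's path escaping for .device unit names."""
    out = []
    for ch in path.strip("/"):
        if ch.isalnum() or ch == "_":
            out.append(ch)
        elif ch == "/":
            out.append("-")                   # path separators become '-'
        else:
            out.append("\\x%02x" % ord(ch))   # everything else becomes \xNN
    return "".join(out)

print(escape_path("/dev/disk/by-label/EFI-SYSTEM") + ".device")
# -> dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device, as in the unit name above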
Jul 9 12:59:27.766125 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Jul 9 12:59:27.766133 systemd[1]: Starting systemd-fsck-usr.service... Jul 9 12:59:27.766139 systemd[1]: Starting systemd-journald.service - Journal Service... Jul 9 12:59:27.766146 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jul 9 12:59:27.766152 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jul 9 12:59:27.766159 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jul 9 12:59:27.766179 systemd-journald[243]: Collecting audit messages is disabled. Jul 9 12:59:27.766196 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jul 9 12:59:27.766203 systemd[1]: Finished systemd-fsck-usr.service. Jul 9 12:59:27.766211 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jul 9 12:59:27.766217 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jul 9 12:59:27.766224 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jul 9 12:59:27.766230 kernel: Bridge firewalling registered Jul 9 12:59:27.766237 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jul 9 12:59:27.766243 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jul 9 12:59:27.766250 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jul 9 12:59:27.766257 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jul 9 12:59:27.766265 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jul 9 12:59:27.766271 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jul 9 12:59:27.766278 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jul 9 12:59:27.766285 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jul 9 12:59:27.766292 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jul 9 12:59:27.766299 systemd-journald[243]: Journal started Jul 9 12:59:27.766329 systemd-journald[243]: Runtime Journal (/run/log/journal/da207275b91c46219a644a93b1429703) is 4.8M, max 38.8M, 34M free. Jul 9 12:59:27.707279 systemd-modules-load[245]: Inserted module 'overlay' Jul 9 12:59:27.735408 systemd-modules-load[245]: Inserted module 'br_netfilter' Jul 9 12:59:27.768320 systemd[1]: Started systemd-journald.service - Journal Service. Jul 9 12:59:27.773766 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jul 9 12:59:27.779754 systemd-tmpfiles[280]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Jul 9 12:59:27.781562 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. 
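(Annotation.) systemd-modules-load reports each module it inserts ('overlay' and 'br_netfilter' above). A throwaway Python sketch, shown only as an illustration, that pulls those module names out of journal text copied from this log:

import re

# Two lines copied from the journal output above.
sample = """\
Jul 9 12:59:27.707279 systemd-modules-load[245]: Inserted module 'overlay'
Jul 9 12:59:27.735408 systemd-modules-load[245]: Inserted module 'br_netfilter'
"""

for name in re.findall(r"Inserted module '([^']+)'", sample):
    print(name)   # overlay, br_netfilter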
Jul 9 12:59:27.782646 dracut-cmdline[270]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=f85d3be94c634d7d72fbcd0e670073ce56ae2e0cc763f83b329300b7cea5203d Jul 9 12:59:27.784403 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jul 9 12:59:27.808692 systemd-resolved[294]: Positive Trust Anchors: Jul 9 12:59:27.808943 systemd-resolved[294]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jul 9 12:59:27.809110 systemd-resolved[294]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jul 9 12:59:27.811379 systemd-resolved[294]: Defaulting to hostname 'linux'. Jul 9 12:59:27.812079 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jul 9 12:59:27.812355 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jul 9 12:59:27.832321 kernel: SCSI subsystem initialized Jul 9 12:59:27.849325 kernel: Loading iSCSI transport class v2.0-870. Jul 9 12:59:27.857320 kernel: iscsi: registered transport (tcp) Jul 9 12:59:27.879319 kernel: iscsi: registered transport (qla4xxx) Jul 9 12:59:27.879360 kernel: QLogic iSCSI HBA Driver Jul 9 12:59:27.890053 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jul 9 12:59:27.899990 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jul 9 12:59:27.901154 systemd[1]: Reached target network-pre.target - Preparation for Network. Jul 9 12:59:27.923457 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jul 9 12:59:27.924433 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jul 9 12:59:27.969343 kernel: raid6: avx2x4 gen() 38367 MB/s Jul 9 12:59:27.986323 kernel: raid6: avx2x2 gen() 51785 MB/s Jul 9 12:59:28.003579 kernel: raid6: avx2x1 gen() 44625 MB/s Jul 9 12:59:28.003625 kernel: raid6: using algorithm avx2x2 gen() 51785 MB/s Jul 9 12:59:28.021642 kernel: raid6: .... xor() 31157 MB/s, rmw enabled Jul 9 12:59:28.021661 kernel: raid6: using avx2x2 recovery algorithm Jul 9 12:59:28.035314 kernel: xor: automatically using best checksumming function avx Jul 9 12:59:28.139322 kernel: Btrfs loaded, zoned=no, fsverity=no Jul 9 12:59:28.142535 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jul 9 12:59:28.143484 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jul 9 12:59:28.165197 systemd-udevd[492]: Using default interface naming scheme 'v255'. Jul 9 12:59:28.168691 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. 
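(Annotation.) dracut-cmdline echoes the full kernel command line it will honor. Splitting it into parameters is mostly whitespace tokenizing plus a single '=' split; the sketch below is simplified (real kernel parsing also handles quoting) and uses an abbreviated copy of the command line printed above:

# Abbreviated copy of the command line printed by dracut-cmdline above.
cmdline = ("rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro root=LABEL=ROOT "
           "console=ttyS0,115200n8 console=tty0 flatcar.oem.id=vmware flatcar.autologin")

params, flags = {}, []
for tok in cmdline.split():
    if "=" in tok:
        key, value = tok.split("=", 1)
        params.setdefault(key, []).append(value)   # 'console' appears twice
    else:
        flags.append(tok)                          # bare switches like flatcar.autologin

print(params["root"])      # ['LABEL=ROOT']
print(params["console"])   # ['ttyS0,115200n8', 'tty0']
print(flags)               # ['flatcar.autologin']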
Jul 9 12:59:28.169693 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jul 9 12:59:28.184826 dracut-pre-trigger[498]: rd.md=0: removing MD RAID activation Jul 9 12:59:28.198575 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jul 9 12:59:28.199528 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jul 9 12:59:28.271876 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jul 9 12:59:28.273533 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jul 9 12:59:28.341321 kernel: VMware PVSCSI driver - version 1.0.7.0-k Jul 9 12:59:28.348333 kernel: vmw_pvscsi: using 64bit dma Jul 9 12:59:28.349450 kernel: vmw_pvscsi: max_id: 16 Jul 9 12:59:28.349467 kernel: vmw_pvscsi: setting ring_pages to 8 Jul 9 12:59:28.358780 kernel: vmw_pvscsi: enabling reqCallThreshold Jul 9 12:59:28.358808 kernel: vmw_pvscsi: driver-based request coalescing enabled Jul 9 12:59:28.358818 kernel: vmw_pvscsi: using MSI-X Jul 9 12:59:28.363317 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254 Jul 9 12:59:28.369415 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0 Jul 9 12:59:28.374316 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6 Jul 9 12:59:28.376733 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jul 9 12:59:28.377031 (udev-worker)[537]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte. Jul 9 12:59:28.377091 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jul 9 12:59:28.377808 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jul 9 12:59:28.379495 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jul 9 12:59:28.384722 kernel: VMware vmxnet3 virtual NIC driver - version 1.9.0.0-k-NAPI Jul 9 12:59:28.384737 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2 Jul 9 12:59:28.388312 kernel: cryptd: max_cpu_qlen set to 1000 Jul 9 12:59:28.390318 kernel: libata version 3.00 loaded. Jul 9 12:59:28.393319 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps Jul 9 12:59:28.399316 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB) Jul 9 12:59:28.399437 kernel: sd 0:0:0:0: [sda] Write Protect is off Jul 9 12:59:28.399505 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00 Jul 9 12:59:28.399567 kernel: sd 0:0:0:0: [sda] Cache data unavailable Jul 9 12:59:28.399626 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through Jul 9 12:59:28.404326 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0 Jul 9 12:59:28.409210 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
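(Annotation.) The sd line above reports 17805312 512-byte logical blocks for sda and prints the size in both decimal gigabytes and binary gibibytes; a two-line check that the arithmetic matches the "(9.12 GB/8.49 GiB)" in the log:

blocks = 17805312                 # 512-byte logical blocks reported for sda above
size = blocks * 512               # 9_116_319_744 bytes

print(round(size / 10**9, 2), "GB")    # 9.12, decimal units as the kernel prints first
print(round(size / 2**30, 2), "GiB")   # 8.49, binary units in parentheses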
Jul 9 12:59:28.410354 kernel: ata_piix 0000:00:07.1: version 2.13 Jul 9 12:59:28.410465 kernel: scsi host1: ata_piix Jul 9 12:59:28.412338 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input3 Jul 9 12:59:28.412355 kernel: scsi host2: ata_piix Jul 9 12:59:28.413566 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14 lpm-pol 0 Jul 9 12:59:28.413583 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15 lpm-pol 0 Jul 9 12:59:28.415325 kernel: AES CTR mode by8 optimization enabled Jul 9 12:59:28.420772 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jul 9 12:59:28.420804 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Jul 9 12:59:28.587327 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33 Jul 9 12:59:28.590312 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5 Jul 9 12:59:28.612319 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray Jul 9 12:59:28.612470 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Jul 9 12:59:28.620860 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Jul 9 12:59:28.639017 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT. Jul 9 12:59:28.644742 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM. Jul 9 12:59:28.650218 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Jul 9 12:59:28.654604 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A. Jul 9 12:59:28.654744 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A. Jul 9 12:59:28.656367 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jul 9 12:59:28.699371 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jul 9 12:59:28.715330 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jul 9 12:59:28.891321 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jul 9 12:59:28.891680 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jul 9 12:59:28.891820 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jul 9 12:59:28.892015 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jul 9 12:59:28.892673 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jul 9 12:59:28.905688 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jul 9 12:59:29.713439 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jul 9 12:59:29.714260 disk-uuid[647]: The operation has completed successfully. Jul 9 12:59:29.754783 systemd[1]: disk-uuid.service: Deactivated successfully. Jul 9 12:59:29.754855 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jul 9 12:59:29.774056 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Jul 9 12:59:29.788230 sh[677]: Success Jul 9 12:59:29.802479 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Jul 9 12:59:29.802521 kernel: device-mapper: uevent: version 1.0.3 Jul 9 12:59:29.803638 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jul 9 12:59:29.810339 kernel: device-mapper: verity: sha256 using shash "sha256-avx2" Jul 9 12:59:29.848486 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. 
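(Annotation.) The "Found device ..." entries above correspond to udev's by-label symlinks for this disk (ROOT, EFI-SYSTEM, OEM). On a running Linux host the same mapping can be read back from /dev/disk/by-label; a sketch that assumes such a host with udev, not this initrd environment:

import os

base = "/dev/disk/by-label"            # populated by udev, matching the entries above
for name in sorted(os.listdir(base)):  # e.g. EFI-SYSTEM, OEM, ROOT on this image
    target = os.path.realpath(os.path.join(base, name))
    print(f"{name} -> {target}")       # e.g. ROOT -> /dev/sda9 on this system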
Jul 9 12:59:29.851345 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Jul 9 12:59:29.863291 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Jul 9 12:59:29.879318 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay' Jul 9 12:59:29.882317 kernel: BTRFS: device fsid 082bcfbc-2c86-46fe-87f4-85dea5450235 devid 1 transid 36 /dev/mapper/usr (254:0) scanned by mount (689) Jul 9 12:59:29.886119 kernel: BTRFS info (device dm-0): first mount of filesystem 082bcfbc-2c86-46fe-87f4-85dea5450235 Jul 9 12:59:29.886146 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Jul 9 12:59:29.886154 kernel: BTRFS info (device dm-0): using free-space-tree Jul 9 12:59:29.897338 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Jul 9 12:59:29.897685 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jul 9 12:59:29.898281 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments... Jul 9 12:59:29.900384 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jul 9 12:59:29.930946 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (712) Jul 9 12:59:29.930980 kernel: BTRFS info (device sda6): first mount of filesystem 87056a6c-ee99-487a-9330-f1335025b841 Jul 9 12:59:29.930989 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jul 9 12:59:29.930997 kernel: BTRFS info (device sda6): using free-space-tree Jul 9 12:59:29.945406 kernel: BTRFS info (device sda6): last unmount of filesystem 87056a6c-ee99-487a-9330-f1335025b841 Jul 9 12:59:29.946511 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jul 9 12:59:29.949411 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jul 9 12:59:30.017972 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Jul 9 12:59:30.018857 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jul 9 12:59:30.079831 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jul 9 12:59:30.081095 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jul 9 12:59:30.086957 ignition[731]: Ignition 2.21.0 Jul 9 12:59:30.086967 ignition[731]: Stage: fetch-offline Jul 9 12:59:30.086986 ignition[731]: no configs at "/usr/lib/ignition/base.d" Jul 9 12:59:30.086991 ignition[731]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jul 9 12:59:30.087040 ignition[731]: parsed url from cmdline: "" Jul 9 12:59:30.087042 ignition[731]: no config URL provided Jul 9 12:59:30.087045 ignition[731]: reading system config file "/usr/lib/ignition/user.ign" Jul 9 12:59:30.087050 ignition[731]: no config at "/usr/lib/ignition/user.ign" Jul 9 12:59:30.087466 ignition[731]: config successfully fetched Jul 9 12:59:30.087486 ignition[731]: parsing config with SHA512: 23b9dc0d866bf0dbe633bfcb0ed06f6ca3195df6ba06d4906bb6059e6aa9d0b5b503ab8e28cf69b333c854ea2e3c5ec0b58961836143d2bef1d19e2585aeeec2 Jul 9 12:59:30.091584 unknown[731]: fetched base config from "system" Jul 9 12:59:30.092326 unknown[731]: fetched user config from "vmware" Jul 9 12:59:30.092554 ignition[731]: fetch-offline: fetch-offline passed Jul 9 12:59:30.092593 ignition[731]: Ignition finished successfully Jul 9 12:59:30.094090 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). 
Jul 9 12:59:30.108686 systemd-networkd[867]: lo: Link UP Jul 9 12:59:30.108693 systemd-networkd[867]: lo: Gained carrier Jul 9 12:59:30.109427 systemd-networkd[867]: Enumeration completed Jul 9 12:59:30.109505 systemd[1]: Started systemd-networkd.service - Network Configuration. Jul 9 12:59:30.109889 systemd-networkd[867]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network. Jul 9 12:59:30.112462 systemd[1]: Reached target network.target - Network. Jul 9 12:59:30.113628 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Jul 9 12:59:30.113740 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Jul 9 12:59:30.113123 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Jul 9 12:59:30.113212 systemd-networkd[867]: ens192: Link UP Jul 9 12:59:30.113214 systemd-networkd[867]: ens192: Gained carrier Jul 9 12:59:30.114500 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jul 9 12:59:30.130327 ignition[871]: Ignition 2.21.0 Jul 9 12:59:30.130335 ignition[871]: Stage: kargs Jul 9 12:59:30.130426 ignition[871]: no configs at "/usr/lib/ignition/base.d" Jul 9 12:59:30.130432 ignition[871]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jul 9 12:59:30.130933 ignition[871]: kargs: kargs passed Jul 9 12:59:30.130965 ignition[871]: Ignition finished successfully Jul 9 12:59:30.132335 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jul 9 12:59:30.133008 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jul 9 12:59:30.153123 ignition[879]: Ignition 2.21.0 Jul 9 12:59:30.153439 ignition[879]: Stage: disks Jul 9 12:59:30.153645 ignition[879]: no configs at "/usr/lib/ignition/base.d" Jul 9 12:59:30.153780 ignition[879]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jul 9 12:59:30.154425 ignition[879]: disks: disks passed Jul 9 12:59:30.154568 ignition[879]: Ignition finished successfully Jul 9 12:59:30.155517 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jul 9 12:59:30.155739 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jul 9 12:59:30.155856 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jul 9 12:59:30.156051 systemd[1]: Reached target local-fs.target - Local File Systems. Jul 9 12:59:30.156248 systemd[1]: Reached target sysinit.target - System Initialization. Jul 9 12:59:30.156433 systemd[1]: Reached target basic.target - Basic System. Jul 9 12:59:30.157114 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jul 9 12:59:30.173403 systemd-fsck[888]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Jul 9 12:59:30.174245 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jul 9 12:59:30.175109 systemd[1]: Mounting sysroot.mount - /sysroot... Jul 9 12:59:30.259106 systemd[1]: Mounted sysroot.mount - /sysroot. Jul 9 12:59:30.259343 kernel: EXT4-fs (sda9): mounted filesystem b08a603c-44fa-43af-af80-90bed9b8770a r/w with ordered data mode. Quota mode: none. Jul 9 12:59:30.259466 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jul 9 12:59:30.260402 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jul 9 12:59:30.261049 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... 
Jul 9 12:59:30.262527 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jul 9 12:59:30.262742 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jul 9 12:59:30.262966 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jul 9 12:59:30.267335 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jul 9 12:59:30.268085 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Jul 9 12:59:30.274379 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (896) Jul 9 12:59:30.278000 kernel: BTRFS info (device sda6): first mount of filesystem 87056a6c-ee99-487a-9330-f1335025b841 Jul 9 12:59:30.278017 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jul 9 12:59:30.278025 kernel: BTRFS info (device sda6): using free-space-tree Jul 9 12:59:30.281098 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jul 9 12:59:30.302874 initrd-setup-root[920]: cut: /sysroot/etc/passwd: No such file or directory Jul 9 12:59:30.305411 initrd-setup-root[927]: cut: /sysroot/etc/group: No such file or directory Jul 9 12:59:30.308150 initrd-setup-root[934]: cut: /sysroot/etc/shadow: No such file or directory Jul 9 12:59:30.310118 initrd-setup-root[941]: cut: /sysroot/etc/gshadow: No such file or directory Jul 9 12:59:30.363813 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jul 9 12:59:30.364641 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jul 9 12:59:30.365406 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jul 9 12:59:30.383329 kernel: BTRFS info (device sda6): last unmount of filesystem 87056a6c-ee99-487a-9330-f1335025b841 Jul 9 12:59:30.396701 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Jul 9 12:59:30.399999 ignition[1009]: INFO : Ignition 2.21.0 Jul 9 12:59:30.399999 ignition[1009]: INFO : Stage: mount Jul 9 12:59:30.400281 ignition[1009]: INFO : no configs at "/usr/lib/ignition/base.d" Jul 9 12:59:30.400281 ignition[1009]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jul 9 12:59:30.400632 ignition[1009]: INFO : mount: mount passed Jul 9 12:59:30.401090 ignition[1009]: INFO : Ignition finished successfully Jul 9 12:59:30.401282 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jul 9 12:59:30.402066 systemd[1]: Starting ignition-files.service - Ignition (files)... Jul 9 12:59:30.879272 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jul 9 12:59:30.880525 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jul 9 12:59:30.906147 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1020) Jul 9 12:59:30.906181 kernel: BTRFS info (device sda6): first mount of filesystem 87056a6c-ee99-487a-9330-f1335025b841 Jul 9 12:59:30.906189 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jul 9 12:59:30.907748 kernel: BTRFS info (device sda6): using free-space-tree Jul 9 12:59:30.910796 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Jul 9 12:59:30.924928 ignition[1037]: INFO : Ignition 2.21.0 Jul 9 12:59:30.924928 ignition[1037]: INFO : Stage: files Jul 9 12:59:30.925332 ignition[1037]: INFO : no configs at "/usr/lib/ignition/base.d" Jul 9 12:59:30.925332 ignition[1037]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jul 9 12:59:30.925596 ignition[1037]: DEBUG : files: compiled without relabeling support, skipping Jul 9 12:59:30.926251 ignition[1037]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jul 9 12:59:30.926251 ignition[1037]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jul 9 12:59:30.927773 ignition[1037]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jul 9 12:59:30.927923 ignition[1037]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jul 9 12:59:30.928106 ignition[1037]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jul 9 12:59:30.928096 unknown[1037]: wrote ssh authorized keys file for user: core Jul 9 12:59:30.929818 ignition[1037]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Jul 9 12:59:30.930078 ignition[1037]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1 Jul 9 12:59:30.968299 ignition[1037]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jul 9 12:59:31.420428 systemd-networkd[867]: ens192: Gained IPv6LL Jul 9 12:59:31.519770 ignition[1037]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Jul 9 12:59:31.520020 ignition[1037]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jul 9 12:59:31.520020 ignition[1037]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jul 9 12:59:31.520020 ignition[1037]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jul 9 12:59:31.520020 ignition[1037]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jul 9 12:59:31.520020 ignition[1037]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jul 9 12:59:31.520767 ignition[1037]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jul 9 12:59:31.520767 ignition[1037]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jul 9 12:59:31.520767 ignition[1037]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jul 9 12:59:31.521208 ignition[1037]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jul 9 12:59:31.521360 ignition[1037]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jul 9 12:59:31.521360 ignition[1037]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jul 9 12:59:31.523565 ignition[1037]: INFO : files: 
createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jul 9 12:59:31.523787 ignition[1037]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jul 9 12:59:31.523787 ignition[1037]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1 Jul 9 12:59:32.329706 ignition[1037]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jul 9 12:59:33.160217 ignition[1037]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jul 9 12:59:33.160533 ignition[1037]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network" Jul 9 12:59:33.160953 ignition[1037]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network" Jul 9 12:59:33.161137 ignition[1037]: INFO : files: op(c): [started] processing unit "prepare-helm.service" Jul 9 12:59:33.161802 ignition[1037]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jul 9 12:59:33.162155 ignition[1037]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jul 9 12:59:33.162155 ignition[1037]: INFO : files: op(c): [finished] processing unit "prepare-helm.service" Jul 9 12:59:33.162470 ignition[1037]: INFO : files: op(e): [started] processing unit "coreos-metadata.service" Jul 9 12:59:33.162470 ignition[1037]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Jul 9 12:59:33.162470 ignition[1037]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Jul 9 12:59:33.162470 ignition[1037]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service" Jul 9 12:59:33.162470 ignition[1037]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service" Jul 9 12:59:33.192985 ignition[1037]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service" Jul 9 12:59:33.195312 ignition[1037]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service" Jul 9 12:59:33.195524 ignition[1037]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service" Jul 9 12:59:33.195524 ignition[1037]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service" Jul 9 12:59:33.195524 ignition[1037]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service" Jul 9 12:59:33.195524 ignition[1037]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json" Jul 9 12:59:33.196756 ignition[1037]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json" Jul 9 12:59:33.196756 ignition[1037]: INFO : files: files passed Jul 9 12:59:33.196756 ignition[1037]: INFO : Ignition finished successfully Jul 9 12:59:33.196588 systemd[1]: 
Finished ignition-files.service - Ignition (files). Jul 9 12:59:33.198390 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jul 9 12:59:33.199059 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jul 9 12:59:33.212740 systemd[1]: ignition-quench.service: Deactivated successfully. Jul 9 12:59:33.212800 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jul 9 12:59:33.216191 initrd-setup-root-after-ignition[1069]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jul 9 12:59:33.216191 initrd-setup-root-after-ignition[1069]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jul 9 12:59:33.217383 initrd-setup-root-after-ignition[1073]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jul 9 12:59:33.218786 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jul 9 12:59:33.219207 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jul 9 12:59:33.220105 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jul 9 12:59:33.259773 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jul 9 12:59:33.259863 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jul 9 12:59:33.260158 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jul 9 12:59:33.260292 systemd[1]: Reached target initrd.target - Initrd Default Target. Jul 9 12:59:33.260662 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jul 9 12:59:33.261218 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jul 9 12:59:33.270807 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jul 9 12:59:33.272057 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jul 9 12:59:33.283575 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jul 9 12:59:33.283965 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jul 9 12:59:33.284282 systemd[1]: Stopped target timers.target - Timer Units. Jul 9 12:59:33.284440 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jul 9 12:59:33.284523 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jul 9 12:59:33.284773 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jul 9 12:59:33.284918 systemd[1]: Stopped target basic.target - Basic System. Jul 9 12:59:33.285068 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jul 9 12:59:33.285217 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jul 9 12:59:33.286212 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jul 9 12:59:33.286377 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Jul 9 12:59:33.286779 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jul 9 12:59:33.287052 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jul 9 12:59:33.287406 systemd[1]: Stopped target sysinit.target - System Initialization. Jul 9 12:59:33.287718 systemd[1]: Stopped target local-fs.target - Local File Systems. Jul 9 12:59:33.288013 systemd[1]: Stopped target swap.target - Swaps. 
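The prepare-helm.service unit written and preset to enabled during the Ignition files stage above is started again later in this boot ("Unpack helm to /opt/bin"), but its contents are never reproduced in the log. A plausible minimal sketch, assuming the unit only unpacks the helm tarball that Ignition fetched to /opt, might look like the following (paths, options, and the install target are illustrative assumptions, not recovered from this log):

    # /etc/systemd/system/prepare-helm.service -- hypothetical sketch, not the real unit
    [Unit]
    Description=Unpack helm to /opt/bin
    ConditionPathExists=/opt/helm-v3.17.0-linux-amd64.tar.gz

    [Service]
    Type=oneshot
    RemainAfterExit=yes
    # Assumption: create the target directory, then extract only the helm binary.
    ExecStart=/usr/bin/mkdir -p /opt/bin
    ExecStart=/usr/bin/tar -C /opt/bin --strip-components=1 -xzf /opt/helm-v3.17.0-linux-amd64.tar.gz linux-amd64/helm

    [Install]
    WantedBy=multi-user.target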
Jul 9 12:59:33.288258 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jul 9 12:59:33.288471 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jul 9 12:59:33.288865 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jul 9 12:59:33.289158 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jul 9 12:59:33.289433 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jul 9 12:59:33.289617 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jul 9 12:59:33.289922 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jul 9 12:59:33.289992 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jul 9 12:59:33.290466 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jul 9 12:59:33.290543 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jul 9 12:59:33.291058 systemd[1]: Stopped target paths.target - Path Units. Jul 9 12:59:33.291327 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jul 9 12:59:33.291560 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jul 9 12:59:33.291919 systemd[1]: Stopped target slices.target - Slice Units. Jul 9 12:59:33.292231 systemd[1]: Stopped target sockets.target - Socket Units. Jul 9 12:59:33.292545 systemd[1]: iscsid.socket: Deactivated successfully. Jul 9 12:59:33.292624 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jul 9 12:59:33.293097 systemd[1]: iscsiuio.socket: Deactivated successfully. Jul 9 12:59:33.293175 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jul 9 12:59:33.293667 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jul 9 12:59:33.293767 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jul 9 12:59:33.294299 systemd[1]: ignition-files.service: Deactivated successfully. Jul 9 12:59:33.294392 systemd[1]: Stopped ignition-files.service - Ignition (files). Jul 9 12:59:33.295534 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jul 9 12:59:33.297485 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jul 9 12:59:33.297893 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jul 9 12:59:33.298158 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jul 9 12:59:33.298589 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jul 9 12:59:33.298812 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jul 9 12:59:33.304082 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jul 9 12:59:33.304669 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jul 9 12:59:33.313847 ignition[1093]: INFO : Ignition 2.21.0 Jul 9 12:59:33.313847 ignition[1093]: INFO : Stage: umount Jul 9 12:59:33.314258 ignition[1093]: INFO : no configs at "/usr/lib/ignition/base.d" Jul 9 12:59:33.314258 ignition[1093]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jul 9 12:59:33.316248 ignition[1093]: INFO : umount: umount passed Jul 9 12:59:33.316248 ignition[1093]: INFO : Ignition finished successfully Jul 9 12:59:33.317173 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jul 9 12:59:33.317581 systemd[1]: ignition-mount.service: Deactivated successfully. 
Jul 9 12:59:33.318506 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jul 9 12:59:33.318946 systemd[1]: Stopped target network.target - Network. Jul 9 12:59:33.319070 systemd[1]: ignition-disks.service: Deactivated successfully. Jul 9 12:59:33.319107 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jul 9 12:59:33.319244 systemd[1]: ignition-kargs.service: Deactivated successfully. Jul 9 12:59:33.319269 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jul 9 12:59:33.319438 systemd[1]: ignition-setup.service: Deactivated successfully. Jul 9 12:59:33.319461 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jul 9 12:59:33.319607 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jul 9 12:59:33.319629 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jul 9 12:59:33.319908 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jul 9 12:59:33.320232 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jul 9 12:59:33.325654 systemd[1]: systemd-resolved.service: Deactivated successfully. Jul 9 12:59:33.325746 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jul 9 12:59:33.327283 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Jul 9 12:59:33.328278 systemd[1]: systemd-networkd.service: Deactivated successfully. Jul 9 12:59:33.328400 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jul 9 12:59:33.329163 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Jul 9 12:59:33.329461 systemd[1]: Stopped target network-pre.target - Preparation for Network. Jul 9 12:59:33.329617 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jul 9 12:59:33.329638 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jul 9 12:59:33.330258 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jul 9 12:59:33.330498 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jul 9 12:59:33.330526 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jul 9 12:59:33.330659 systemd[1]: afterburn-network-kargs.service: Deactivated successfully. Jul 9 12:59:33.330682 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Jul 9 12:59:33.330807 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jul 9 12:59:33.330827 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jul 9 12:59:33.332130 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jul 9 12:59:33.332156 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jul 9 12:59:33.332716 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jul 9 12:59:33.332741 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jul 9 12:59:33.333222 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jul 9 12:59:33.334155 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Jul 9 12:59:33.334191 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Jul 9 12:59:33.341236 systemd[1]: network-cleanup.service: Deactivated successfully. Jul 9 12:59:33.341570 systemd[1]: Stopped network-cleanup.service - Network Cleanup. 
Jul 9 12:59:33.350598 systemd[1]: systemd-udevd.service: Deactivated successfully. Jul 9 12:59:33.350703 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jul 9 12:59:33.351114 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jul 9 12:59:33.351149 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jul 9 12:59:33.351281 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jul 9 12:59:33.351297 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jul 9 12:59:33.351456 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jul 9 12:59:33.351481 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jul 9 12:59:33.351758 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jul 9 12:59:33.351785 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jul 9 12:59:33.352126 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jul 9 12:59:33.352152 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jul 9 12:59:33.353377 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jul 9 12:59:33.353493 systemd[1]: systemd-network-generator.service: Deactivated successfully. Jul 9 12:59:33.353522 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Jul 9 12:59:33.354359 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jul 9 12:59:33.354386 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jul 9 12:59:33.355370 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Jul 9 12:59:33.355399 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jul 9 12:59:33.355847 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jul 9 12:59:33.355873 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jul 9 12:59:33.356252 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jul 9 12:59:33.356427 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jul 9 12:59:33.357514 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Jul 9 12:59:33.357685 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully. Jul 9 12:59:33.357707 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Jul 9 12:59:33.357728 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Jul 9 12:59:33.362589 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jul 9 12:59:33.362645 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jul 9 12:59:33.536901 systemd[1]: sysroot-boot.service: Deactivated successfully. Jul 9 12:59:33.536977 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jul 9 12:59:33.537408 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jul 9 12:59:33.537687 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jul 9 12:59:33.537727 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jul 9 12:59:33.538491 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jul 9 12:59:33.553524 systemd[1]: Switching root. 
Jul 9 12:59:33.587530 systemd-journald[243]: Journal stopped Jul 9 12:59:35.672110 systemd-journald[243]: Received SIGTERM from PID 1 (systemd). Jul 9 12:59:35.672129 kernel: SELinux: policy capability network_peer_controls=1 Jul 9 12:59:35.672137 kernel: SELinux: policy capability open_perms=1 Jul 9 12:59:35.672143 kernel: SELinux: policy capability extended_socket_class=1 Jul 9 12:59:35.672147 kernel: SELinux: policy capability always_check_network=0 Jul 9 12:59:35.672154 kernel: SELinux: policy capability cgroup_seclabel=1 Jul 9 12:59:35.672160 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jul 9 12:59:35.672166 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jul 9 12:59:35.672171 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jul 9 12:59:35.672176 kernel: SELinux: policy capability userspace_initial_context=0 Jul 9 12:59:35.672182 systemd[1]: Successfully loaded SELinux policy in 105.077ms. Jul 9 12:59:35.672189 kernel: audit: type=1403 audit(1752065974.584:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Jul 9 12:59:35.672196 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 3.778ms. Jul 9 12:59:35.672203 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jul 9 12:59:35.672209 systemd[1]: Detected virtualization vmware. Jul 9 12:59:35.672216 systemd[1]: Detected architecture x86-64. Jul 9 12:59:35.672223 systemd[1]: Detected first boot. Jul 9 12:59:35.672229 systemd[1]: Initializing machine ID from random generator. Jul 9 12:59:35.672236 zram_generator::config[1136]: No configuration found. Jul 9 12:59:35.674075 kernel: vmw_vmci 0000:00:07.7: Using capabilities 0xc Jul 9 12:59:35.674092 kernel: Guest personality initialized and is active Jul 9 12:59:35.674099 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Jul 9 12:59:35.674105 kernel: Initialized host personality Jul 9 12:59:35.674114 kernel: NET: Registered PF_VSOCK protocol family Jul 9 12:59:35.674121 systemd[1]: Populated /etc with preset unit settings. Jul 9 12:59:35.674129 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Jul 9 12:59:35.674136 systemd[1]: COREOS_CUSTOM_PUBLIC_IPV4=$(ip addr show ens192 | grep -v "inet 10." | grep -Po "inet \K[\d.]+")" > ${OUTPUT}" Jul 9 12:59:35.674144 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Jul 9 12:59:35.674150 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jul 9 12:59:35.674156 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jul 9 12:59:35.674164 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jul 9 12:59:35.674171 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jul 9 12:59:35.674178 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jul 9 12:59:35.674184 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jul 9 12:59:35.674191 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. 
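The "Ignoring unknown escape sequences" notices above come from systemd's unit-file parser: backslash sequences such as \K and \d inside the quoted ExecStart shell snippet of coreos-metadata.service are not recognized unit-file escapes, so the parser warns about them while the unit otherwise loads. A hypothetical one-line reproduction (not the actual unit, which is only partially quoted in the log) is:

    [Service]
    # \K and \d are not unit-file escape sequences, so systemd logs
    # "Ignoring unknown escape sequences" for this line; doubling the
    # backslashes (\\K, \\d) or moving the pipeline into a separate
    # script file is the usual way to silence the warning.
    ExecStart=/usr/bin/sh -c "ip addr show ens192 | grep -Po 'inet \K[\d.]+'"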
Jul 9 12:59:35.674198 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jul 9 12:59:35.674205 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jul 9 12:59:35.674213 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jul 9 12:59:35.674221 systemd[1]: Created slice user.slice - User and Session Slice. Jul 9 12:59:35.674227 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jul 9 12:59:35.674236 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jul 9 12:59:35.674243 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jul 9 12:59:35.674249 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Jul 9 12:59:35.674256 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jul 9 12:59:35.674263 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jul 9 12:59:35.674271 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Jul 9 12:59:35.674278 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jul 9 12:59:35.674284 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jul 9 12:59:35.674291 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jul 9 12:59:35.674298 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jul 9 12:59:35.674314 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jul 9 12:59:35.674321 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jul 9 12:59:35.674328 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jul 9 12:59:35.674336 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jul 9 12:59:35.674343 systemd[1]: Reached target slices.target - Slice Units. Jul 9 12:59:35.674350 systemd[1]: Reached target swap.target - Swaps. Jul 9 12:59:35.674356 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jul 9 12:59:35.674364 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jul 9 12:59:35.674372 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Jul 9 12:59:35.674379 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jul 9 12:59:35.674385 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jul 9 12:59:35.674392 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jul 9 12:59:35.674399 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jul 9 12:59:35.674406 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jul 9 12:59:35.674413 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jul 9 12:59:35.674420 systemd[1]: Mounting media.mount - External Media Directory... Jul 9 12:59:35.674428 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jul 9 12:59:35.674435 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jul 9 12:59:35.674441 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... 
Jul 9 12:59:35.674448 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Jul 9 12:59:35.674455 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jul 9 12:59:35.674462 systemd[1]: Reached target machines.target - Containers. Jul 9 12:59:35.674469 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jul 9 12:59:35.674476 systemd[1]: Starting ignition-delete-config.service - Ignition (delete config)... Jul 9 12:59:35.674484 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jul 9 12:59:35.674491 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jul 9 12:59:35.674497 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jul 9 12:59:35.674504 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jul 9 12:59:35.674511 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jul 9 12:59:35.674518 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jul 9 12:59:35.674525 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jul 9 12:59:35.674532 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jul 9 12:59:35.674540 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jul 9 12:59:35.674547 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jul 9 12:59:35.674554 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jul 9 12:59:35.674561 systemd[1]: Stopped systemd-fsck-usr.service. Jul 9 12:59:35.674568 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jul 9 12:59:35.674575 systemd[1]: Starting systemd-journald.service - Journal Service... Jul 9 12:59:35.674583 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jul 9 12:59:35.674590 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jul 9 12:59:35.674598 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jul 9 12:59:35.674605 kernel: loop: module loaded Jul 9 12:59:35.674611 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Jul 9 12:59:35.674618 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jul 9 12:59:35.674624 systemd[1]: verity-setup.service: Deactivated successfully. Jul 9 12:59:35.674631 systemd[1]: Stopped verity-setup.service. Jul 9 12:59:35.674638 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jul 9 12:59:35.674645 kernel: fuse: init (API version 7.41) Jul 9 12:59:35.674651 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jul 9 12:59:35.674659 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jul 9 12:59:35.674666 systemd[1]: Mounted media.mount - External Media Directory. Jul 9 12:59:35.674673 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. 
Jul 9 12:59:35.674680 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jul 9 12:59:35.674687 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jul 9 12:59:35.674694 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jul 9 12:59:35.674702 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jul 9 12:59:35.674723 systemd-journald[1219]: Collecting audit messages is disabled. Jul 9 12:59:35.674741 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jul 9 12:59:35.674749 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jul 9 12:59:35.674756 systemd-journald[1219]: Journal started Jul 9 12:59:35.674776 systemd-journald[1219]: Runtime Journal (/run/log/journal/4ef3cba2c2db49b2af50eafac25f6064) is 4.8M, max 38.8M, 34M free. Jul 9 12:59:35.495779 systemd[1]: Queued start job for default target multi-user.target. Jul 9 12:59:35.508608 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Jul 9 12:59:35.508862 systemd[1]: systemd-journald.service: Deactivated successfully. Jul 9 12:59:35.675266 jq[1206]: true Jul 9 12:59:35.675385 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jul 9 12:59:35.676822 systemd[1]: Started systemd-journald.service - Journal Service. Jul 9 12:59:35.677893 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jul 9 12:59:35.678414 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jul 9 12:59:35.678662 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jul 9 12:59:35.678765 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jul 9 12:59:35.678965 jq[1234]: true Jul 9 12:59:35.678993 systemd[1]: modprobe@loop.service: Deactivated successfully. Jul 9 12:59:35.679091 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jul 9 12:59:35.679466 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jul 9 12:59:35.679745 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jul 9 12:59:35.680007 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jul 9 12:59:35.689233 systemd[1]: Reached target network-pre.target - Preparation for Network. Jul 9 12:59:35.694685 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jul 9 12:59:35.699556 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jul 9 12:59:35.700558 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jul 9 12:59:35.700582 systemd[1]: Reached target local-fs.target - Local File Systems. Jul 9 12:59:35.701288 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Jul 9 12:59:35.705370 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jul 9 12:59:35.705792 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jul 9 12:59:35.708526 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jul 9 12:59:35.710411 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jul 9 12:59:35.710583 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). 
Jul 9 12:59:35.716263 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jul 9 12:59:35.716460 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jul 9 12:59:35.728495 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jul 9 12:59:35.735316 systemd-journald[1219]: Time spent on flushing to /var/log/journal/4ef3cba2c2db49b2af50eafac25f6064 is 37.134ms for 1752 entries. Jul 9 12:59:35.735316 systemd-journald[1219]: System Journal (/var/log/journal/4ef3cba2c2db49b2af50eafac25f6064) is 8M, max 584.8M, 576.8M free. Jul 9 12:59:35.814639 systemd-journald[1219]: Received client request to flush runtime journal. Jul 9 12:59:35.736899 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jul 9 12:59:35.777046 ignition[1240]: Ignition 2.21.0 Jul 9 12:59:35.740502 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jul 9 12:59:35.777262 ignition[1240]: deleting config from guestinfo properties Jul 9 12:59:35.742386 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Jul 9 12:59:35.743556 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jul 9 12:59:35.743733 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jul 9 12:59:35.744030 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jul 9 12:59:35.747018 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jul 9 12:59:35.755462 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Jul 9 12:59:35.758376 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jul 9 12:59:35.818165 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jul 9 12:59:35.825106 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jul 9 12:59:35.836697 ignition[1240]: Successfully deleted config Jul 9 12:59:35.837328 kernel: loop0: detected capacity change from 0 to 114008 Jul 9 12:59:35.839151 systemd[1]: Finished ignition-delete-config.service - Ignition (delete config). Jul 9 12:59:35.840347 kernel: ACPI: bus type drm_connector registered Jul 9 12:59:35.842156 systemd[1]: modprobe@drm.service: Deactivated successfully. Jul 9 12:59:35.842797 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jul 9 12:59:35.849361 systemd-tmpfiles[1284]: ACLs are not supported, ignoring. Jul 9 12:59:35.849374 systemd-tmpfiles[1284]: ACLs are not supported, ignoring. Jul 9 12:59:35.853359 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Jul 9 12:59:35.856500 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jul 9 12:59:35.859531 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jul 9 12:59:35.890909 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jul 9 12:59:35.995334 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jul 9 12:59:36.006558 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jul 9 12:59:36.008398 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... 
Jul 9 12:59:36.028355 kernel: loop1: detected capacity change from 0 to 146480 Jul 9 12:59:36.027926 systemd-tmpfiles[1307]: ACLs are not supported, ignoring. Jul 9 12:59:36.027938 systemd-tmpfiles[1307]: ACLs are not supported, ignoring. Jul 9 12:59:36.034007 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jul 9 12:59:36.124379 kernel: loop2: detected capacity change from 0 to 2960 Jul 9 12:59:36.149329 kernel: loop3: detected capacity change from 0 to 224512 Jul 9 12:59:36.249401 kernel: loop4: detected capacity change from 0 to 114008 Jul 9 12:59:36.265329 kernel: loop5: detected capacity change from 0 to 146480 Jul 9 12:59:36.352327 kernel: loop6: detected capacity change from 0 to 2960 Jul 9 12:59:36.378544 kernel: loop7: detected capacity change from 0 to 224512 Jul 9 12:59:36.414844 (sd-merge)[1314]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-vmware'. Jul 9 12:59:36.415288 (sd-merge)[1314]: Merged extensions into '/usr'. Jul 9 12:59:36.420516 systemd[1]: Reload requested from client PID 1283 ('systemd-sysext') (unit systemd-sysext.service)... Jul 9 12:59:36.420630 systemd[1]: Reloading... Jul 9 12:59:36.458324 zram_generator::config[1336]: No configuration found. Jul 9 12:59:36.555139 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jul 9 12:59:36.564557 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Jul 9 12:59:36.609404 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jul 9 12:59:36.609844 systemd[1]: Reloading finished in 188 ms. Jul 9 12:59:36.626604 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jul 9 12:59:36.627013 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jul 9 12:59:36.635469 systemd[1]: Starting ensure-sysext.service... Jul 9 12:59:36.636537 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jul 9 12:59:36.637989 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jul 9 12:59:36.655625 systemd[1]: Reload requested from client PID 1396 ('systemctl') (unit ensure-sysext.service)... Jul 9 12:59:36.655635 systemd[1]: Reloading... Jul 9 12:59:36.659185 systemd-tmpfiles[1397]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Jul 9 12:59:36.659211 systemd-tmpfiles[1397]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Jul 9 12:59:36.659405 systemd-tmpfiles[1397]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jul 9 12:59:36.659562 systemd-tmpfiles[1397]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Jul 9 12:59:36.660047 systemd-tmpfiles[1397]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Jul 9 12:59:36.660210 systemd-tmpfiles[1397]: ACLs are not supported, ignoring. Jul 9 12:59:36.660244 systemd-tmpfiles[1397]: ACLs are not supported, ignoring. Jul 9 12:59:36.670618 systemd-tmpfiles[1397]: Detected autofs mount point /boot during canonicalization of boot. 
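The "Duplicate line for path ... ignoring" notices above are emitted by systemd-tmpfiles when the same path is declared by more than one line across the tmpfiles.d fragments; the later declaration is skipped and the boot is unaffected. A minimal hypothetical illustration (these fragment names, modes, and owners are made up, not the real nfs-utils.conf or provision.conf contents):

    # /usr/lib/tmpfiles.d/example-a.conf -- hypothetical fragment
    d /var/lib/nfs/sm 0700 root root -
    # /usr/lib/tmpfiles.d/example-b.conf -- hypothetical fragment declaring the
    # same path again; systemd-tmpfiles logs "Duplicate line for path" and
    # ignores this second entry.
    d /var/lib/nfs/sm 0700 root root -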
Jul 9 12:59:36.671178 systemd-udevd[1398]: Using default interface naming scheme 'v255'. Jul 9 12:59:36.671577 systemd-tmpfiles[1397]: Skipping /boot Jul 9 12:59:36.675289 systemd-tmpfiles[1397]: Detected autofs mount point /boot during canonicalization of boot. Jul 9 12:59:36.675297 systemd-tmpfiles[1397]: Skipping /boot Jul 9 12:59:36.694329 zram_generator::config[1423]: No configuration found. Jul 9 12:59:36.764572 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jul 9 12:59:36.772474 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Jul 9 12:59:36.817686 systemd[1]: Reloading finished in 161 ms. Jul 9 12:59:36.825014 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jul 9 12:59:36.825380 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jul 9 12:59:36.836282 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jul 9 12:59:36.842971 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jul 9 12:59:36.847238 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jul 9 12:59:36.849149 ldconfig[1270]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jul 9 12:59:36.856144 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jul 9 12:59:36.862434 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jul 9 12:59:36.866428 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jul 9 12:59:36.866916 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jul 9 12:59:36.876676 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jul 9 12:59:36.879126 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jul 9 12:59:36.881018 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jul 9 12:59:36.883419 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jul 9 12:59:36.883591 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jul 9 12:59:36.883660 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jul 9 12:59:36.883723 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jul 9 12:59:36.886677 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jul 9 12:59:36.886990 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jul 9 12:59:36.887633 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jul 9 12:59:36.887889 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jul 9 12:59:36.891784 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). 
Jul 9 12:59:36.895119 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jul 9 12:59:36.900122 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jul 9 12:59:36.900557 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jul 9 12:59:36.900639 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jul 9 12:59:36.904517 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jul 9 12:59:36.904646 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jul 9 12:59:36.905263 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jul 9 12:59:36.913908 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jul 9 12:59:36.919202 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jul 9 12:59:36.919474 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jul 9 12:59:36.919547 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jul 9 12:59:36.919896 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jul 9 12:59:36.924155 systemd[1]: Finished ensure-sysext.service. Jul 9 12:59:36.924723 systemd[1]: modprobe@loop.service: Deactivated successfully. Jul 9 12:59:36.925087 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jul 9 12:59:36.929187 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Jul 9 12:59:36.938000 systemd[1]: modprobe@drm.service: Deactivated successfully. Jul 9 12:59:36.938125 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jul 9 12:59:36.941349 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jul 9 12:59:36.941961 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jul 9 12:59:36.942700 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jul 9 12:59:36.943097 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jul 9 12:59:36.943652 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jul 9 12:59:36.945097 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jul 9 12:59:36.945135 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jul 9 12:59:36.947219 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jul 9 12:59:36.953110 augenrules[1551]: No rules Jul 9 12:59:36.953489 systemd[1]: audit-rules.service: Deactivated successfully. Jul 9 12:59:36.953785 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jul 9 12:59:36.964642 systemd[1]: Finished systemd-update-done.service - Update is Completed. 
Jul 9 12:59:36.978260 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jul 9 12:59:36.979672 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jul 9 12:59:36.988989 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Jul 9 12:59:37.009526 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jul 9 12:59:37.043082 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4 Jul 9 12:59:37.043132 kernel: mousedev: PS/2 mouse device common for all mice Jul 9 12:59:37.054320 kernel: ACPI: button: Power Button [PWRF] Jul 9 12:59:37.099825 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Jul 9 12:59:37.104422 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jul 9 12:59:37.107225 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Jul 9 12:59:37.107402 systemd[1]: Reached target time-set.target - System Time Set. Jul 9 12:59:37.111790 systemd-networkd[1509]: lo: Link UP Jul 9 12:59:37.112525 systemd-networkd[1509]: lo: Gained carrier Jul 9 12:59:37.113867 systemd-networkd[1509]: Enumeration completed Jul 9 12:59:37.113975 systemd[1]: Started systemd-networkd.service - Network Configuration. Jul 9 12:59:37.118915 systemd-networkd[1509]: ens192: Configuring with /etc/systemd/network/00-vmware.network. Jul 9 12:59:37.121533 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Jul 9 12:59:37.121679 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Jul 9 12:59:37.121448 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Jul 9 12:59:37.122465 systemd-networkd[1509]: ens192: Link UP Jul 9 12:59:37.122729 systemd-networkd[1509]: ens192: Gained carrier Jul 9 12:59:37.124461 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jul 9 12:59:37.129363 systemd-timesyncd[1541]: Network configuration changed, trying to establish connection. Jul 9 12:59:37.140770 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jul 9 12:59:37.144541 systemd-resolved[1511]: Positive Trust Anchors: Jul 9 12:59:37.144947 systemd-resolved[1511]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jul 9 12:59:37.145015 systemd-resolved[1511]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jul 9 12:59:37.148690 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Jul 9 12:59:37.149410 systemd-resolved[1511]: Defaulting to hostname 'linux'. Jul 9 12:59:37.151008 systemd[1]: Started systemd-resolved.service - Network Name Resolution. 
Jul 9 12:59:37.151177 systemd[1]: Reached target network.target - Network. Jul 9 12:59:37.151456 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jul 9 12:59:37.151636 systemd[1]: Reached target sysinit.target - System Initialization. Jul 9 12:59:37.151868 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jul 9 12:59:37.152029 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jul 9 12:59:37.152197 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Jul 9 12:59:37.152462 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jul 9 12:59:37.152717 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jul 9 12:59:37.152868 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jul 9 12:59:37.153136 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jul 9 12:59:37.153162 systemd[1]: Reached target paths.target - Path Units. Jul 9 12:59:37.153261 systemd[1]: Reached target timers.target - Timer Units. Jul 9 12:59:37.154091 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jul 9 12:59:37.155815 systemd[1]: Starting docker.socket - Docker Socket for the API... Jul 9 12:59:37.157445 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Jul 9 12:59:37.157840 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Jul 9 12:59:37.158059 systemd[1]: Reached target ssh-access.target - SSH Access Available. Jul 9 12:59:37.160750 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jul 9 12:59:37.161361 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Jul 9 12:59:37.161872 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jul 9 12:59:37.162559 systemd[1]: Reached target sockets.target - Socket Units. Jul 9 12:59:37.162759 systemd[1]: Reached target basic.target - Basic System. Jul 9 12:59:37.163007 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jul 9 12:59:37.163026 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jul 9 12:59:37.163934 systemd[1]: Starting containerd.service - containerd container runtime... Jul 9 12:59:37.165412 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jul 9 12:59:37.167031 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jul 9 12:59:37.169644 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jul 9 12:59:37.174007 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jul 9 12:59:37.174137 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jul 9 12:59:37.175334 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Jul 9 12:59:37.177371 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jul 9 12:59:37.179408 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... 
Jul 9 12:59:37.181867 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jul 9 12:59:37.183475 jq[1591]: false Jul 9 12:59:37.184454 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jul 9 12:59:37.190458 systemd[1]: Starting systemd-logind.service - User Login Management... Jul 9 12:59:37.191059 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jul 9 12:59:37.191570 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jul 9 12:59:37.195593 systemd[1]: Starting update-engine.service - Update Engine... Jul 9 12:59:37.197613 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jul 9 12:59:37.199546 systemd[1]: Starting vgauthd.service - VGAuth Service for open-vm-tools... Jul 9 12:59:37.202344 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jul 9 12:59:37.202679 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jul 9 12:59:37.202806 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jul 9 12:59:37.204311 jq[1603]: true Jul 9 12:59:37.216314 extend-filesystems[1592]: Found /dev/sda6 Jul 9 12:59:37.215816 oslogin_cache_refresh[1593]: Refreshing passwd entry cache Jul 9 12:59:37.216688 google_oslogin_nss_cache[1593]: oslogin_cache_refresh[1593]: Refreshing passwd entry cache Jul 9 12:59:37.221013 update_engine[1600]: I20250709 12:59:37.219290 1600 main.cc:92] Flatcar Update Engine starting Jul 9 12:59:37.222352 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jul 9 12:59:37.226522 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jul 9 12:59:37.229864 extend-filesystems[1592]: Found /dev/sda9 Jul 9 12:59:37.230794 extend-filesystems[1592]: Checking size of /dev/sda9 Jul 9 12:59:37.235626 google_oslogin_nss_cache[1593]: oslogin_cache_refresh[1593]: Failure getting users, quitting Jul 9 12:59:37.235620 oslogin_cache_refresh[1593]: Failure getting users, quitting Jul 9 12:59:37.235782 google_oslogin_nss_cache[1593]: oslogin_cache_refresh[1593]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jul 9 12:59:37.235782 google_oslogin_nss_cache[1593]: oslogin_cache_refresh[1593]: Refreshing group entry cache Jul 9 12:59:37.235632 oslogin_cache_refresh[1593]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jul 9 12:59:37.235662 oslogin_cache_refresh[1593]: Refreshing group entry cache Jul 9 12:59:37.236438 systemd[1]: Started vgauthd.service - VGAuth Service for open-vm-tools. Jul 9 12:59:37.238013 extend-filesystems[1592]: Old size kept for /dev/sda9 Jul 9 12:59:37.241431 google_oslogin_nss_cache[1593]: oslogin_cache_refresh[1593]: Failure getting groups, quitting Jul 9 12:59:37.241431 google_oslogin_nss_cache[1593]: oslogin_cache_refresh[1593]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jul 9 12:59:37.241425 oslogin_cache_refresh[1593]: Failure getting groups, quitting Jul 9 12:59:37.241433 oslogin_cache_refresh[1593]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jul 9 12:59:37.242142 systemd[1]: extend-filesystems.service: Deactivated successfully. Jul 9 12:59:37.246523 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. 
Jul 9 12:59:37.247153 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Jul 9 12:59:37.247952 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Jul 9 12:59:37.253138 systemd[1]: Starting vmtoolsd.service - Service for virtual machines hosted on VMware... Jul 9 12:59:37.254548 jq[1606]: true Jul 9 12:59:37.264618 (ntainerd)[1624]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Jul 9 12:59:37.277546 tar[1607]: linux-amd64/LICENSE Jul 9 12:59:37.277546 tar[1607]: linux-amd64/helm Jul 9 12:59:37.280496 systemd[1]: motdgen.service: Deactivated successfully. Jul 9 12:59:37.280647 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jul 9 12:59:37.294384 systemd[1]: Started vmtoolsd.service - Service for virtual machines hosted on VMware. Jul 9 12:59:37.310810 unknown[1615]: Pref_Init: Using '/etc/vmware-tools/vgauth.conf' as preferences filepath Jul 9 12:59:37.314572 dbus-daemon[1589]: [system] SELinux support is enabled Jul 9 12:59:37.314667 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jul 9 12:59:37.316841 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jul 9 12:59:37.316866 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jul 9 12:59:37.317128 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jul 9 12:59:37.317138 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jul 9 12:59:37.318934 unknown[1615]: Core dump limit set to -1 Jul 9 12:59:37.335680 bash[1655]: Updated "/home/core/.ssh/authorized_keys" Jul 9 12:59:37.336233 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jul 9 12:59:37.337666 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Jul 9 12:59:37.340732 systemd[1]: Started update-engine.service - Update Engine. Jul 9 12:59:37.343380 update_engine[1600]: I20250709 12:59:37.340770 1600 update_check_scheduler.cc:74] Next update check in 5m57s Jul 9 12:59:37.373411 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jul 9 12:59:37.381916 systemd-logind[1599]: New seat seat0. Jul 9 12:59:37.383694 systemd[1]: Started systemd-logind.service - User Login Management. Jul 9 12:59:37.414313 kernel: piix4_smbus 0000:00:07.3: SMBus Host Controller not enabled! Jul 9 12:59:37.457841 locksmithd[1658]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jul 9 12:59:37.590137 containerd[1624]: time="2025-07-09T12:59:37Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jul 9 12:59:37.591148 containerd[1624]: time="2025-07-09T12:59:37.591131484Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Jul 9 12:59:37.595125 (udev-worker)[1487]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte. Jul 9 12:59:37.606844 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Jul 9 12:59:37.609160 containerd[1624]: time="2025-07-09T12:59:37.609136720Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="7.567µs" Jul 9 12:59:37.609203 containerd[1624]: time="2025-07-09T12:59:37.609158288Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jul 9 12:59:37.609203 containerd[1624]: time="2025-07-09T12:59:37.609173539Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jul 9 12:59:37.609274 containerd[1624]: time="2025-07-09T12:59:37.609262965Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jul 9 12:59:37.609292 containerd[1624]: time="2025-07-09T12:59:37.609275057Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jul 9 12:59:37.609344 containerd[1624]: time="2025-07-09T12:59:37.609290154Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jul 9 12:59:37.616753 systemd-logind[1599]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Jul 9 12:59:37.625220 containerd[1624]: time="2025-07-09T12:59:37.616296596Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jul 9 12:59:37.625220 containerd[1624]: time="2025-07-09T12:59:37.625193743Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jul 9 12:59:37.625391 containerd[1624]: time="2025-07-09T12:59:37.625374809Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jul 9 12:59:37.625391 containerd[1624]: time="2025-07-09T12:59:37.625388991Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jul 9 12:59:37.625439 containerd[1624]: time="2025-07-09T12:59:37.625397072Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jul 9 12:59:37.625439 containerd[1624]: time="2025-07-09T12:59:37.625402084Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jul 9 12:59:37.625470 containerd[1624]: time="2025-07-09T12:59:37.625444847Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jul 9 12:59:37.625585 containerd[1624]: time="2025-07-09T12:59:37.625573106Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jul 9 12:59:37.625604 containerd[1624]: time="2025-07-09T12:59:37.625591507Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jul 9 12:59:37.625604 containerd[1624]: time="2025-07-09T12:59:37.625597566Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jul 9 12:59:37.625641 containerd[1624]: time="2025-07-09T12:59:37.625614616Z" level=info msg="loading plugin" 
id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jul 9 12:59:37.625760 containerd[1624]: time="2025-07-09T12:59:37.625745670Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jul 9 12:59:37.625786 containerd[1624]: time="2025-07-09T12:59:37.625778648Z" level=info msg="metadata content store policy set" policy=shared Jul 9 12:59:37.630733 systemd-logind[1599]: Watching system buttons on /dev/input/event2 (Power Button) Jul 9 12:59:37.637498 containerd[1624]: time="2025-07-09T12:59:37.637471104Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jul 9 12:59:37.637549 containerd[1624]: time="2025-07-09T12:59:37.637511821Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jul 9 12:59:37.637549 containerd[1624]: time="2025-07-09T12:59:37.637522819Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jul 9 12:59:37.637549 containerd[1624]: time="2025-07-09T12:59:37.637531738Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jul 9 12:59:37.637549 containerd[1624]: time="2025-07-09T12:59:37.637539865Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jul 9 12:59:37.637549 containerd[1624]: time="2025-07-09T12:59:37.637545990Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jul 9 12:59:37.637628 containerd[1624]: time="2025-07-09T12:59:37.637554045Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jul 9 12:59:37.637628 containerd[1624]: time="2025-07-09T12:59:37.637577118Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jul 9 12:59:37.637628 containerd[1624]: time="2025-07-09T12:59:37.637584643Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jul 9 12:59:37.637628 containerd[1624]: time="2025-07-09T12:59:37.637589944Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jul 9 12:59:37.637628 containerd[1624]: time="2025-07-09T12:59:37.637594879Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jul 9 12:59:37.637628 containerd[1624]: time="2025-07-09T12:59:37.637602895Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jul 9 12:59:37.637705 containerd[1624]: time="2025-07-09T12:59:37.637668781Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jul 9 12:59:37.637705 containerd[1624]: time="2025-07-09T12:59:37.637680312Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jul 9 12:59:37.637705 containerd[1624]: time="2025-07-09T12:59:37.637688200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jul 9 12:59:37.637705 containerd[1624]: time="2025-07-09T12:59:37.637694017Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jul 9 12:59:37.637705 containerd[1624]: time="2025-07-09T12:59:37.637699706Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jul 9 12:59:37.637705 containerd[1624]: 
time="2025-07-09T12:59:37.637705124Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jul 9 12:59:37.637819 containerd[1624]: time="2025-07-09T12:59:37.637711726Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jul 9 12:59:37.637819 containerd[1624]: time="2025-07-09T12:59:37.637716850Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jul 9 12:59:37.637819 containerd[1624]: time="2025-07-09T12:59:37.637722776Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jul 9 12:59:37.637819 containerd[1624]: time="2025-07-09T12:59:37.637733350Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jul 9 12:59:37.637819 containerd[1624]: time="2025-07-09T12:59:37.637740089Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jul 9 12:59:37.637819 containerd[1624]: time="2025-07-09T12:59:37.637800903Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jul 9 12:59:37.637819 containerd[1624]: time="2025-07-09T12:59:37.637809943Z" level=info msg="Start snapshots syncer" Jul 9 12:59:37.637912 containerd[1624]: time="2025-07-09T12:59:37.637828462Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jul 9 12:59:37.638080 containerd[1624]: time="2025-07-09T12:59:37.638044684Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jul 9 12:59:37.638080 containerd[1624]: time="2025-07-09T12:59:37.638086877Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jul 9 12:59:37.638440 
containerd[1624]: time="2025-07-09T12:59:37.638125933Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jul 9 12:59:37.638440 containerd[1624]: time="2025-07-09T12:59:37.638195630Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jul 9 12:59:37.638440 containerd[1624]: time="2025-07-09T12:59:37.638208185Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jul 9 12:59:37.638440 containerd[1624]: time="2025-07-09T12:59:37.638214135Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jul 9 12:59:37.638440 containerd[1624]: time="2025-07-09T12:59:37.638219154Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jul 9 12:59:37.638440 containerd[1624]: time="2025-07-09T12:59:37.638224859Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jul 9 12:59:37.638440 containerd[1624]: time="2025-07-09T12:59:37.638234838Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jul 9 12:59:37.638440 containerd[1624]: time="2025-07-09T12:59:37.638241911Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jul 9 12:59:37.638440 containerd[1624]: time="2025-07-09T12:59:37.638254684Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jul 9 12:59:37.638440 containerd[1624]: time="2025-07-09T12:59:37.638261008Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jul 9 12:59:37.638440 containerd[1624]: time="2025-07-09T12:59:37.638267044Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jul 9 12:59:37.638440 containerd[1624]: time="2025-07-09T12:59:37.638284029Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jul 9 12:59:37.638440 containerd[1624]: time="2025-07-09T12:59:37.638291890Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jul 9 12:59:37.638440 containerd[1624]: time="2025-07-09T12:59:37.638296717Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jul 9 12:59:37.641394 containerd[1624]: time="2025-07-09T12:59:37.641361197Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jul 9 12:59:37.641394 containerd[1624]: time="2025-07-09T12:59:37.641383900Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jul 9 12:59:37.641394 containerd[1624]: time="2025-07-09T12:59:37.641395055Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jul 9 12:59:37.641525 containerd[1624]: time="2025-07-09T12:59:37.641402385Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jul 9 12:59:37.641525 containerd[1624]: time="2025-07-09T12:59:37.641413569Z" level=info msg="runtime interface created" Jul 9 12:59:37.641525 containerd[1624]: time="2025-07-09T12:59:37.641416862Z" level=info msg="created NRI interface" Jul 9 
12:59:37.641525 containerd[1624]: time="2025-07-09T12:59:37.641421338Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jul 9 12:59:37.641525 containerd[1624]: time="2025-07-09T12:59:37.641444870Z" level=info msg="Connect containerd service" Jul 9 12:59:37.641525 containerd[1624]: time="2025-07-09T12:59:37.641470025Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jul 9 12:59:37.641942 containerd[1624]: time="2025-07-09T12:59:37.641920847Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jul 9 12:59:37.713013 sshd_keygen[1638]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jul 9 12:59:37.766127 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jul 9 12:59:37.768898 systemd[1]: Starting issuegen.service - Generate /run/issue... Jul 9 12:59:37.791996 systemd[1]: issuegen.service: Deactivated successfully. Jul 9 12:59:37.792513 systemd[1]: Finished issuegen.service - Generate /run/issue. Jul 9 12:59:37.802770 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jul 9 12:59:37.806461 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jul 9 12:59:37.826706 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jul 9 12:59:37.829463 systemd[1]: Started getty@tty1.service - Getty on tty1. Jul 9 12:59:37.831950 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Jul 9 12:59:37.832200 systemd[1]: Reached target getty.target - Login Prompts. Jul 9 12:59:37.856532 containerd[1624]: time="2025-07-09T12:59:37.856510913Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jul 9 12:59:37.856817 containerd[1624]: time="2025-07-09T12:59:37.856664479Z" level=info msg=serving... address=/run/containerd/containerd.sock Jul 9 12:59:37.856817 containerd[1624]: time="2025-07-09T12:59:37.856685181Z" level=info msg="Start subscribing containerd event" Jul 9 12:59:37.856817 containerd[1624]: time="2025-07-09T12:59:37.856750145Z" level=info msg="Start recovering state" Jul 9 12:59:37.856897 containerd[1624]: time="2025-07-09T12:59:37.856889812Z" level=info msg="Start event monitor" Jul 9 12:59:37.856930 containerd[1624]: time="2025-07-09T12:59:37.856923989Z" level=info msg="Start cni network conf syncer for default" Jul 9 12:59:37.859740 containerd[1624]: time="2025-07-09T12:59:37.859729620Z" level=info msg="Start streaming server" Jul 9 12:59:37.859792 containerd[1624]: time="2025-07-09T12:59:37.859771340Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jul 9 12:59:37.859859 containerd[1624]: time="2025-07-09T12:59:37.859818928Z" level=info msg="runtime interface starting up..." Jul 9 12:59:37.859859 containerd[1624]: time="2025-07-09T12:59:37.859826008Z" level=info msg="starting plugins..." Jul 9 12:59:37.859859 containerd[1624]: time="2025-07-09T12:59:37.859837136Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jul 9 12:59:37.860036 systemd[1]: Started containerd.service - containerd container runtime. 
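The containerd error just above ("no network config found in /etc/cni/net.d") is expected on a node that has no pod network yet; the CRI plugin retries once a CNI config appears. For reference, a hedged sketch of the kind of conflist that satisfies the check, assuming the standard bridge/host-local/portmap plugins exist under /opt/cni/bin (the log only shows the configured directories, not the installed plugins; a Kubernetes network add-on would normally write its own file here):

# illustrative only: most clusters let the network add-on create this file itself
mkdir -p /etc/cni/net.d
cat > /etc/cni/net.d/10-containerd-net.conflist <<'EOF'
{
  "cniVersion": "1.0.0",
  "name": "containerd-net",
  "plugins": [
    {
      "type": "bridge",
      "bridge": "cni0",
      "isGateway": true,
      "ipMasq": true,
      "ipam": { "type": "host-local", "ranges": [[ { "subnet": "10.88.0.0/16" } ]] }
    },
    { "type": "portmap", "capabilities": { "portMappings": true } }
  ]
}
EOF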
Jul 9 12:59:37.864443 tar[1607]: linux-amd64/README.md Jul 9 12:59:37.865010 containerd[1624]: time="2025-07-09T12:59:37.864993279Z" level=info msg="containerd successfully booted in 0.275431s" Jul 9 12:59:37.887273 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jul 9 12:59:39.164470 systemd-networkd[1509]: ens192: Gained IPv6LL Jul 9 12:59:39.164783 systemd-timesyncd[1541]: Network configuration changed, trying to establish connection. Jul 9 12:59:39.165970 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jul 9 12:59:39.166366 systemd[1]: Reached target network-online.target - Network is Online. Jul 9 12:59:39.167511 systemd[1]: Starting coreos-metadata.service - VMware metadata agent... Jul 9 12:59:39.169390 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 9 12:59:39.171437 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jul 9 12:59:39.194186 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jul 9 12:59:39.203644 systemd[1]: coreos-metadata.service: Deactivated successfully. Jul 9 12:59:39.203784 systemd[1]: Finished coreos-metadata.service - VMware metadata agent. Jul 9 12:59:39.204391 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jul 9 12:59:40.244859 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 9 12:59:40.245182 systemd[1]: Reached target multi-user.target - Multi-User System. Jul 9 12:59:40.245696 systemd[1]: Startup finished in 2.710s (kernel) + 6.938s (initrd) + 5.764s (userspace) = 15.413s. Jul 9 12:59:40.252620 (kubelet)[1807]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 9 12:59:40.276832 login[1771]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Jul 9 12:59:40.278073 login[1772]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Jul 9 12:59:40.285895 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jul 9 12:59:40.286814 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jul 9 12:59:40.291681 systemd-logind[1599]: New session 1 of user core. Jul 9 12:59:40.295277 systemd-logind[1599]: New session 2 of user core. Jul 9 12:59:40.305266 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jul 9 12:59:40.308479 systemd[1]: Starting user@500.service - User Manager for UID 500... Jul 9 12:59:40.328653 (systemd)[1814]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Jul 9 12:59:40.330582 systemd-logind[1599]: New session c1 of user core. Jul 9 12:59:40.422651 systemd[1814]: Queued start job for default target default.target. Jul 9 12:59:40.430165 systemd[1814]: Created slice app.slice - User Application Slice. Jul 9 12:59:40.430183 systemd[1814]: Reached target paths.target - Paths. Jul 9 12:59:40.430208 systemd[1814]: Reached target timers.target - Timers. Jul 9 12:59:40.430884 systemd[1814]: Starting dbus.socket - D-Bus User Message Bus Socket... Jul 9 12:59:40.441455 systemd[1814]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jul 9 12:59:40.441523 systemd[1814]: Reached target sockets.target - Sockets. Jul 9 12:59:40.441557 systemd[1814]: Reached target basic.target - Basic System. 
Jul 9 12:59:40.441578 systemd[1814]: Reached target default.target - Main User Target. Jul 9 12:59:40.441593 systemd[1814]: Startup finished in 107ms. Jul 9 12:59:40.441630 systemd[1]: Started user@500.service - User Manager for UID 500. Jul 9 12:59:40.448384 systemd[1]: Started session-1.scope - Session 1 of User core. Jul 9 12:59:40.448987 systemd[1]: Started session-2.scope - Session 2 of User core. Jul 9 12:59:40.879962 kubelet[1807]: E0709 12:59:40.879929 1807 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 9 12:59:40.881436 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 9 12:59:40.881581 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 9 12:59:40.881944 systemd[1]: kubelet.service: Consumed 684ms CPU time, 263.9M memory peak. Jul 9 12:59:40.939650 systemd-timesyncd[1541]: Network configuration changed, trying to establish connection. Jul 9 12:59:51.132033 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jul 9 12:59:51.133882 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 9 12:59:51.562865 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 9 12:59:51.571586 (kubelet)[1859]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 9 12:59:51.620885 kubelet[1859]: E0709 12:59:51.620849 1859 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 9 12:59:51.623462 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 9 12:59:51.623616 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 9 12:59:51.624015 systemd[1]: kubelet.service: Consumed 108ms CPU time, 108.6M memory peak. Jul 9 13:00:01.874029 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jul 9 13:00:01.875656 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 9 13:00:02.325863 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 9 13:00:02.328759 (kubelet)[1874]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 9 13:00:02.357226 kubelet[1874]: E0709 13:00:02.357191 1874 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 9 13:00:02.358627 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 9 13:00:02.358717 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 9 13:00:02.359053 systemd[1]: kubelet.service: Consumed 97ms CPU time, 110.4M memory peak. Jul 9 13:00:07.408042 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. 
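The kubelet exits above (restart counters 1 and 2) are the usual pre-bootstrap loop: the unit starts, finds no /var/lib/kubelet/config.yaml, and fails until something writes that file. A short sketch of how this is typically confirmed and resolved, assuming kubeadm is the intended bootstrap tool (the log itself does not say which tool will write the config):

# confirm the only problem is the missing bootstrap-time config
systemctl status kubelet --no-pager
ls -l /var/lib/kubelet/config.yaml 2>/dev/null || echo "not bootstrapped yet"
# with kubeadm, 'kubeadm init' (or 'kubeadm join') writes /var/lib/kubelet/config.yaml
# and restarts the kubelet, at which point the restart loop above stops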
Jul 9 13:00:07.409057 systemd[1]: Started sshd@0-139.178.70.105:22-139.178.68.195:34790.service - OpenSSH per-connection server daemon (139.178.68.195:34790). Jul 9 13:00:07.496476 sshd[1882]: Accepted publickey for core from 139.178.68.195 port 34790 ssh2: RSA SHA256:pHehh7tc90QOyf1uGohWVF4tJIie1SMOFA2c8G1DmZI Jul 9 13:00:07.497296 sshd-session[1882]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 13:00:07.501032 systemd-logind[1599]: New session 3 of user core. Jul 9 13:00:07.516517 systemd[1]: Started session-3.scope - Session 3 of User core. Jul 9 13:00:07.574441 systemd[1]: Started sshd@1-139.178.70.105:22-139.178.68.195:34796.service - OpenSSH per-connection server daemon (139.178.68.195:34796). Jul 9 13:00:07.609269 sshd[1888]: Accepted publickey for core from 139.178.68.195 port 34796 ssh2: RSA SHA256:pHehh7tc90QOyf1uGohWVF4tJIie1SMOFA2c8G1DmZI Jul 9 13:00:07.609921 sshd-session[1888]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 13:00:07.612588 systemd-logind[1599]: New session 4 of user core. Jul 9 13:00:07.620373 systemd[1]: Started session-4.scope - Session 4 of User core. Jul 9 13:00:07.667924 sshd[1891]: Connection closed by 139.178.68.195 port 34796 Jul 9 13:00:07.668207 sshd-session[1888]: pam_unix(sshd:session): session closed for user core Jul 9 13:00:07.677148 systemd[1]: sshd@1-139.178.70.105:22-139.178.68.195:34796.service: Deactivated successfully. Jul 9 13:00:07.678299 systemd[1]: session-4.scope: Deactivated successfully. Jul 9 13:00:07.678881 systemd-logind[1599]: Session 4 logged out. Waiting for processes to exit. Jul 9 13:00:07.680437 systemd[1]: Started sshd@2-139.178.70.105:22-139.178.68.195:34802.service - OpenSSH per-connection server daemon (139.178.68.195:34802). Jul 9 13:00:07.680993 systemd-logind[1599]: Removed session 4. Jul 9 13:00:07.716034 sshd[1897]: Accepted publickey for core from 139.178.68.195 port 34802 ssh2: RSA SHA256:pHehh7tc90QOyf1uGohWVF4tJIie1SMOFA2c8G1DmZI Jul 9 13:00:07.716874 sshd-session[1897]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 13:00:07.719896 systemd-logind[1599]: New session 5 of user core. Jul 9 13:00:07.727461 systemd[1]: Started session-5.scope - Session 5 of User core. Jul 9 13:00:07.774086 sshd[1900]: Connection closed by 139.178.68.195 port 34802 Jul 9 13:00:07.773447 sshd-session[1897]: pam_unix(sshd:session): session closed for user core Jul 9 13:00:07.783496 systemd[1]: sshd@2-139.178.70.105:22-139.178.68.195:34802.service: Deactivated successfully. Jul 9 13:00:07.784341 systemd[1]: session-5.scope: Deactivated successfully. Jul 9 13:00:07.784777 systemd-logind[1599]: Session 5 logged out. Waiting for processes to exit. Jul 9 13:00:07.785964 systemd[1]: Started sshd@3-139.178.70.105:22-139.178.68.195:34814.service - OpenSSH per-connection server daemon (139.178.68.195:34814). Jul 9 13:00:07.787499 systemd-logind[1599]: Removed session 5. Jul 9 13:00:07.819372 sshd[1906]: Accepted publickey for core from 139.178.68.195 port 34814 ssh2: RSA SHA256:pHehh7tc90QOyf1uGohWVF4tJIie1SMOFA2c8G1DmZI Jul 9 13:00:07.820142 sshd-session[1906]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 13:00:07.822739 systemd-logind[1599]: New session 6 of user core. Jul 9 13:00:07.833680 systemd[1]: Started session-6.scope - Session 6 of User core. 
Jul 9 13:00:07.882133 sshd[1909]: Connection closed by 139.178.68.195 port 34814 Jul 9 13:00:07.882808 sshd-session[1906]: pam_unix(sshd:session): session closed for user core Jul 9 13:00:07.888976 systemd[1]: sshd@3-139.178.70.105:22-139.178.68.195:34814.service: Deactivated successfully. Jul 9 13:00:07.889860 systemd[1]: session-6.scope: Deactivated successfully. Jul 9 13:00:07.890277 systemd-logind[1599]: Session 6 logged out. Waiting for processes to exit. Jul 9 13:00:07.893463 systemd[1]: Started sshd@4-139.178.70.105:22-139.178.68.195:34816.service - OpenSSH per-connection server daemon (139.178.68.195:34816). Jul 9 13:00:07.894089 systemd-logind[1599]: Removed session 6. Jul 9 13:00:07.933258 sshd[1915]: Accepted publickey for core from 139.178.68.195 port 34816 ssh2: RSA SHA256:pHehh7tc90QOyf1uGohWVF4tJIie1SMOFA2c8G1DmZI Jul 9 13:00:07.934054 sshd-session[1915]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 13:00:07.937344 systemd-logind[1599]: New session 7 of user core. Jul 9 13:00:07.943408 systemd[1]: Started session-7.scope - Session 7 of User core. Jul 9 13:00:07.999918 sudo[1919]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jul 9 13:00:08.000101 sudo[1919]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 9 13:00:08.014608 sudo[1919]: pam_unix(sudo:session): session closed for user root Jul 9 13:00:08.015859 sshd[1918]: Connection closed by 139.178.68.195 port 34816 Jul 9 13:00:08.015782 sshd-session[1915]: pam_unix(sshd:session): session closed for user core Jul 9 13:00:08.021986 systemd[1]: sshd@4-139.178.70.105:22-139.178.68.195:34816.service: Deactivated successfully. Jul 9 13:00:08.023132 systemd[1]: session-7.scope: Deactivated successfully. Jul 9 13:00:08.023791 systemd-logind[1599]: Session 7 logged out. Waiting for processes to exit. Jul 9 13:00:08.025919 systemd[1]: Started sshd@5-139.178.70.105:22-139.178.68.195:34830.service - OpenSSH per-connection server daemon (139.178.68.195:34830). Jul 9 13:00:08.026725 systemd-logind[1599]: Removed session 7. Jul 9 13:00:08.061911 sshd[1925]: Accepted publickey for core from 139.178.68.195 port 34830 ssh2: RSA SHA256:pHehh7tc90QOyf1uGohWVF4tJIie1SMOFA2c8G1DmZI Jul 9 13:00:08.062887 sshd-session[1925]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 13:00:08.066287 systemd-logind[1599]: New session 8 of user core. Jul 9 13:00:08.076428 systemd[1]: Started session-8.scope - Session 8 of User core. Jul 9 13:00:08.127290 sudo[1930]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jul 9 13:00:08.127508 sudo[1930]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 9 13:00:08.130392 sudo[1930]: pam_unix(sudo:session): session closed for user root Jul 9 13:00:08.134066 sudo[1929]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jul 9 13:00:08.134463 sudo[1929]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 9 13:00:08.141873 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jul 9 13:00:08.168003 augenrules[1952]: No rules Jul 9 13:00:08.168723 systemd[1]: audit-rules.service: Deactivated successfully. Jul 9 13:00:08.168978 systemd[1]: Finished audit-rules.service - Load Audit Rules. 
Jul 9 13:00:08.169717 sudo[1929]: pam_unix(sudo:session): session closed for user root Jul 9 13:00:08.170460 sshd[1928]: Connection closed by 139.178.68.195 port 34830 Jul 9 13:00:08.170658 sshd-session[1925]: pam_unix(sshd:session): session closed for user core Jul 9 13:00:08.175487 systemd[1]: sshd@5-139.178.70.105:22-139.178.68.195:34830.service: Deactivated successfully. Jul 9 13:00:08.176318 systemd[1]: session-8.scope: Deactivated successfully. Jul 9 13:00:08.176772 systemd-logind[1599]: Session 8 logged out. Waiting for processes to exit. Jul 9 13:00:08.177859 systemd[1]: Started sshd@6-139.178.70.105:22-139.178.68.195:34832.service - OpenSSH per-connection server daemon (139.178.68.195:34832). Jul 9 13:00:08.179918 systemd-logind[1599]: Removed session 8. Jul 9 13:00:08.215954 sshd[1961]: Accepted publickey for core from 139.178.68.195 port 34832 ssh2: RSA SHA256:pHehh7tc90QOyf1uGohWVF4tJIie1SMOFA2c8G1DmZI Jul 9 13:00:08.216664 sshd-session[1961]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 13:00:08.219028 systemd-logind[1599]: New session 9 of user core. Jul 9 13:00:08.226502 systemd[1]: Started session-9.scope - Session 9 of User core. Jul 9 13:00:08.275496 sudo[1965]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jul 9 13:00:08.275921 sudo[1965]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 9 13:00:08.569904 systemd[1]: Starting docker.service - Docker Application Container Engine... Jul 9 13:00:08.580491 (dockerd)[1983]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jul 9 13:00:08.789888 dockerd[1983]: time="2025-07-09T13:00:08.789853117Z" level=info msg="Starting up" Jul 9 13:00:08.790222 dockerd[1983]: time="2025-07-09T13:00:08.790209941Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jul 9 13:00:08.795983 dockerd[1983]: time="2025-07-09T13:00:08.795958124Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Jul 9 13:00:08.819624 dockerd[1983]: time="2025-07-09T13:00:08.819595267Z" level=info msg="Loading containers: start." Jul 9 13:00:08.828375 kernel: Initializing XFRM netlink socket Jul 9 13:00:08.956372 systemd-timesyncd[1541]: Network configuration changed, trying to establish connection. Jul 9 13:00:08.979775 systemd-networkd[1509]: docker0: Link UP Jul 9 13:00:08.983689 dockerd[1983]: time="2025-07-09T13:00:08.983655780Z" level=info msg="Loading containers: done." Jul 9 13:00:08.992245 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck252141461-merged.mount: Deactivated successfully. 
Jul 9 13:00:08.993342 dockerd[1983]: time="2025-07-09T13:00:08.993319396Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jul 9 13:00:08.993390 dockerd[1983]: time="2025-07-09T13:00:08.993377253Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Jul 9 13:00:08.993431 dockerd[1983]: time="2025-07-09T13:00:08.993420099Z" level=info msg="Initializing buildkit" Jul 9 13:00:09.003676 dockerd[1983]: time="2025-07-09T13:00:09.003649013Z" level=info msg="Completed buildkit initialization" Jul 9 13:00:09.006160 dockerd[1983]: time="2025-07-09T13:00:09.006136741Z" level=info msg="Daemon has completed initialization" Jul 9 13:00:09.006521 dockerd[1983]: time="2025-07-09T13:00:09.006169760Z" level=info msg="API listen on /run/docker.sock" Jul 9 13:00:09.006315 systemd[1]: Started docker.service - Docker Application Container Engine. Jul 9 13:01:35.086118 systemd-resolved[1511]: Clock change detected. Flushing caches. Jul 9 13:01:35.086746 systemd-timesyncd[1541]: Contacted time server 185.122.164.153:123 (2.flatcar.pool.ntp.org). Jul 9 13:01:35.086793 systemd-timesyncd[1541]: Initial clock synchronization to Wed 2025-07-09 13:01:35.086009 UTC. Jul 9 13:01:35.872614 containerd[1624]: time="2025-07-09T13:01:35.872547747Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.6\"" Jul 9 13:01:36.396003 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount551394360.mount: Deactivated successfully. Jul 9 13:01:37.819201 containerd[1624]: time="2025-07-09T13:01:37.819166496Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:01:37.820049 containerd[1624]: time="2025-07-09T13:01:37.819983514Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.6: active requests=0, bytes read=28799045" Jul 9 13:01:37.820413 containerd[1624]: time="2025-07-09T13:01:37.820395884Z" level=info msg="ImageCreate event name:\"sha256:8c5b95b1b5cb4a908fcbbbe81697c57019f9e9d89bfb5e0355235d440b7a6aa9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:01:37.821603 containerd[1624]: time="2025-07-09T13:01:37.821575837Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:0f5764551d7de4ef70489ff8a70f32df7dea00701f5545af089b60bc5ede4f6f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:01:37.822486 containerd[1624]: time="2025-07-09T13:01:37.822103556Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.6\" with image id \"sha256:8c5b95b1b5cb4a908fcbbbe81697c57019f9e9d89bfb5e0355235d440b7a6aa9\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.6\", repo digest \"registry.k8s.io/kube-apiserver@sha256:0f5764551d7de4ef70489ff8a70f32df7dea00701f5545af089b60bc5ede4f6f\", size \"28795845\" in 1.949532261s" Jul 9 13:01:37.822486 containerd[1624]: time="2025-07-09T13:01:37.822123521Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.6\" returns image reference \"sha256:8c5b95b1b5cb4a908fcbbbe81697c57019f9e9d89bfb5e0355235d440b7a6aa9\"" Jul 9 13:01:37.822723 containerd[1624]: time="2025-07-09T13:01:37.822692183Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.6\"" Jul 9 13:01:38.453991 systemd[1]: kubelet.service: Scheduled restart job, restart counter 
is at 3. Jul 9 13:01:38.456776 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 9 13:01:39.198508 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 9 13:01:39.201055 (kubelet)[2256]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 9 13:01:39.231625 kubelet[2256]: E0709 13:01:39.231592 2256 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 9 13:01:39.232969 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 9 13:01:39.233106 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 9 13:01:39.233436 systemd[1]: kubelet.service: Consumed 99ms CPU time, 110.5M memory peak. Jul 9 13:01:40.846865 containerd[1624]: time="2025-07-09T13:01:40.846833351Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:01:40.854378 containerd[1624]: time="2025-07-09T13:01:40.854355003Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.6: active requests=0, bytes read=24783912" Jul 9 13:01:40.864908 containerd[1624]: time="2025-07-09T13:01:40.864886285Z" level=info msg="ImageCreate event name:\"sha256:77d0e7de0c6b41e2331c3997698c3f917527cf7bbe462f5c813f514e788436de\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:01:40.878742 containerd[1624]: time="2025-07-09T13:01:40.878709593Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:3425f29c94a77d74cb89f38413e6274277dcf5e2bc7ab6ae953578a91e9e8356\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:01:40.879252 containerd[1624]: time="2025-07-09T13:01:40.879174166Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.6\" with image id \"sha256:77d0e7de0c6b41e2331c3997698c3f917527cf7bbe462f5c813f514e788436de\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.6\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:3425f29c94a77d74cb89f38413e6274277dcf5e2bc7ab6ae953578a91e9e8356\", size \"26385746\" in 3.056379507s" Jul 9 13:01:40.879252 containerd[1624]: time="2025-07-09T13:01:40.879190197Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.6\" returns image reference \"sha256:77d0e7de0c6b41e2331c3997698c3f917527cf7bbe462f5c813f514e788436de\"" Jul 9 13:01:40.879486 containerd[1624]: time="2025-07-09T13:01:40.879469406Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.6\"" Jul 9 13:01:42.336507 containerd[1624]: time="2025-07-09T13:01:42.336463737Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:01:42.341138 containerd[1624]: time="2025-07-09T13:01:42.341115359Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.6: active requests=0, bytes read=19176916" Jul 9 13:01:42.345993 containerd[1624]: time="2025-07-09T13:01:42.345963772Z" level=info msg="ImageCreate event name:\"sha256:b34d1cd163151c2491919f315274d85bff904721213f2b19341b403a28a39ae2\" labels:{key:\"io.cri-containerd.image\" 
value:\"managed\"}" Jul 9 13:01:42.350210 containerd[1624]: time="2025-07-09T13:01:42.350185165Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:130f633cbd1d70e2f4655350153cb3fc469f4d5a6310b4f0b49d93fb2ba2132b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:01:42.350690 containerd[1624]: time="2025-07-09T13:01:42.350557076Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.6\" with image id \"sha256:b34d1cd163151c2491919f315274d85bff904721213f2b19341b403a28a39ae2\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.6\", repo digest \"registry.k8s.io/kube-scheduler@sha256:130f633cbd1d70e2f4655350153cb3fc469f4d5a6310b4f0b49d93fb2ba2132b\", size \"20778768\" in 1.471000617s" Jul 9 13:01:42.350690 containerd[1624]: time="2025-07-09T13:01:42.350577494Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.6\" returns image reference \"sha256:b34d1cd163151c2491919f315274d85bff904721213f2b19341b403a28a39ae2\"" Jul 9 13:01:42.350918 containerd[1624]: time="2025-07-09T13:01:42.350899558Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.6\"" Jul 9 13:01:43.748436 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1793435503.mount: Deactivated successfully. Jul 9 13:01:44.174819 containerd[1624]: time="2025-07-09T13:01:44.174568830Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:01:44.182024 containerd[1624]: time="2025-07-09T13:01:44.182007081Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.6: active requests=0, bytes read=30895363" Jul 9 13:01:44.186253 containerd[1624]: time="2025-07-09T13:01:44.186236995Z" level=info msg="ImageCreate event name:\"sha256:63f0cbe3b7339c5d006efc9964228e48271bae73039320037c451b5e8f763e02\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:01:44.188968 containerd[1624]: time="2025-07-09T13:01:44.188933102Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:b13d9da413b983d130bf090b83fce12e1ccc704e95f366da743c18e964d9d7e9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:01:44.189488 containerd[1624]: time="2025-07-09T13:01:44.189390360Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.6\" with image id \"sha256:63f0cbe3b7339c5d006efc9964228e48271bae73039320037c451b5e8f763e02\", repo tag \"registry.k8s.io/kube-proxy:v1.32.6\", repo digest \"registry.k8s.io/kube-proxy@sha256:b13d9da413b983d130bf090b83fce12e1ccc704e95f366da743c18e964d9d7e9\", size \"30894382\" in 1.838472199s" Jul 9 13:01:44.189488 containerd[1624]: time="2025-07-09T13:01:44.189410170Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.6\" returns image reference \"sha256:63f0cbe3b7339c5d006efc9964228e48271bae73039320037c451b5e8f763e02\"" Jul 9 13:01:44.190102 containerd[1624]: time="2025-07-09T13:01:44.190054051Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Jul 9 13:01:44.815600 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1673082538.mount: Deactivated successfully. 
Jul 9 13:01:45.911812 containerd[1624]: time="2025-07-09T13:01:45.911773068Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:01:45.920690 containerd[1624]: time="2025-07-09T13:01:45.920592558Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241" Jul 9 13:01:45.928968 containerd[1624]: time="2025-07-09T13:01:45.928938020Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:01:45.936762 containerd[1624]: time="2025-07-09T13:01:45.936723506Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:01:45.937220 containerd[1624]: time="2025-07-09T13:01:45.937103256Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.746930795s" Jul 9 13:01:45.937220 containerd[1624]: time="2025-07-09T13:01:45.937124786Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Jul 9 13:01:45.937444 containerd[1624]: time="2025-07-09T13:01:45.937427349Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jul 9 13:01:46.381259 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1988963644.mount: Deactivated successfully. 
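The pull messages above and below record containerd's CRI image service fetching the control-plane images one by one, reporting bytes read, the resolved digest, and the pull duration. Purely for reference, a minimal sketch of the same operation through containerd's Go client follows; the socket path and the "k8s.io" namespace are the conventional ones for a kubelet-managed node, and the import path is an assumption that differs between containerd 1.x (github.com/containerd/containerd) and the 2.x client module.

package main

import (
	"context"
	"log"

	containerd "github.com/containerd/containerd" // assumption: 1.x module path; 2.x moves this to .../containerd/v2/client
	"github.com/containerd/containerd/namespaces"
)

func main() {
	// Connect to the containerd socket the CRI on this node exposes.
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// Kubernetes-managed images live in the "k8s.io" containerd namespace.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	// Pull and unpack one of the images seen in the log above.
	img, err := client.Pull(ctx, "registry.k8s.io/coredns/coredns:v1.11.3", containerd.WithPullUnpack)
	if err != nil {
		log.Fatal(err)
	}
	log.Printf("pulled %s digest=%s", img.Name(), img.Target().Digest)
}

The recurring "var-lib-containerd-tmpmounts-containerd\x2dmount…: Deactivated successfully" units appear to be the temporary mounts containerd sets up while unpacking image layers being cleaned up after each pull.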
Jul 9 13:01:46.383232 containerd[1624]: time="2025-07-09T13:01:46.383208702Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jul 9 13:01:46.383880 containerd[1624]: time="2025-07-09T13:01:46.383862312Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Jul 9 13:01:46.384213 containerd[1624]: time="2025-07-09T13:01:46.384197919Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jul 9 13:01:46.385340 containerd[1624]: time="2025-07-09T13:01:46.385318140Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jul 9 13:01:46.385770 containerd[1624]: time="2025-07-09T13:01:46.385706371Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 448.261298ms" Jul 9 13:01:46.385770 containerd[1624]: time="2025-07-09T13:01:46.385723005Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Jul 9 13:01:46.386191 containerd[1624]: time="2025-07-09T13:01:46.385938978Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Jul 9 13:01:46.867062 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3583881196.mount: Deactivated successfully. Jul 9 13:01:48.845747 update_engine[1600]: I20250709 13:01:48.845702 1600 update_attempter.cc:509] Updating boot flags... 
Jul 9 13:01:48.870003 containerd[1624]: time="2025-07-09T13:01:48.869555754Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:01:48.876924 containerd[1624]: time="2025-07-09T13:01:48.876895402Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57551360" Jul 9 13:01:48.883981 containerd[1624]: time="2025-07-09T13:01:48.883948773Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:01:48.889428 containerd[1624]: time="2025-07-09T13:01:48.889400283Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:01:48.890714 containerd[1624]: time="2025-07-09T13:01:48.890413198Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 2.504180946s" Jul 9 13:01:48.890714 containerd[1624]: time="2025-07-09T13:01:48.890437988Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" Jul 9 13:01:49.454350 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Jul 9 13:01:49.456537 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 9 13:01:50.407470 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 9 13:01:50.410425 (kubelet)[2435]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 9 13:01:50.448435 kubelet[2435]: E0709 13:01:50.448401 2435 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 9 13:01:50.449812 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 9 13:01:50.449896 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 9 13:01:50.450158 systemd[1]: kubelet.service: Consumed 100ms CPU time, 108M memory peak. Jul 9 13:01:51.140640 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jul 9 13:01:51.140749 systemd[1]: kubelet.service: Consumed 100ms CPU time, 108M memory peak. Jul 9 13:01:51.142071 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 9 13:01:51.161034 systemd[1]: Reload requested from client PID 2449 ('systemctl') (unit session-9.scope)... Jul 9 13:01:51.161127 systemd[1]: Reloading... Jul 9 13:01:51.239756 zram_generator::config[2492]: No configuration found. Jul 9 13:01:51.293413 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. 
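By this point systemd has restarted the kubelet several times and it keeps exiting with status 1 because /var/lib/kubelet/config.yaml does not exist yet; that file is normally written by kubeadm during init/join, so the failures are expected on a node still being bootstrapped. Purely as an illustration of what the kubelet is looking for, the sketch below writes a minimal KubeletConfiguration; the field set is an assumption (a real kubeadm-generated file is much larger), with values chosen to match what the log itself reports later (systemd cgroup driver, static pods read from /etc/kubernetes/manifests).

package main

import (
	"log"
	"os"
)

// Minimal stand-in for the file kubeadm generates at /var/lib/kubelet/config.yaml.
// Field names come from the kubelet.config.k8s.io/v1beta1 KubeletConfiguration API;
// the values below are assumptions for this sketch.
const kubeletConfigYAML = `apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
cgroupDriver: systemd
staticPodPath: /etc/kubernetes/manifests
`

func main() {
	if err := os.MkdirAll("/var/lib/kubelet", 0o755); err != nil {
		log.Fatal(err)
	}
	if err := os.WriteFile("/var/lib/kubelet/config.yaml", []byte(kubeletConfigYAML), 0o644); err != nil {
		log.Fatal(err)
	}
	log.Println("wrote minimal kubelet config; kubelet.service can now load it")
}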
Jul 9 13:01:51.301360 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Jul 9 13:01:51.369995 systemd[1]: Reloading finished in 208 ms. Jul 9 13:01:51.392704 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jul 9 13:01:51.392766 systemd[1]: kubelet.service: Failed with result 'signal'. Jul 9 13:01:51.392964 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jul 9 13:01:51.394183 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 9 13:01:51.760969 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 9 13:01:51.763418 (kubelet)[2560]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jul 9 13:01:51.792493 kubelet[2560]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jul 9 13:01:51.792493 kubelet[2560]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jul 9 13:01:51.792493 kubelet[2560]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jul 9 13:01:51.793027 kubelet[2560]: I0709 13:01:51.792527 2560 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jul 9 13:01:52.028864 kubelet[2560]: I0709 13:01:52.028804 2560 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Jul 9 13:01:52.028864 kubelet[2560]: I0709 13:01:52.028822 2560 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jul 9 13:01:52.029026 kubelet[2560]: I0709 13:01:52.028969 2560 server.go:954] "Client rotation is on, will bootstrap in background" Jul 9 13:01:52.056682 kubelet[2560]: I0709 13:01:52.056662 2560 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jul 9 13:01:52.060235 kubelet[2560]: E0709 13:01:52.060203 2560 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://139.178.70.105:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.70.105:6443: connect: connection refused" logger="UnhandledError" Jul 9 13:01:52.067785 kubelet[2560]: I0709 13:01:52.067773 2560 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jul 9 13:01:52.072468 kubelet[2560]: I0709 13:01:52.072243 2560 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jul 9 13:01:52.074505 kubelet[2560]: I0709 13:01:52.074487 2560 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jul 9 13:01:52.074638 kubelet[2560]: I0709 13:01:52.074543 2560 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jul 9 13:01:52.076161 kubelet[2560]: I0709 13:01:52.076151 2560 topology_manager.go:138] "Creating topology manager with none policy" Jul 9 13:01:52.076212 kubelet[2560]: I0709 13:01:52.076206 2560 container_manager_linux.go:304] "Creating device plugin manager" Jul 9 13:01:52.077136 kubelet[2560]: I0709 13:01:52.077126 2560 state_mem.go:36] "Initialized new in-memory state store" Jul 9 13:01:52.080340 kubelet[2560]: I0709 13:01:52.080332 2560 kubelet.go:446] "Attempting to sync node with API server" Jul 9 13:01:52.080392 kubelet[2560]: I0709 13:01:52.080386 2560 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Jul 9 13:01:52.081798 kubelet[2560]: I0709 13:01:52.081743 2560 kubelet.go:352] "Adding apiserver pod source" Jul 9 13:01:52.081798 kubelet[2560]: I0709 13:01:52.081753 2560 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jul 9 13:01:52.089808 kubelet[2560]: I0709 13:01:52.089793 2560 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Jul 9 13:01:52.092229 kubelet[2560]: W0709 13:01:52.092011 2560 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.105:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.105:6443: connect: connection refused Jul 9 13:01:52.092229 kubelet[2560]: E0709 13:01:52.092048 2560 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://139.178.70.105:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.105:6443: connect: connection refused" logger="UnhandledError" Jul 9 13:01:52.092229 kubelet[2560]: W0709 13:01:52.092082 2560 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.105:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.105:6443: connect: connection refused Jul 9 13:01:52.092229 kubelet[2560]: E0709 13:01:52.092100 2560 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.105:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.105:6443: connect: connection refused" logger="UnhandledError" Jul 9 13:01:52.092229 kubelet[2560]: I0709 13:01:52.092149 2560 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jul 9 13:01:52.092904 kubelet[2560]: W0709 13:01:52.092895 2560 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jul 9 13:01:52.093233 kubelet[2560]: I0709 13:01:52.093225 2560 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jul 9 13:01:52.093281 kubelet[2560]: I0709 13:01:52.093277 2560 server.go:1287] "Started kubelet" Jul 9 13:01:52.094035 kubelet[2560]: I0709 13:01:52.094022 2560 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Jul 9 13:01:52.095192 kubelet[2560]: I0709 13:01:52.095184 2560 server.go:479] "Adding debug handlers to kubelet server" Jul 9 13:01:52.095448 kubelet[2560]: I0709 13:01:52.095433 2560 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jul 9 13:01:52.095772 kubelet[2560]: I0709 13:01:52.095750 2560 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jul 9 13:01:52.095904 kubelet[2560]: I0709 13:01:52.095897 2560 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jul 9 13:01:52.099085 kubelet[2560]: I0709 13:01:52.099071 2560 volume_manager.go:297] "Starting Kubelet Volume Manager" Jul 9 13:01:52.100049 kubelet[2560]: E0709 13:01:52.100034 2560 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Jul 9 13:01:52.101764 kubelet[2560]: I0709 13:01:52.101388 2560 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jul 9 13:01:52.101764 kubelet[2560]: I0709 13:01:52.101562 2560 reconciler.go:26] "Reconciler: start to sync state" Jul 9 13:01:52.103259 kubelet[2560]: I0709 13:01:52.103244 2560 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jul 9 13:01:52.108190 kubelet[2560]: E0709 13:01:52.104935 2560 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.105:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.105:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.185096d89a51a619 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-07-09 13:01:52.093267481 +0000 UTC m=+0.327560176,LastTimestamp:2025-07-09 13:01:52.093267481 +0000 UTC m=+0.327560176,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Jul 9 13:01:52.108744 kubelet[2560]: I0709 13:01:52.108619 2560 factory.go:221] Registration of the systemd container factory successfully Jul 9 13:01:52.108744 kubelet[2560]: I0709 13:01:52.108668 2560 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jul 9 13:01:52.110213 kubelet[2560]: I0709 13:01:52.110188 2560 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jul 9 13:01:52.111068 kubelet[2560]: E0709 13:01:52.110785 2560 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.105:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.105:6443: connect: connection refused" interval="200ms" Jul 9 13:01:52.111253 kubelet[2560]: I0709 13:01:52.111242 2560 factory.go:221] Registration of the containerd container factory successfully Jul 9 13:01:52.112318 kubelet[2560]: I0709 13:01:52.112304 2560 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jul 9 13:01:52.112318 kubelet[2560]: I0709 13:01:52.112315 2560 status_manager.go:227] "Starting to sync pod status with apiserver" Jul 9 13:01:52.112369 kubelet[2560]: I0709 13:01:52.112326 2560 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
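The kubelet with PID 2560 now starts in earnest, but every call to the API server at 139.178.70.105:6443 fails with "connection refused": the kube-apiserver static pod has not been created yet, so the certificate signing request, the event post, and the reflector list/watch calls above all bounce. A trivial probe like the one below (endpoint taken from the log, TLS verification skipped because this is only an illustration) would show the same behaviour until the apiserver container started later in the log is up, after which /readyz returns an HTTP status instead of a dial error.

package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

func main() {
	// Endpoint copied from the log entries above; everything else here is
	// an illustrative probe, not part of the recorded bootstrap.
	client := &http.Client{
		Timeout: 2 * time.Second,
		Transport: &http.Transport{
			// Skip verification only for this sketch; the apiserver serves a
			// self-signed cert during bring-up.
			TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
		},
	}
	resp, err := client.Get("https://139.178.70.105:6443/readyz")
	if err != nil {
		fmt.Println("apiserver not reachable yet:", err)
		return
	}
	defer resp.Body.Close()
	fmt.Println("apiserver status:", resp.Status)
}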
Jul 9 13:01:52.112369 kubelet[2560]: I0709 13:01:52.112330 2560 kubelet.go:2382] "Starting kubelet main sync loop" Jul 9 13:01:52.112369 kubelet[2560]: E0709 13:01:52.112351 2560 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jul 9 13:01:52.114689 kubelet[2560]: W0709 13:01:52.114654 2560 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.105:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.105:6443: connect: connection refused Jul 9 13:01:52.114727 kubelet[2560]: E0709 13:01:52.114697 2560 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.105:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.105:6443: connect: connection refused" logger="UnhandledError" Jul 9 13:01:52.118661 kubelet[2560]: W0709 13:01:52.118643 2560 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.105:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.105:6443: connect: connection refused Jul 9 13:01:52.118765 kubelet[2560]: E0709 13:01:52.118756 2560 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.105:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.105:6443: connect: connection refused" logger="UnhandledError" Jul 9 13:01:52.136164 kubelet[2560]: E0709 13:01:52.136148 2560 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jul 9 13:01:52.138376 kubelet[2560]: I0709 13:01:52.138328 2560 cpu_manager.go:221] "Starting CPU manager" policy="none" Jul 9 13:01:52.138407 kubelet[2560]: I0709 13:01:52.138379 2560 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jul 9 13:01:52.138407 kubelet[2560]: I0709 13:01:52.138388 2560 state_mem.go:36] "Initialized new in-memory state store" Jul 9 13:01:52.139389 kubelet[2560]: I0709 13:01:52.139378 2560 policy_none.go:49] "None policy: Start" Jul 9 13:01:52.139389 kubelet[2560]: I0709 13:01:52.139388 2560 memory_manager.go:186] "Starting memorymanager" policy="None" Jul 9 13:01:52.139431 kubelet[2560]: I0709 13:01:52.139394 2560 state_mem.go:35] "Initializing new in-memory state store" Jul 9 13:01:52.142739 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jul 9 13:01:52.151883 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jul 9 13:01:52.154119 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
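The three "Created slice" entries show the kubelet laying out its cgroup hierarchy with the systemd driver: kubepods.slice as the parent plus per-QoS children for Burstable and BestEffort pods. The per-pod slices that appear next in the log follow a simple naming rule, reproduced below as a hedged sketch: Guaranteed pods sit directly under kubepods.slice, and dashes in the pod UID are escaped to underscores for the systemd unit name.

package main

import (
	"fmt"
	"strings"
)

// qosSlice mirrors the slice names visible in this log: a per-QoS parent for
// Burstable and BestEffort pods, and the pod UID with "-" escaped to "_".
func qosSlice(qos, podUID string) string {
	escaped := strings.ReplaceAll(podUID, "-", "_")
	switch qos {
	case "Burstable":
		return fmt.Sprintf("kubepods-burstable-pod%s.slice", escaped)
	case "BestEffort":
		return fmt.Sprintf("kubepods-besteffort-pod%s.slice", escaped)
	default: // Guaranteed pods are parented directly under kubepods.slice
		return fmt.Sprintf("kubepods-pod%s.slice", escaped)
	}
}

func main() {
	// Matches "Created slice kubepods-besteffort-podbc5e7fc5_bedc_4121_b42b_e153eaf970cb.slice"
	// for the kube-proxy-5ssh4 pod later in the log.
	fmt.Println(qosSlice("BestEffort", "bc5e7fc5-bedc-4121-b42b-e153eaf970cb"))
}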
Jul 9 13:01:52.164097 kubelet[2560]: I0709 13:01:52.164080 2560 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jul 9 13:01:52.164193 kubelet[2560]: I0709 13:01:52.164183 2560 eviction_manager.go:189] "Eviction manager: starting control loop" Jul 9 13:01:52.164221 kubelet[2560]: I0709 13:01:52.164192 2560 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jul 9 13:01:52.164880 kubelet[2560]: I0709 13:01:52.164821 2560 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jul 9 13:01:52.165887 kubelet[2560]: E0709 13:01:52.165875 2560 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jul 9 13:01:52.165927 kubelet[2560]: E0709 13:01:52.165896 2560 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Jul 9 13:01:52.229376 systemd[1]: Created slice kubepods-burstable-podfbdd382cab87ee22500e41a42bbfbb23.slice - libcontainer container kubepods-burstable-podfbdd382cab87ee22500e41a42bbfbb23.slice. Jul 9 13:01:52.239668 kubelet[2560]: E0709 13:01:52.239322 2560 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Jul 9 13:01:52.242151 systemd[1]: Created slice kubepods-burstable-podd1af03769b64da1b1e8089a7035018fc.slice - libcontainer container kubepods-burstable-podd1af03769b64da1b1e8089a7035018fc.slice. Jul 9 13:01:52.243415 kubelet[2560]: E0709 13:01:52.243274 2560 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Jul 9 13:01:52.245142 systemd[1]: Created slice kubepods-burstable-pod8a75e163f27396b2168da0f88f85f8a5.slice - libcontainer container kubepods-burstable-pod8a75e163f27396b2168da0f88f85f8a5.slice. 
Jul 9 13:01:52.246586 kubelet[2560]: E0709 13:01:52.246467 2560 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Jul 9 13:01:52.265530 kubelet[2560]: I0709 13:01:52.265510 2560 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Jul 9 13:01:52.265866 kubelet[2560]: E0709 13:01:52.265846 2560 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.105:6443/api/v1/nodes\": dial tcp 139.178.70.105:6443: connect: connection refused" node="localhost" Jul 9 13:01:52.303464 kubelet[2560]: I0709 13:01:52.303288 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d1af03769b64da1b1e8089a7035018fc-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"d1af03769b64da1b1e8089a7035018fc\") " pod="kube-system/kube-controller-manager-localhost" Jul 9 13:01:52.303464 kubelet[2560]: I0709 13:01:52.303313 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fbdd382cab87ee22500e41a42bbfbb23-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"fbdd382cab87ee22500e41a42bbfbb23\") " pod="kube-system/kube-apiserver-localhost" Jul 9 13:01:52.303464 kubelet[2560]: I0709 13:01:52.303324 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fbdd382cab87ee22500e41a42bbfbb23-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"fbdd382cab87ee22500e41a42bbfbb23\") " pod="kube-system/kube-apiserver-localhost" Jul 9 13:01:52.303464 kubelet[2560]: I0709 13:01:52.303332 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fbdd382cab87ee22500e41a42bbfbb23-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"fbdd382cab87ee22500e41a42bbfbb23\") " pod="kube-system/kube-apiserver-localhost" Jul 9 13:01:52.303464 kubelet[2560]: I0709 13:01:52.303344 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d1af03769b64da1b1e8089a7035018fc-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"d1af03769b64da1b1e8089a7035018fc\") " pod="kube-system/kube-controller-manager-localhost" Jul 9 13:01:52.303617 kubelet[2560]: I0709 13:01:52.303353 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8a75e163f27396b2168da0f88f85f8a5-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"8a75e163f27396b2168da0f88f85f8a5\") " pod="kube-system/kube-scheduler-localhost" Jul 9 13:01:52.303617 kubelet[2560]: I0709 13:01:52.303364 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d1af03769b64da1b1e8089a7035018fc-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"d1af03769b64da1b1e8089a7035018fc\") " pod="kube-system/kube-controller-manager-localhost" Jul 9 13:01:52.303617 kubelet[2560]: I0709 13:01:52.303371 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/d1af03769b64da1b1e8089a7035018fc-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"d1af03769b64da1b1e8089a7035018fc\") " pod="kube-system/kube-controller-manager-localhost" Jul 9 13:01:52.303617 kubelet[2560]: I0709 13:01:52.303380 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d1af03769b64da1b1e8089a7035018fc-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"d1af03769b64da1b1e8089a7035018fc\") " pod="kube-system/kube-controller-manager-localhost" Jul 9 13:01:52.311588 kubelet[2560]: E0709 13:01:52.311562 2560 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.105:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.105:6443: connect: connection refused" interval="400ms" Jul 9 13:01:52.467374 kubelet[2560]: I0709 13:01:52.467330 2560 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Jul 9 13:01:52.467635 kubelet[2560]: E0709 13:01:52.467622 2560 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.105:6443/api/v1/nodes\": dial tcp 139.178.70.105:6443: connect: connection refused" node="localhost" Jul 9 13:01:52.541583 containerd[1624]: time="2025-07-09T13:01:52.541542093Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:fbdd382cab87ee22500e41a42bbfbb23,Namespace:kube-system,Attempt:0,}" Jul 9 13:01:52.554046 containerd[1624]: time="2025-07-09T13:01:52.553908732Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:d1af03769b64da1b1e8089a7035018fc,Namespace:kube-system,Attempt:0,}" Jul 9 13:01:52.557174 containerd[1624]: time="2025-07-09T13:01:52.557065356Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:8a75e163f27396b2168da0f88f85f8a5,Namespace:kube-system,Attempt:0,}" Jul 9 13:01:52.712326 kubelet[2560]: E0709 13:01:52.712296 2560 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.105:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.105:6443: connect: connection refused" interval="800ms" Jul 9 13:01:52.827989 containerd[1624]: time="2025-07-09T13:01:52.827779836Z" level=info msg="connecting to shim 6a42187955b2e2d8993071ac48ad40e376d9a93dc437e3419b69e6a803c7ae72" address="unix:///run/containerd/s/dacf97ffc3b0499934f05f74eba191653789b897c2c6898ce952e0d82420fa0a" namespace=k8s.io protocol=ttrpc version=3 Jul 9 13:01:52.828290 containerd[1624]: time="2025-07-09T13:01:52.828272433Z" level=info msg="connecting to shim d8ceafc4a084939463a909b2a79e67b927c93862216225656713c58b560775dc" address="unix:///run/containerd/s/fac82dd94405d77c217097d52b2192aa9dafe99798ead9a427f8de66f8939495" namespace=k8s.io protocol=ttrpc version=3 Jul 9 13:01:52.836512 containerd[1624]: time="2025-07-09T13:01:52.836486319Z" level=info msg="connecting to shim 2e0a111daaa0123ba352af5b221eedd7f1273daadb77433dabbe1e2c2ebf44f7" address="unix:///run/containerd/s/1d6d22890f93772ebcf62a442bf927d6d2bfd570dd583bc2d1e941fe49f8d197" namespace=k8s.io protocol=ttrpc version=3 Jul 9 13:01:52.870390 kubelet[2560]: I0709 13:01:52.869768 2560 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Jul 9 13:01:52.870390 
kubelet[2560]: E0709 13:01:52.869977 2560 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.105:6443/api/v1/nodes\": dial tcp 139.178.70.105:6443: connect: connection refused" node="localhost" Jul 9 13:01:52.934787 systemd[1]: Started cri-containerd-2e0a111daaa0123ba352af5b221eedd7f1273daadb77433dabbe1e2c2ebf44f7.scope - libcontainer container 2e0a111daaa0123ba352af5b221eedd7f1273daadb77433dabbe1e2c2ebf44f7. Jul 9 13:01:52.935917 systemd[1]: Started cri-containerd-6a42187955b2e2d8993071ac48ad40e376d9a93dc437e3419b69e6a803c7ae72.scope - libcontainer container 6a42187955b2e2d8993071ac48ad40e376d9a93dc437e3419b69e6a803c7ae72. Jul 9 13:01:52.938303 systemd[1]: Started cri-containerd-d8ceafc4a084939463a909b2a79e67b927c93862216225656713c58b560775dc.scope - libcontainer container d8ceafc4a084939463a909b2a79e67b927c93862216225656713c58b560775dc. Jul 9 13:01:52.966805 kubelet[2560]: W0709 13:01:52.966770 2560 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.105:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.105:6443: connect: connection refused Jul 9 13:01:52.966908 kubelet[2560]: E0709 13:01:52.966810 2560 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.105:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.105:6443: connect: connection refused" logger="UnhandledError" Jul 9 13:01:52.997348 containerd[1624]: time="2025-07-09T13:01:52.997248129Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:fbdd382cab87ee22500e41a42bbfbb23,Namespace:kube-system,Attempt:0,} returns sandbox id \"2e0a111daaa0123ba352af5b221eedd7f1273daadb77433dabbe1e2c2ebf44f7\"" Jul 9 13:01:52.997348 containerd[1624]: time="2025-07-09T13:01:52.997316430Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:d1af03769b64da1b1e8089a7035018fc,Namespace:kube-system,Attempt:0,} returns sandbox id \"d8ceafc4a084939463a909b2a79e67b927c93862216225656713c58b560775dc\"" Jul 9 13:01:53.001981 containerd[1624]: time="2025-07-09T13:01:53.001710433Z" level=info msg="CreateContainer within sandbox \"d8ceafc4a084939463a909b2a79e67b927c93862216225656713c58b560775dc\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jul 9 13:01:53.002077 containerd[1624]: time="2025-07-09T13:01:53.002066509Z" level=info msg="CreateContainer within sandbox \"2e0a111daaa0123ba352af5b221eedd7f1273daadb77433dabbe1e2c2ebf44f7\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jul 9 13:01:53.006681 containerd[1624]: time="2025-07-09T13:01:53.006650731Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:8a75e163f27396b2168da0f88f85f8a5,Namespace:kube-system,Attempt:0,} returns sandbox id \"6a42187955b2e2d8993071ac48ad40e376d9a93dc437e3419b69e6a803c7ae72\"" Jul 9 13:01:53.008130 containerd[1624]: time="2025-07-09T13:01:53.008119011Z" level=info msg="CreateContainer within sandbox \"6a42187955b2e2d8993071ac48ad40e376d9a93dc437e3419b69e6a803c7ae72\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jul 9 13:01:53.010817 containerd[1624]: time="2025-07-09T13:01:53.010795286Z" level=info msg="Container 
16d2a436de1499c9278e1861ec419555ddf7bcb5c38e5f23c2e07bfaa773c75a: CDI devices from CRI Config.CDIDevices: []" Jul 9 13:01:53.012373 containerd[1624]: time="2025-07-09T13:01:53.012314186Z" level=info msg="Container 3491005405c0f42871de13596bd9e16aaed478f608b02449cc93dede1a6b0a4d: CDI devices from CRI Config.CDIDevices: []" Jul 9 13:01:53.013475 containerd[1624]: time="2025-07-09T13:01:53.013464235Z" level=info msg="Container 1872ba06e684a6c19bb5f4572349f0b8be45c2b0925ed29a1883e1fd4102b622: CDI devices from CRI Config.CDIDevices: []" Jul 9 13:01:53.019348 containerd[1624]: time="2025-07-09T13:01:53.019336161Z" level=info msg="CreateContainer within sandbox \"6a42187955b2e2d8993071ac48ad40e376d9a93dc437e3419b69e6a803c7ae72\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"1872ba06e684a6c19bb5f4572349f0b8be45c2b0925ed29a1883e1fd4102b622\"" Jul 9 13:01:53.020361 containerd[1624]: time="2025-07-09T13:01:53.020339251Z" level=info msg="StartContainer for \"1872ba06e684a6c19bb5f4572349f0b8be45c2b0925ed29a1883e1fd4102b622\"" Jul 9 13:01:53.020828 containerd[1624]: time="2025-07-09T13:01:53.020816652Z" level=info msg="CreateContainer within sandbox \"d8ceafc4a084939463a909b2a79e67b927c93862216225656713c58b560775dc\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"3491005405c0f42871de13596bd9e16aaed478f608b02449cc93dede1a6b0a4d\"" Jul 9 13:01:53.021022 containerd[1624]: time="2025-07-09T13:01:53.021006786Z" level=info msg="connecting to shim 1872ba06e684a6c19bb5f4572349f0b8be45c2b0925ed29a1883e1fd4102b622" address="unix:///run/containerd/s/dacf97ffc3b0499934f05f74eba191653789b897c2c6898ce952e0d82420fa0a" protocol=ttrpc version=3 Jul 9 13:01:53.021430 containerd[1624]: time="2025-07-09T13:01:53.021416599Z" level=info msg="StartContainer for \"3491005405c0f42871de13596bd9e16aaed478f608b02449cc93dede1a6b0a4d\"" Jul 9 13:01:53.021913 containerd[1624]: time="2025-07-09T13:01:53.021901043Z" level=info msg="CreateContainer within sandbox \"2e0a111daaa0123ba352af5b221eedd7f1273daadb77433dabbe1e2c2ebf44f7\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"16d2a436de1499c9278e1861ec419555ddf7bcb5c38e5f23c2e07bfaa773c75a\"" Jul 9 13:01:53.021993 containerd[1624]: time="2025-07-09T13:01:53.021950542Z" level=info msg="connecting to shim 3491005405c0f42871de13596bd9e16aaed478f608b02449cc93dede1a6b0a4d" address="unix:///run/containerd/s/fac82dd94405d77c217097d52b2192aa9dafe99798ead9a427f8de66f8939495" protocol=ttrpc version=3 Jul 9 13:01:53.022330 containerd[1624]: time="2025-07-09T13:01:53.022274373Z" level=info msg="StartContainer for \"16d2a436de1499c9278e1861ec419555ddf7bcb5c38e5f23c2e07bfaa773c75a\"" Jul 9 13:01:53.023373 containerd[1624]: time="2025-07-09T13:01:53.023347549Z" level=info msg="connecting to shim 16d2a436de1499c9278e1861ec419555ddf7bcb5c38e5f23c2e07bfaa773c75a" address="unix:///run/containerd/s/1d6d22890f93772ebcf62a442bf927d6d2bfd570dd583bc2d1e941fe49f8d197" protocol=ttrpc version=3 Jul 9 13:01:53.039775 systemd[1]: Started cri-containerd-3491005405c0f42871de13596bd9e16aaed478f608b02449cc93dede1a6b0a4d.scope - libcontainer container 3491005405c0f42871de13596bd9e16aaed478f608b02449cc93dede1a6b0a4d. Jul 9 13:01:53.044234 systemd[1]: Started cri-containerd-16d2a436de1499c9278e1861ec419555ddf7bcb5c38e5f23c2e07bfaa773c75a.scope - libcontainer container 16d2a436de1499c9278e1861ec419555ddf7bcb5c38e5f23c2e07bfaa773c75a. 
Jul 9 13:01:53.045768 systemd[1]: Started cri-containerd-1872ba06e684a6c19bb5f4572349f0b8be45c2b0925ed29a1883e1fd4102b622.scope - libcontainer container 1872ba06e684a6c19bb5f4572349f0b8be45c2b0925ed29a1883e1fd4102b622. Jul 9 13:01:53.107559 containerd[1624]: time="2025-07-09T13:01:53.107228085Z" level=info msg="StartContainer for \"1872ba06e684a6c19bb5f4572349f0b8be45c2b0925ed29a1883e1fd4102b622\" returns successfully" Jul 9 13:01:53.108808 containerd[1624]: time="2025-07-09T13:01:53.108509041Z" level=info msg="StartContainer for \"16d2a436de1499c9278e1861ec419555ddf7bcb5c38e5f23c2e07bfaa773c75a\" returns successfully" Jul 9 13:01:53.120818 containerd[1624]: time="2025-07-09T13:01:53.120789316Z" level=info msg="StartContainer for \"3491005405c0f42871de13596bd9e16aaed478f608b02449cc93dede1a6b0a4d\" returns successfully" Jul 9 13:01:53.143458 kubelet[2560]: E0709 13:01:53.143230 2560 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Jul 9 13:01:53.143727 kubelet[2560]: E0709 13:01:53.143717 2560 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Jul 9 13:01:53.146124 kubelet[2560]: E0709 13:01:53.146112 2560 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Jul 9 13:01:53.219060 kubelet[2560]: W0709 13:01:53.219011 2560 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.105:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.105:6443: connect: connection refused Jul 9 13:01:53.219060 kubelet[2560]: E0709 13:01:53.219059 2560 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.105:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.105:6443: connect: connection refused" logger="UnhandledError" Jul 9 13:01:53.474344 kubelet[2560]: W0709 13:01:53.474307 2560 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.105:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.105:6443: connect: connection refused Jul 9 13:01:53.474446 kubelet[2560]: E0709 13:01:53.474350 2560 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.105:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.105:6443: connect: connection refused" logger="UnhandledError" Jul 9 13:01:53.513119 kubelet[2560]: E0709 13:01:53.513094 2560 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.105:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.105:6443: connect: connection refused" interval="1.6s" Jul 9 13:01:53.670924 kubelet[2560]: I0709 13:01:53.670907 2560 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Jul 9 13:01:54.148867 kubelet[2560]: E0709 13:01:54.148845 2560 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" 
node="localhost" Jul 9 13:01:54.149354 kubelet[2560]: E0709 13:01:54.149342 2560 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Jul 9 13:01:54.408005 kubelet[2560]: I0709 13:01:54.407529 2560 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Jul 9 13:01:54.500360 kubelet[2560]: I0709 13:01:54.500341 2560 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Jul 9 13:01:54.504455 kubelet[2560]: E0709 13:01:54.504442 2560 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Jul 9 13:01:54.504554 kubelet[2560]: I0709 13:01:54.504538 2560 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Jul 9 13:01:54.505557 kubelet[2560]: E0709 13:01:54.505543 2560 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost" Jul 9 13:01:54.505557 kubelet[2560]: I0709 13:01:54.505556 2560 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Jul 9 13:01:54.506442 kubelet[2560]: E0709 13:01:54.506415 2560 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost" Jul 9 13:01:55.087925 kubelet[2560]: I0709 13:01:55.087777 2560 apiserver.go:52] "Watching apiserver" Jul 9 13:01:55.102380 kubelet[2560]: I0709 13:01:55.102357 2560 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jul 9 13:01:55.147608 kubelet[2560]: I0709 13:01:55.147585 2560 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Jul 9 13:01:55.149368 kubelet[2560]: E0709 13:01:55.149356 2560 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Jul 9 13:01:55.622001 kubelet[2560]: I0709 13:01:55.621878 2560 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Jul 9 13:01:56.288589 systemd[1]: Reload requested from client PID 2831 ('systemctl') (unit session-9.scope)... Jul 9 13:01:56.288602 systemd[1]: Reloading... Jul 9 13:01:56.350684 zram_generator::config[2884]: No configuration found. Jul 9 13:01:56.423880 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jul 9 13:01:56.432473 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Jul 9 13:01:56.509609 systemd[1]: Reloading finished in 220 ms. Jul 9 13:01:56.538127 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jul 9 13:01:56.554880 systemd[1]: kubelet.service: Deactivated successfully. 
Jul 9 13:01:56.555029 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jul 9 13:01:56.555066 systemd[1]: kubelet.service: Consumed 504ms CPU time, 128.8M memory peak. Jul 9 13:01:56.556510 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 9 13:01:56.716316 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 9 13:01:56.724959 (kubelet)[2942]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jul 9 13:01:56.770481 kubelet[2942]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jul 9 13:01:56.770481 kubelet[2942]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jul 9 13:01:56.770481 kubelet[2942]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jul 9 13:01:56.770481 kubelet[2942]: I0709 13:01:56.769013 2942 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jul 9 13:01:56.781367 kubelet[2942]: I0709 13:01:56.781347 2942 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Jul 9 13:01:56.781454 kubelet[2942]: I0709 13:01:56.781448 2942 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jul 9 13:01:56.782059 kubelet[2942]: I0709 13:01:56.782051 2942 server.go:954] "Client rotation is on, will bootstrap in background" Jul 9 13:01:56.790618 kubelet[2942]: I0709 13:01:56.790588 2942 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Jul 9 13:01:56.875193 kubelet[2942]: I0709 13:01:56.875120 2942 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jul 9 13:01:56.877975 kubelet[2942]: I0709 13:01:56.877965 2942 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jul 9 13:01:56.880545 kubelet[2942]: I0709 13:01:56.880536 2942 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jul 9 13:01:56.880759 kubelet[2942]: I0709 13:01:56.880745 2942 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jul 9 13:01:56.880915 kubelet[2942]: I0709 13:01:56.880802 2942 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jul 9 13:01:56.881013 kubelet[2942]: I0709 13:01:56.881005 2942 topology_manager.go:138] "Creating topology manager with none policy" Jul 9 13:01:56.881053 kubelet[2942]: I0709 13:01:56.881048 2942 container_manager_linux.go:304] "Creating device plugin manager" Jul 9 13:01:56.881115 kubelet[2942]: I0709 13:01:56.881109 2942 state_mem.go:36] "Initialized new in-memory state store" Jul 9 13:01:56.881270 kubelet[2942]: I0709 13:01:56.881264 2942 kubelet.go:446] "Attempting to sync node with API server" Jul 9 13:01:56.881642 kubelet[2942]: I0709 13:01:56.881633 2942 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Jul 9 13:01:56.881795 kubelet[2942]: I0709 13:01:56.881743 2942 kubelet.go:352] "Adding apiserver pod source" Jul 9 13:01:56.881845 kubelet[2942]: I0709 13:01:56.881835 2942 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jul 9 13:01:56.890369 kubelet[2942]: I0709 13:01:56.890351 2942 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Jul 9 13:01:56.891070 kubelet[2942]: I0709 13:01:56.890630 2942 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jul 9 13:01:56.891722 kubelet[2942]: I0709 13:01:56.891711 2942 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jul 9 13:01:56.891767 kubelet[2942]: I0709 13:01:56.891730 2942 server.go:1287] "Started kubelet" Jul 9 13:01:56.892947 kubelet[2942]: I0709 13:01:56.892935 2942 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jul 9 13:01:56.897792 kubelet[2942]: I0709 13:01:56.897774 2942 volume_manager.go:297] "Starting Kubelet Volume 
Manager" Jul 9 13:01:56.898051 kubelet[2942]: I0709 13:01:56.898035 2942 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Jul 9 13:01:56.899088 kubelet[2942]: I0709 13:01:56.899078 2942 server.go:479] "Adding debug handlers to kubelet server" Jul 9 13:01:56.899200 kubelet[2942]: I0709 13:01:56.899188 2942 reconciler.go:26] "Reconciler: start to sync state" Jul 9 13:01:56.899643 kubelet[2942]: I0709 13:01:56.899124 2942 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jul 9 13:01:56.899741 kubelet[2942]: I0709 13:01:56.899716 2942 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jul 9 13:01:56.899880 kubelet[2942]: I0709 13:01:56.899872 2942 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jul 9 13:01:56.900020 kubelet[2942]: I0709 13:01:56.900012 2942 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jul 9 13:01:56.900595 kubelet[2942]: I0709 13:01:56.900208 2942 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jul 9 13:01:56.900828 kubelet[2942]: I0709 13:01:56.900816 2942 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jul 9 13:01:56.900861 kubelet[2942]: I0709 13:01:56.900832 2942 status_manager.go:227] "Starting to sync pod status with apiserver" Jul 9 13:01:56.900861 kubelet[2942]: I0709 13:01:56.900842 2942 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jul 9 13:01:56.900861 kubelet[2942]: I0709 13:01:56.900846 2942 kubelet.go:2382] "Starting kubelet main sync loop" Jul 9 13:01:56.900926 kubelet[2942]: E0709 13:01:56.900867 2942 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jul 9 13:01:56.904944 kubelet[2942]: E0709 13:01:56.904928 2942 kubelet.go:1555] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jul 9 13:01:56.905381 kubelet[2942]: I0709 13:01:56.905368 2942 factory.go:221] Registration of the containerd container factory successfully Jul 9 13:01:56.905381 kubelet[2942]: I0709 13:01:56.905377 2942 factory.go:221] Registration of the systemd container factory successfully Jul 9 13:01:56.905446 kubelet[2942]: I0709 13:01:56.905421 2942 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jul 9 13:01:56.945658 kubelet[2942]: I0709 13:01:56.945645 2942 cpu_manager.go:221] "Starting CPU manager" policy="none" Jul 9 13:01:56.945751 kubelet[2942]: I0709 13:01:56.945744 2942 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jul 9 13:01:56.945832 kubelet[2942]: I0709 13:01:56.945827 2942 state_mem.go:36] "Initialized new in-memory state store" Jul 9 13:01:56.946091 kubelet[2942]: I0709 13:01:56.945974 2942 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jul 9 13:01:56.946091 kubelet[2942]: I0709 13:01:56.945982 2942 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jul 9 13:01:56.946091 kubelet[2942]: I0709 13:01:56.945993 2942 policy_none.go:49] "None policy: Start" Jul 9 13:01:56.946091 kubelet[2942]: I0709 13:01:56.945997 2942 memory_manager.go:186] "Starting memorymanager" policy="None" Jul 9 13:01:56.946091 kubelet[2942]: I0709 13:01:56.946003 2942 state_mem.go:35] "Initializing new in-memory state store" Jul 9 13:01:56.946091 kubelet[2942]: I0709 13:01:56.946059 2942 state_mem.go:75] "Updated machine memory state" Jul 9 13:01:56.948507 kubelet[2942]: I0709 13:01:56.948491 2942 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jul 9 13:01:56.949226 kubelet[2942]: I0709 13:01:56.949219 2942 eviction_manager.go:189] "Eviction manager: starting control loop" Jul 9 13:01:56.949459 kubelet[2942]: I0709 13:01:56.949440 2942 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jul 9 13:01:56.949698 kubelet[2942]: I0709 13:01:56.949586 2942 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jul 9 13:01:56.951684 kubelet[2942]: E0709 13:01:56.950596 2942 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Jul 9 13:01:57.002145 kubelet[2942]: I0709 13:01:57.002118 2942 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Jul 9 13:01:57.004016 kubelet[2942]: I0709 13:01:57.003800 2942 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Jul 9 13:01:57.004354 kubelet[2942]: I0709 13:01:57.004339 2942 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Jul 9 13:01:57.005877 kubelet[2942]: E0709 13:01:57.005851 2942 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost" Jul 9 13:01:57.053985 kubelet[2942]: I0709 13:01:57.053964 2942 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Jul 9 13:01:57.057186 kubelet[2942]: I0709 13:01:57.057165 2942 kubelet_node_status.go:124] "Node was previously registered" node="localhost" Jul 9 13:01:57.057267 kubelet[2942]: I0709 13:01:57.057221 2942 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Jul 9 13:01:57.200467 kubelet[2942]: I0709 13:01:57.200391 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8a75e163f27396b2168da0f88f85f8a5-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"8a75e163f27396b2168da0f88f85f8a5\") " pod="kube-system/kube-scheduler-localhost" Jul 9 13:01:57.200467 kubelet[2942]: I0709 13:01:57.200420 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fbdd382cab87ee22500e41a42bbfbb23-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"fbdd382cab87ee22500e41a42bbfbb23\") " pod="kube-system/kube-apiserver-localhost" Jul 9 13:01:57.200467 kubelet[2942]: I0709 13:01:57.200432 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/d1af03769b64da1b1e8089a7035018fc-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"d1af03769b64da1b1e8089a7035018fc\") " pod="kube-system/kube-controller-manager-localhost" Jul 9 13:01:57.200467 kubelet[2942]: I0709 13:01:57.200444 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d1af03769b64da1b1e8089a7035018fc-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"d1af03769b64da1b1e8089a7035018fc\") " pod="kube-system/kube-controller-manager-localhost" Jul 9 13:01:57.200467 kubelet[2942]: I0709 13:01:57.200456 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d1af03769b64da1b1e8089a7035018fc-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"d1af03769b64da1b1e8089a7035018fc\") " pod="kube-system/kube-controller-manager-localhost" Jul 9 13:01:57.200615 kubelet[2942]: I0709 13:01:57.200471 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fbdd382cab87ee22500e41a42bbfbb23-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"fbdd382cab87ee22500e41a42bbfbb23\") " 
pod="kube-system/kube-apiserver-localhost" Jul 9 13:01:57.200615 kubelet[2942]: I0709 13:01:57.200480 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fbdd382cab87ee22500e41a42bbfbb23-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"fbdd382cab87ee22500e41a42bbfbb23\") " pod="kube-system/kube-apiserver-localhost" Jul 9 13:01:57.200615 kubelet[2942]: I0709 13:01:57.200498 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d1af03769b64da1b1e8089a7035018fc-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"d1af03769b64da1b1e8089a7035018fc\") " pod="kube-system/kube-controller-manager-localhost" Jul 9 13:01:57.200615 kubelet[2942]: I0709 13:01:57.200510 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d1af03769b64da1b1e8089a7035018fc-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"d1af03769b64da1b1e8089a7035018fc\") " pod="kube-system/kube-controller-manager-localhost" Jul 9 13:01:57.883330 kubelet[2942]: I0709 13:01:57.883162 2942 apiserver.go:52] "Watching apiserver" Jul 9 13:01:57.900215 kubelet[2942]: I0709 13:01:57.900175 2942 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jul 9 13:01:57.932442 kubelet[2942]: I0709 13:01:57.932418 2942 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Jul 9 13:01:57.935191 kubelet[2942]: I0709 13:01:57.933215 2942 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Jul 9 13:01:57.952499 kubelet[2942]: E0709 13:01:57.952455 2942 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost" Jul 9 13:01:57.959162 kubelet[2942]: I0709 13:01:57.958887 2942 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=0.958866684 podStartE2EDuration="958.866684ms" podCreationTimestamp="2025-07-09 13:01:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-09 13:01:57.958765678 +0000 UTC m=+1.222531195" watchObservedRunningTime="2025-07-09 13:01:57.958866684 +0000 UTC m=+1.222632197" Jul 9 13:01:57.959162 kubelet[2942]: E0709 13:01:57.958993 2942 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Jul 9 13:01:57.981450 kubelet[2942]: I0709 13:01:57.981390 2942 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=2.981373311 podStartE2EDuration="2.981373311s" podCreationTimestamp="2025-07-09 13:01:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-09 13:01:57.973736673 +0000 UTC m=+1.237502192" watchObservedRunningTime="2025-07-09 13:01:57.981373311 +0000 UTC m=+1.245138821" Jul 9 13:01:57.981655 kubelet[2942]: I0709 13:01:57.981482 2942 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/kube-scheduler-localhost" podStartSLOduration=0.981476747 podStartE2EDuration="981.476747ms" podCreationTimestamp="2025-07-09 13:01:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-09 13:01:57.981082486 +0000 UTC m=+1.244848005" watchObservedRunningTime="2025-07-09 13:01:57.981476747 +0000 UTC m=+1.245242267" Jul 9 13:02:02.969155 kubelet[2942]: I0709 13:02:02.969133 2942 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jul 9 13:02:02.971368 containerd[1624]: time="2025-07-09T13:02:02.971046900Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jul 9 13:02:02.975441 kubelet[2942]: I0709 13:02:02.971829 2942 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jul 9 13:02:03.925967 systemd[1]: Created slice kubepods-besteffort-podbc5e7fc5_bedc_4121_b42b_e153eaf970cb.slice - libcontainer container kubepods-besteffort-podbc5e7fc5_bedc_4121_b42b_e153eaf970cb.slice. Jul 9 13:02:03.947555 kubelet[2942]: I0709 13:02:03.947527 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/bc5e7fc5-bedc-4121-b42b-e153eaf970cb-xtables-lock\") pod \"kube-proxy-5ssh4\" (UID: \"bc5e7fc5-bedc-4121-b42b-e153eaf970cb\") " pod="kube-system/kube-proxy-5ssh4" Jul 9 13:02:03.947555 kubelet[2942]: I0709 13:02:03.947558 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bc5e7fc5-bedc-4121-b42b-e153eaf970cb-lib-modules\") pod \"kube-proxy-5ssh4\" (UID: \"bc5e7fc5-bedc-4121-b42b-e153eaf970cb\") " pod="kube-system/kube-proxy-5ssh4" Jul 9 13:02:03.947700 kubelet[2942]: I0709 13:02:03.947574 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/bc5e7fc5-bedc-4121-b42b-e153eaf970cb-kube-proxy\") pod \"kube-proxy-5ssh4\" (UID: \"bc5e7fc5-bedc-4121-b42b-e153eaf970cb\") " pod="kube-system/kube-proxy-5ssh4" Jul 9 13:02:03.947700 kubelet[2942]: I0709 13:02:03.947589 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpzp4\" (UniqueName: \"kubernetes.io/projected/bc5e7fc5-bedc-4121-b42b-e153eaf970cb-kube-api-access-wpzp4\") pod \"kube-proxy-5ssh4\" (UID: \"bc5e7fc5-bedc-4121-b42b-e153eaf970cb\") " pod="kube-system/kube-proxy-5ssh4" Jul 9 13:02:04.145392 kubelet[2942]: I0709 13:02:04.145362 2942 status_manager.go:890] "Failed to get status for pod" podUID="18b36dc4-44d5-410a-a318-767bf7257943" pod="tigera-operator/tigera-operator-747864d56d-s7xkn" err="pods \"tigera-operator-747864d56d-s7xkn\" is forbidden: User \"system:node:localhost\" cannot get resource \"pods\" in API group \"\" in the namespace \"tigera-operator\": no relationship found between node 'localhost' and this object" Jul 9 13:02:04.145792 systemd[1]: Created slice kubepods-besteffort-pod18b36dc4_44d5_410a_a318_767bf7257943.slice - libcontainer container kubepods-besteffort-pod18b36dc4_44d5_410a_a318_767bf7257943.slice. 
Jul 9 13:02:04.148281 kubelet[2942]: I0709 13:02:04.148259 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/18b36dc4-44d5-410a-a318-767bf7257943-var-lib-calico\") pod \"tigera-operator-747864d56d-s7xkn\" (UID: \"18b36dc4-44d5-410a-a318-767bf7257943\") " pod="tigera-operator/tigera-operator-747864d56d-s7xkn" Jul 9 13:02:04.148281 kubelet[2942]: I0709 13:02:04.148280 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bck9t\" (UniqueName: \"kubernetes.io/projected/18b36dc4-44d5-410a-a318-767bf7257943-kube-api-access-bck9t\") pod \"tigera-operator-747864d56d-s7xkn\" (UID: \"18b36dc4-44d5-410a-a318-767bf7257943\") " pod="tigera-operator/tigera-operator-747864d56d-s7xkn" Jul 9 13:02:04.233370 containerd[1624]: time="2025-07-09T13:02:04.233317777Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-5ssh4,Uid:bc5e7fc5-bedc-4121-b42b-e153eaf970cb,Namespace:kube-system,Attempt:0,}" Jul 9 13:02:04.245298 containerd[1624]: time="2025-07-09T13:02:04.245267712Z" level=info msg="connecting to shim ecd78e9641fa9d713e0577c6c7f7c3af84b418a4327c04c6c2adf06e2344142d" address="unix:///run/containerd/s/e38dd803c219830757213821b55f0956c307801d75de34fdfffbf7981ae4a61b" namespace=k8s.io protocol=ttrpc version=3 Jul 9 13:02:04.272838 systemd[1]: Started cri-containerd-ecd78e9641fa9d713e0577c6c7f7c3af84b418a4327c04c6c2adf06e2344142d.scope - libcontainer container ecd78e9641fa9d713e0577c6c7f7c3af84b418a4327c04c6c2adf06e2344142d. Jul 9 13:02:04.291789 containerd[1624]: time="2025-07-09T13:02:04.291760733Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-5ssh4,Uid:bc5e7fc5-bedc-4121-b42b-e153eaf970cb,Namespace:kube-system,Attempt:0,} returns sandbox id \"ecd78e9641fa9d713e0577c6c7f7c3af84b418a4327c04c6c2adf06e2344142d\"" Jul 9 13:02:04.294819 containerd[1624]: time="2025-07-09T13:02:04.294791622Z" level=info msg="CreateContainer within sandbox \"ecd78e9641fa9d713e0577c6c7f7c3af84b418a4327c04c6c2adf06e2344142d\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jul 9 13:02:04.304340 containerd[1624]: time="2025-07-09T13:02:04.304308788Z" level=info msg="Container 0b4a6d4294a4492efc644b5b785e106800e00f48034d9ac2725191cb767e8085: CDI devices from CRI Config.CDIDevices: []" Jul 9 13:02:04.308355 containerd[1624]: time="2025-07-09T13:02:04.308323382Z" level=info msg="CreateContainer within sandbox \"ecd78e9641fa9d713e0577c6c7f7c3af84b418a4327c04c6c2adf06e2344142d\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"0b4a6d4294a4492efc644b5b785e106800e00f48034d9ac2725191cb767e8085\"" Jul 9 13:02:04.308903 containerd[1624]: time="2025-07-09T13:02:04.308794398Z" level=info msg="StartContainer for \"0b4a6d4294a4492efc644b5b785e106800e00f48034d9ac2725191cb767e8085\"" Jul 9 13:02:04.311423 containerd[1624]: time="2025-07-09T13:02:04.311389279Z" level=info msg="connecting to shim 0b4a6d4294a4492efc644b5b785e106800e00f48034d9ac2725191cb767e8085" address="unix:///run/containerd/s/e38dd803c219830757213821b55f0956c307801d75de34fdfffbf7981ae4a61b" protocol=ttrpc version=3 Jul 9 13:02:04.327828 systemd[1]: Started cri-containerd-0b4a6d4294a4492efc644b5b785e106800e00f48034d9ac2725191cb767e8085.scope - libcontainer container 0b4a6d4294a4492efc644b5b785e106800e00f48034d9ac2725191cb767e8085. 
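Both the failed CRI-O probe earlier ("dial unix /var/run/crio/crio.sock: connect: no such file or directory") and the successful "connecting to shim ... address=unix:///run/containerd/s/..." messages here are connections to local unix sockets. A purely illustrative Go sketch that reproduces that kind of dial result; it is not how the kubelet or containerd actually talk to these endpoints (they run gRPC/ttrpc on top of the socket):

```go
package main

import (
	"fmt"
	"net"
	"time"
)

// probe dials a unix socket and reports whether anything is listening.
// The paths below are the ones that appear in the log.
func probe(path string) {
	conn, err := net.DialTimeout("unix", path, time.Second)
	if err != nil {
		// A missing CRI-O socket yields the same "no such file or
		// directory" seen in the crio factory registration message.
		fmt.Printf("%s: %v\n", path, err)
		return
	}
	defer conn.Close()
	fmt.Printf("%s: listening\n", path)
}

func main() {
	probe("/var/run/crio/crio.sock")
	probe("/run/containerd/s/e38dd803c219830757213821b55f0956c307801d75de34fdfffbf7981ae4a61b")
}
```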
Jul 9 13:02:04.356990 containerd[1624]: time="2025-07-09T13:02:04.356953925Z" level=info msg="StartContainer for \"0b4a6d4294a4492efc644b5b785e106800e00f48034d9ac2725191cb767e8085\" returns successfully" Jul 9 13:02:04.448212 containerd[1624]: time="2025-07-09T13:02:04.448163908Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-747864d56d-s7xkn,Uid:18b36dc4-44d5-410a-a318-767bf7257943,Namespace:tigera-operator,Attempt:0,}" Jul 9 13:02:04.459003 containerd[1624]: time="2025-07-09T13:02:04.458938796Z" level=info msg="connecting to shim efb813c7e769503b455ecd48fde6e45f6d4c4a442953b29d0504bd303cb44075" address="unix:///run/containerd/s/f65c698b0925cb97e305eb418b279ebc770717eeeee389b64c7268c7e473fead" namespace=k8s.io protocol=ttrpc version=3 Jul 9 13:02:04.481763 systemd[1]: Started cri-containerd-efb813c7e769503b455ecd48fde6e45f6d4c4a442953b29d0504bd303cb44075.scope - libcontainer container efb813c7e769503b455ecd48fde6e45f6d4c4a442953b29d0504bd303cb44075. Jul 9 13:02:04.520653 containerd[1624]: time="2025-07-09T13:02:04.520588305Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-747864d56d-s7xkn,Uid:18b36dc4-44d5-410a-a318-767bf7257943,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"efb813c7e769503b455ecd48fde6e45f6d4c4a442953b29d0504bd303cb44075\"" Jul 9 13:02:04.521770 containerd[1624]: time="2025-07-09T13:02:04.521726195Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\"" Jul 9 13:02:04.951878 kubelet[2942]: I0709 13:02:04.951368 2942 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-5ssh4" podStartSLOduration=1.951264393 podStartE2EDuration="1.951264393s" podCreationTimestamp="2025-07-09 13:02:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-09 13:02:04.951175363 +0000 UTC m=+8.214940884" watchObservedRunningTime="2025-07-09 13:02:04.951264393 +0000 UTC m=+8.215029905" Jul 9 13:02:05.056859 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1700038389.mount: Deactivated successfully. Jul 9 13:02:06.059457 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount933150596.mount: Deactivated successfully. 
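The two var-lib-containerd-tmpmounts-...mount units deactivated just above are systemd-escaped mount paths: "/" is encoded as "-" and a literal dash as "\x2d". A rough Go sketch of the reverse mapping, assuming only those two escape rules (the full systemd-escape scheme covers more characters):

```go
package main

import (
	"fmt"
	"strings"
)

// unescapeMountUnit turns a systemd .mount unit name back into a path,
// handling only the two escapes visible in the log: "-" for "/" and
// "\x2d" for a literal dash. Simplified; systemd-escape(1) does more.
func unescapeMountUnit(unit string) string {
	name := strings.TrimSuffix(unit, ".mount")
	name = strings.ReplaceAll(name, `\x2d`, "\x00") // protect escaped dashes
	name = strings.ReplaceAll(name, "-", "/")
	name = strings.ReplaceAll(name, "\x00", "-")
	return "/" + name
}

func main() {
	fmt.Println(unescapeMountUnit(`var-lib-containerd-tmpmounts-containerd\x2dmount1700038389.mount`))
	// Output: /var/lib/containerd/tmpmounts/containerd-mount1700038389
}
```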
Jul 9 13:02:06.817217 containerd[1624]: time="2025-07-09T13:02:06.816830615Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:02:06.824304 containerd[1624]: time="2025-07-09T13:02:06.824291179Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.3: active requests=0, bytes read=25056543" Jul 9 13:02:06.842262 containerd[1624]: time="2025-07-09T13:02:06.842242221Z" level=info msg="ImageCreate event name:\"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:02:06.871370 containerd[1624]: time="2025-07-09T13:02:06.871276159Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:02:06.872013 containerd[1624]: time="2025-07-09T13:02:06.871852638Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.3\" with image id \"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\", repo tag \"quay.io/tigera/operator:v1.38.3\", repo digest \"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\", size \"25052538\" in 2.35010825s" Jul 9 13:02:06.872013 containerd[1624]: time="2025-07-09T13:02:06.871869491Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\" returns image reference \"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\"" Jul 9 13:02:06.873943 containerd[1624]: time="2025-07-09T13:02:06.873747306Z" level=info msg="CreateContainer within sandbox \"efb813c7e769503b455ecd48fde6e45f6d4c4a442953b29d0504bd303cb44075\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jul 9 13:02:07.061044 containerd[1624]: time="2025-07-09T13:02:07.060712500Z" level=info msg="Container 8f34ae67e06e9690d8f72db5732c9eb224f41843de397f3c9734451122e565e6: CDI devices from CRI Config.CDIDevices: []" Jul 9 13:02:07.089274 containerd[1624]: time="2025-07-09T13:02:07.089160952Z" level=info msg="CreateContainer within sandbox \"efb813c7e769503b455ecd48fde6e45f6d4c4a442953b29d0504bd303cb44075\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"8f34ae67e06e9690d8f72db5732c9eb224f41843de397f3c9734451122e565e6\"" Jul 9 13:02:07.090701 containerd[1624]: time="2025-07-09T13:02:07.090483803Z" level=info msg="StartContainer for \"8f34ae67e06e9690d8f72db5732c9eb224f41843de397f3c9734451122e565e6\"" Jul 9 13:02:07.091495 containerd[1624]: time="2025-07-09T13:02:07.091474771Z" level=info msg="connecting to shim 8f34ae67e06e9690d8f72db5732c9eb224f41843de397f3c9734451122e565e6" address="unix:///run/containerd/s/f65c698b0925cb97e305eb418b279ebc770717eeeee389b64c7268c7e473fead" protocol=ttrpc version=3 Jul 9 13:02:07.120810 systemd[1]: Started cri-containerd-8f34ae67e06e9690d8f72db5732c9eb224f41843de397f3c9734451122e565e6.scope - libcontainer container 8f34ae67e06e9690d8f72db5732c9eb224f41843de397f3c9734451122e565e6. 
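The pull above reports 25056543 bytes read over a wall-clock time of 2.35010825s for quay.io/tigera/operator:v1.38.3, which works out to roughly 10 MiB/s. A trivial Go check of that arithmetic, with the numbers copied from the log:

```go
package main

import "fmt"

func main() {
	const bytesRead = 25056543.0 // "bytes read" from the stop-pulling message
	const seconds = 2.35010825   // pull duration reported by containerd
	bps := bytesRead / seconds
	fmt.Printf("%.1f MiB/s\n", bps/(1024*1024)) // ~10.2 MiB/s
}
```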
Jul 9 13:02:07.148314 containerd[1624]: time="2025-07-09T13:02:07.148279880Z" level=info msg="StartContainer for \"8f34ae67e06e9690d8f72db5732c9eb224f41843de397f3c9734451122e565e6\" returns successfully" Jul 9 13:02:10.348505 kubelet[2942]: I0709 13:02:10.348435 2942 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-747864d56d-s7xkn" podStartSLOduration=3.99744289 podStartE2EDuration="6.348413295s" podCreationTimestamp="2025-07-09 13:02:04 +0000 UTC" firstStartedPulling="2025-07-09 13:02:04.521377424 +0000 UTC m=+7.785142935" lastFinishedPulling="2025-07-09 13:02:06.872347833 +0000 UTC m=+10.136113340" observedRunningTime="2025-07-09 13:02:07.968470976 +0000 UTC m=+11.232236504" watchObservedRunningTime="2025-07-09 13:02:10.348413295 +0000 UTC m=+13.612178809" Jul 9 13:02:14.118762 sudo[1965]: pam_unix(sudo:session): session closed for user root Jul 9 13:02:14.122944 sshd[1964]: Connection closed by 139.178.68.195 port 34832 Jul 9 13:02:14.128544 sshd-session[1961]: pam_unix(sshd:session): session closed for user core Jul 9 13:02:14.132995 systemd[1]: sshd@6-139.178.70.105:22-139.178.68.195:34832.service: Deactivated successfully. Jul 9 13:02:14.133023 systemd-logind[1599]: Session 9 logged out. Waiting for processes to exit. Jul 9 13:02:14.134190 systemd[1]: session-9.scope: Deactivated successfully. Jul 9 13:02:14.135699 systemd[1]: session-9.scope: Consumed 2.842s CPU time, 152.5M memory peak. Jul 9 13:02:14.138193 systemd-logind[1599]: Removed session 9. Jul 9 13:02:17.839480 kubelet[2942]: I0709 13:02:17.839386 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/876559a4-4a63-402d-8cbd-8341c34e2f5a-tigera-ca-bundle\") pod \"calico-typha-578d79bf54-zdvbv\" (UID: \"876559a4-4a63-402d-8cbd-8341c34e2f5a\") " pod="calico-system/calico-typha-578d79bf54-zdvbv" Jul 9 13:02:17.839480 kubelet[2942]: I0709 13:02:17.839430 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/876559a4-4a63-402d-8cbd-8341c34e2f5a-typha-certs\") pod \"calico-typha-578d79bf54-zdvbv\" (UID: \"876559a4-4a63-402d-8cbd-8341c34e2f5a\") " pod="calico-system/calico-typha-578d79bf54-zdvbv" Jul 9 13:02:17.839480 kubelet[2942]: I0709 13:02:17.839441 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dthl2\" (UniqueName: \"kubernetes.io/projected/876559a4-4a63-402d-8cbd-8341c34e2f5a-kube-api-access-dthl2\") pod \"calico-typha-578d79bf54-zdvbv\" (UID: \"876559a4-4a63-402d-8cbd-8341c34e2f5a\") " pod="calico-system/calico-typha-578d79bf54-zdvbv" Jul 9 13:02:17.844284 systemd[1]: Created slice kubepods-besteffort-pod876559a4_4a63_402d_8cbd_8341c34e2f5a.slice - libcontainer container kubepods-besteffort-pod876559a4_4a63_402d_8cbd_8341c34e2f5a.slice. 
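In the tigera-operator pod_startup_latency_tracker entry earlier in this block, podStartE2EDuration (6.348413295s) matches watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration (about 3.997s) is that same interval minus the image-pull window (lastFinishedPulling minus firstStartedPulling, about 2.351s). A short Go sketch that re-derives those numbers from the timestamps in the log; the layout string is Go's default time formatting, which these values happen to use:

```go
package main

import (
	"fmt"
	"time"
)

// Timestamps copied from the tigera-operator pod_startup_latency_tracker entry.
const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-07-09 13:02:04 +0000 UTC")
	firstPull := mustParse("2025-07-09 13:02:04.521377424 +0000 UTC")
	lastPull := mustParse("2025-07-09 13:02:06.872347833 +0000 UTC")
	watched := mustParse("2025-07-09 13:02:10.348413295 +0000 UTC")

	e2e := watched.Sub(created)          // 6.348413295s, the reported podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // ~3.997s, the reported podStartSLOduration
	fmt.Println(e2e, slo)
}
```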
Jul 9 13:02:18.153424 containerd[1624]: time="2025-07-09T13:02:18.153139348Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-578d79bf54-zdvbv,Uid:876559a4-4a63-402d-8cbd-8341c34e2f5a,Namespace:calico-system,Attempt:0,}" Jul 9 13:02:18.168276 containerd[1624]: time="2025-07-09T13:02:18.167949847Z" level=info msg="connecting to shim 15152d91e4140fd5db1c79d647e587e3b2d38b396ce6842cbd9d6d43325c145e" address="unix:///run/containerd/s/b0fcd0eb53968c04170f604e3886dc858505fc519a608290401deb579ea4273a" namespace=k8s.io protocol=ttrpc version=3 Jul 9 13:02:18.190876 systemd[1]: Started cri-containerd-15152d91e4140fd5db1c79d647e587e3b2d38b396ce6842cbd9d6d43325c145e.scope - libcontainer container 15152d91e4140fd5db1c79d647e587e3b2d38b396ce6842cbd9d6d43325c145e. Jul 9 13:02:18.224540 systemd[1]: Created slice kubepods-besteffort-podaac77ff3_f3ff_42e8_a22e_2984fd05cd5b.slice - libcontainer container kubepods-besteffort-podaac77ff3_f3ff_42e8_a22e_2984fd05cd5b.slice. Jul 9 13:02:18.242197 kubelet[2942]: I0709 13:02:18.241613 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/aac77ff3-f3ff-42e8-a22e-2984fd05cd5b-var-run-calico\") pod \"calico-node-8gbzx\" (UID: \"aac77ff3-f3ff-42e8-a22e-2984fd05cd5b\") " pod="calico-system/calico-node-8gbzx" Jul 9 13:02:18.242197 kubelet[2942]: I0709 13:02:18.241633 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/aac77ff3-f3ff-42e8-a22e-2984fd05cd5b-node-certs\") pod \"calico-node-8gbzx\" (UID: \"aac77ff3-f3ff-42e8-a22e-2984fd05cd5b\") " pod="calico-system/calico-node-8gbzx" Jul 9 13:02:18.242197 kubelet[2942]: I0709 13:02:18.241643 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/aac77ff3-f3ff-42e8-a22e-2984fd05cd5b-xtables-lock\") pod \"calico-node-8gbzx\" (UID: \"aac77ff3-f3ff-42e8-a22e-2984fd05cd5b\") " pod="calico-system/calico-node-8gbzx" Jul 9 13:02:18.242197 kubelet[2942]: I0709 13:02:18.241653 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sftn7\" (UniqueName: \"kubernetes.io/projected/aac77ff3-f3ff-42e8-a22e-2984fd05cd5b-kube-api-access-sftn7\") pod \"calico-node-8gbzx\" (UID: \"aac77ff3-f3ff-42e8-a22e-2984fd05cd5b\") " pod="calico-system/calico-node-8gbzx" Jul 9 13:02:18.242197 kubelet[2942]: I0709 13:02:18.241664 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/aac77ff3-f3ff-42e8-a22e-2984fd05cd5b-cni-log-dir\") pod \"calico-node-8gbzx\" (UID: \"aac77ff3-f3ff-42e8-a22e-2984fd05cd5b\") " pod="calico-system/calico-node-8gbzx" Jul 9 13:02:18.242347 kubelet[2942]: I0709 13:02:18.241688 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/aac77ff3-f3ff-42e8-a22e-2984fd05cd5b-lib-modules\") pod \"calico-node-8gbzx\" (UID: \"aac77ff3-f3ff-42e8-a22e-2984fd05cd5b\") " pod="calico-system/calico-node-8gbzx" Jul 9 13:02:18.242347 kubelet[2942]: I0709 13:02:18.241698 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: 
\"kubernetes.io/host-path/aac77ff3-f3ff-42e8-a22e-2984fd05cd5b-policysync\") pod \"calico-node-8gbzx\" (UID: \"aac77ff3-f3ff-42e8-a22e-2984fd05cd5b\") " pod="calico-system/calico-node-8gbzx" Jul 9 13:02:18.242347 kubelet[2942]: I0709 13:02:18.241706 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/aac77ff3-f3ff-42e8-a22e-2984fd05cd5b-var-lib-calico\") pod \"calico-node-8gbzx\" (UID: \"aac77ff3-f3ff-42e8-a22e-2984fd05cd5b\") " pod="calico-system/calico-node-8gbzx" Jul 9 13:02:18.242347 kubelet[2942]: I0709 13:02:18.241715 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/aac77ff3-f3ff-42e8-a22e-2984fd05cd5b-cni-bin-dir\") pod \"calico-node-8gbzx\" (UID: \"aac77ff3-f3ff-42e8-a22e-2984fd05cd5b\") " pod="calico-system/calico-node-8gbzx" Jul 9 13:02:18.242347 kubelet[2942]: I0709 13:02:18.241724 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aac77ff3-f3ff-42e8-a22e-2984fd05cd5b-tigera-ca-bundle\") pod \"calico-node-8gbzx\" (UID: \"aac77ff3-f3ff-42e8-a22e-2984fd05cd5b\") " pod="calico-system/calico-node-8gbzx" Jul 9 13:02:18.242459 kubelet[2942]: I0709 13:02:18.241736 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/aac77ff3-f3ff-42e8-a22e-2984fd05cd5b-cni-net-dir\") pod \"calico-node-8gbzx\" (UID: \"aac77ff3-f3ff-42e8-a22e-2984fd05cd5b\") " pod="calico-system/calico-node-8gbzx" Jul 9 13:02:18.242459 kubelet[2942]: I0709 13:02:18.241746 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/aac77ff3-f3ff-42e8-a22e-2984fd05cd5b-flexvol-driver-host\") pod \"calico-node-8gbzx\" (UID: \"aac77ff3-f3ff-42e8-a22e-2984fd05cd5b\") " pod="calico-system/calico-node-8gbzx" Jul 9 13:02:18.251719 containerd[1624]: time="2025-07-09T13:02:18.251659760Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-578d79bf54-zdvbv,Uid:876559a4-4a63-402d-8cbd-8341c34e2f5a,Namespace:calico-system,Attempt:0,} returns sandbox id \"15152d91e4140fd5db1c79d647e587e3b2d38b396ce6842cbd9d6d43325c145e\"" Jul 9 13:02:18.252654 containerd[1624]: time="2025-07-09T13:02:18.252635563Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\"" Jul 9 13:02:18.353536 kubelet[2942]: E0709 13:02:18.353514 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.353536 kubelet[2942]: W0709 13:02:18.353527 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.354780 kubelet[2942]: E0709 13:02:18.354652 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 13:02:18.516403 kubelet[2942]: E0709 13:02:18.516298 2942 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-nhvjr" podUID="4941ee09-024b-44fc-8ad7-71d776271987" Jul 9 13:02:18.532931 containerd[1624]: time="2025-07-09T13:02:18.532817919Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-8gbzx,Uid:aac77ff3-f3ff-42e8-a22e-2984fd05cd5b,Namespace:calico-system,Attempt:0,}" Jul 9 13:02:18.544872 kubelet[2942]: E0709 13:02:18.544842 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.544872 kubelet[2942]: W0709 13:02:18.544857 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.544872 kubelet[2942]: E0709 13:02:18.544869 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:18.545013 kubelet[2942]: E0709 13:02:18.544967 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.545013 kubelet[2942]: W0709 13:02:18.544972 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.545013 kubelet[2942]: E0709 13:02:18.544977 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:18.545193 kubelet[2942]: E0709 13:02:18.545183 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.545193 kubelet[2942]: W0709 13:02:18.545192 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.545237 kubelet[2942]: E0709 13:02:18.545198 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:18.545315 kubelet[2942]: E0709 13:02:18.545304 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.545315 kubelet[2942]: W0709 13:02:18.545313 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.545383 kubelet[2942]: E0709 13:02:18.545318 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 13:02:18.545403 kubelet[2942]: E0709 13:02:18.545396 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.545403 kubelet[2942]: W0709 13:02:18.545400 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.545715 kubelet[2942]: E0709 13:02:18.545406 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:18.545715 kubelet[2942]: E0709 13:02:18.545501 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.545715 kubelet[2942]: W0709 13:02:18.545507 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.545715 kubelet[2942]: E0709 13:02:18.545511 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:18.545872 kubelet[2942]: E0709 13:02:18.545742 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.545872 kubelet[2942]: W0709 13:02:18.545747 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.545872 kubelet[2942]: E0709 13:02:18.545753 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:18.545872 kubelet[2942]: E0709 13:02:18.545832 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.545872 kubelet[2942]: W0709 13:02:18.545836 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.545872 kubelet[2942]: E0709 13:02:18.545841 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:18.546992 kubelet[2942]: E0709 13:02:18.545936 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.546992 kubelet[2942]: W0709 13:02:18.545940 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.546992 kubelet[2942]: E0709 13:02:18.545946 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 13:02:18.546992 kubelet[2942]: E0709 13:02:18.546035 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.546992 kubelet[2942]: W0709 13:02:18.546040 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.546992 kubelet[2942]: E0709 13:02:18.546045 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:18.546992 kubelet[2942]: E0709 13:02:18.546223 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.546992 kubelet[2942]: W0709 13:02:18.546228 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.546992 kubelet[2942]: E0709 13:02:18.546233 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:18.546992 kubelet[2942]: E0709 13:02:18.546328 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.547179 kubelet[2942]: W0709 13:02:18.546333 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.547179 kubelet[2942]: E0709 13:02:18.546337 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:18.547179 kubelet[2942]: E0709 13:02:18.546434 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.547179 kubelet[2942]: W0709 13:02:18.546455 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.547179 kubelet[2942]: E0709 13:02:18.546462 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:18.547179 kubelet[2942]: E0709 13:02:18.546548 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.547179 kubelet[2942]: W0709 13:02:18.546552 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.547179 kubelet[2942]: E0709 13:02:18.546557 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 13:02:18.547179 kubelet[2942]: E0709 13:02:18.546627 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.547179 kubelet[2942]: W0709 13:02:18.546631 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.547338 kubelet[2942]: E0709 13:02:18.546635 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:18.547338 kubelet[2942]: E0709 13:02:18.546722 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.547338 kubelet[2942]: W0709 13:02:18.546727 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.547338 kubelet[2942]: E0709 13:02:18.546732 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:18.547338 kubelet[2942]: E0709 13:02:18.546820 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.547338 kubelet[2942]: W0709 13:02:18.546824 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.547338 kubelet[2942]: E0709 13:02:18.546828 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:18.547338 kubelet[2942]: E0709 13:02:18.546908 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.547338 kubelet[2942]: W0709 13:02:18.546912 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.547338 kubelet[2942]: E0709 13:02:18.546933 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:18.547505 kubelet[2942]: E0709 13:02:18.547007 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.547505 kubelet[2942]: W0709 13:02:18.547012 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.547505 kubelet[2942]: E0709 13:02:18.547017 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 13:02:18.547505 kubelet[2942]: E0709 13:02:18.547090 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.547505 kubelet[2942]: W0709 13:02:18.547096 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.547505 kubelet[2942]: E0709 13:02:18.547189 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:18.547505 kubelet[2942]: E0709 13:02:18.547319 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.547505 kubelet[2942]: W0709 13:02:18.547323 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.547505 kubelet[2942]: E0709 13:02:18.547328 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:18.547865 kubelet[2942]: I0709 13:02:18.547352 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4941ee09-024b-44fc-8ad7-71d776271987-kubelet-dir\") pod \"csi-node-driver-nhvjr\" (UID: \"4941ee09-024b-44fc-8ad7-71d776271987\") " pod="calico-system/csi-node-driver-nhvjr" Jul 9 13:02:18.547865 kubelet[2942]: E0709 13:02:18.547527 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.547865 kubelet[2942]: W0709 13:02:18.547533 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.547865 kubelet[2942]: E0709 13:02:18.547546 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:18.547865 kubelet[2942]: I0709 13:02:18.547556 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4941ee09-024b-44fc-8ad7-71d776271987-socket-dir\") pod \"csi-node-driver-nhvjr\" (UID: \"4941ee09-024b-44fc-8ad7-71d776271987\") " pod="calico-system/csi-node-driver-nhvjr" Jul 9 13:02:18.547865 kubelet[2942]: E0709 13:02:18.547634 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.547865 kubelet[2942]: W0709 13:02:18.547641 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.547865 kubelet[2942]: E0709 13:02:18.547647 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 13:02:18.551342 kubelet[2942]: I0709 13:02:18.547655 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/4941ee09-024b-44fc-8ad7-71d776271987-varrun\") pod \"csi-node-driver-nhvjr\" (UID: \"4941ee09-024b-44fc-8ad7-71d776271987\") " pod="calico-system/csi-node-driver-nhvjr" Jul 9 13:02:18.551342 kubelet[2942]: E0709 13:02:18.547827 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.551342 kubelet[2942]: W0709 13:02:18.547833 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.551342 kubelet[2942]: E0709 13:02:18.547846 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:18.551342 kubelet[2942]: I0709 13:02:18.547856 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsd2w\" (UniqueName: \"kubernetes.io/projected/4941ee09-024b-44fc-8ad7-71d776271987-kube-api-access-fsd2w\") pod \"csi-node-driver-nhvjr\" (UID: \"4941ee09-024b-44fc-8ad7-71d776271987\") " pod="calico-system/csi-node-driver-nhvjr" Jul 9 13:02:18.551342 kubelet[2942]: E0709 13:02:18.548005 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.551342 kubelet[2942]: W0709 13:02:18.548011 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.551342 kubelet[2942]: E0709 13:02:18.548017 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:18.551471 kubelet[2942]: I0709 13:02:18.548028 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4941ee09-024b-44fc-8ad7-71d776271987-registration-dir\") pod \"csi-node-driver-nhvjr\" (UID: \"4941ee09-024b-44fc-8ad7-71d776271987\") " pod="calico-system/csi-node-driver-nhvjr" Jul 9 13:02:18.551471 kubelet[2942]: E0709 13:02:18.548150 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.551471 kubelet[2942]: W0709 13:02:18.548155 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.551471 kubelet[2942]: E0709 13:02:18.548160 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 13:02:18.551471 kubelet[2942]: E0709 13:02:18.548314 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.551471 kubelet[2942]: W0709 13:02:18.548319 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.551471 kubelet[2942]: E0709 13:02:18.548327 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:18.551471 kubelet[2942]: E0709 13:02:18.548738 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.551471 kubelet[2942]: W0709 13:02:18.548743 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.551621 kubelet[2942]: E0709 13:02:18.548758 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:18.551621 kubelet[2942]: E0709 13:02:18.548840 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.551621 kubelet[2942]: W0709 13:02:18.548844 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.551621 kubelet[2942]: E0709 13:02:18.548862 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:18.551621 kubelet[2942]: E0709 13:02:18.548945 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.551621 kubelet[2942]: W0709 13:02:18.548950 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.551621 kubelet[2942]: E0709 13:02:18.549023 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:18.551621 kubelet[2942]: E0709 13:02:18.549045 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.551621 kubelet[2942]: W0709 13:02:18.549049 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.551621 kubelet[2942]: E0709 13:02:18.549076 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 13:02:18.551896 kubelet[2942]: E0709 13:02:18.549296 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.551896 kubelet[2942]: W0709 13:02:18.549302 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.551896 kubelet[2942]: E0709 13:02:18.549316 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:18.551896 kubelet[2942]: E0709 13:02:18.549401 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.551896 kubelet[2942]: W0709 13:02:18.549406 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.551896 kubelet[2942]: E0709 13:02:18.549413 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:18.551896 kubelet[2942]: E0709 13:02:18.549513 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.551896 kubelet[2942]: W0709 13:02:18.549517 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.551896 kubelet[2942]: E0709 13:02:18.549522 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:18.551896 kubelet[2942]: E0709 13:02:18.549624 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.552108 kubelet[2942]: W0709 13:02:18.549629 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.552108 kubelet[2942]: E0709 13:02:18.549634 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:18.582348 containerd[1624]: time="2025-07-09T13:02:18.582315072Z" level=info msg="connecting to shim b87d8a326b1791202cdf041daca2fb66884f6815a4232899099c72d00e15040d" address="unix:///run/containerd/s/b088aaf54cba39f0c3369a7d1cb6ff43992596cccd0b181b5208be8be740f28c" namespace=k8s.io protocol=ttrpc version=3 Jul 9 13:02:18.609837 systemd[1]: Started cri-containerd-b87d8a326b1791202cdf041daca2fb66884f6815a4232899099c72d00e15040d.scope - libcontainer container b87d8a326b1791202cdf041daca2fb66884f6815a4232899099c72d00e15040d. 
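The pod_workers message above for csi-node-driver-nhvjr ("cni plugin not initialized") and containerd's earlier "No cni config template is specified" both reflect an empty CNI configuration directory at this point; calico-node typically writes a conflist there once it is running. A hypothetical stand-alone check in Go, with the directory path assumed to be the conventional /etc/cni/net.d (the path is not stated in the log):

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Conventional CNI config directory; an assumption, not taken from the log.
	const cniDir = "/etc/cni/net.d"

	entries, err := os.ReadDir(cniDir)
	if err != nil {
		fmt.Println("cannot read CNI config dir:", err)
		return
	}
	var confs []string
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			confs = append(confs, e.Name())
		}
	}
	if len(confs) == 0 {
		// Matches the state implied by "cni plugin not initialized".
		fmt.Println("no CNI network configs found")
		return
	}
	fmt.Println("CNI configs:", confs)
}
```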
Jul 9 13:02:18.632216 containerd[1624]: time="2025-07-09T13:02:18.632138864Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-8gbzx,Uid:aac77ff3-f3ff-42e8-a22e-2984fd05cd5b,Namespace:calico-system,Attempt:0,} returns sandbox id \"b87d8a326b1791202cdf041daca2fb66884f6815a4232899099c72d00e15040d\""
[The same FlexVolume driver-call failure triple shown at 13:02:18.353 above repeats continuously from 13:02:18.649 through 13:02:18.652.]
Error: unexpected end of JSON input" Jul 9 13:02:18.658796 kubelet[2942]: E0709 13:02:18.653037 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.658796 kubelet[2942]: W0709 13:02:18.653042 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.658796 kubelet[2942]: E0709 13:02:18.653046 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:18.658796 kubelet[2942]: E0709 13:02:18.653149 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.658796 kubelet[2942]: W0709 13:02:18.653153 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.658796 kubelet[2942]: E0709 13:02:18.653158 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:18.658796 kubelet[2942]: E0709 13:02:18.653240 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.658796 kubelet[2942]: W0709 13:02:18.653245 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.658954 kubelet[2942]: E0709 13:02:18.653261 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:18.658954 kubelet[2942]: E0709 13:02:18.653336 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.658954 kubelet[2942]: W0709 13:02:18.653340 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.658954 kubelet[2942]: E0709 13:02:18.653345 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:18.658954 kubelet[2942]: E0709 13:02:18.653435 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.658954 kubelet[2942]: W0709 13:02:18.653440 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.658954 kubelet[2942]: E0709 13:02:18.653444 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 13:02:18.658954 kubelet[2942]: E0709 13:02:18.658225 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.658954 kubelet[2942]: W0709 13:02:18.658236 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.658954 kubelet[2942]: E0709 13:02:18.658244 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:18.674545 kubelet[2942]: E0709 13:02:18.674520 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:18.674631 kubelet[2942]: W0709 13:02:18.674540 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:18.674631 kubelet[2942]: E0709 13:02:18.674567 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:19.901958 kubelet[2942]: E0709 13:02:19.901919 2942 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-nhvjr" podUID="4941ee09-024b-44fc-8ad7-71d776271987" Jul 9 13:02:21.901997 kubelet[2942]: E0709 13:02:21.901961 2942 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-nhvjr" podUID="4941ee09-024b-44fc-8ad7-71d776271987" Jul 9 13:02:22.042472 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount967432589.mount: Deactivated successfully. 
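The repeated kubelet entries above all come from the same FlexVolume dynamic-plugin probe: the kubelet executes /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the argument init, the binary is not present, the captured output is empty, and encoding/json then reports exactly the error shown ("unexpected end of JSON input" is what json.Unmarshal returns for empty input). Purely as a rough illustration and not part of the system logged here, a minimal Go FlexVolume driver stub that would satisfy the init probe could look like the following; the attach:false capability is an assumption for a driver that does not implement attach/detach.

// flexvol_stub.go - hypothetical minimal FlexVolume driver stub.
// Installed as .../volume/exec/nodeagent~uds/uds it would answer the
// kubelet's "init" probe with valid JSON instead of the empty output
// that currently triggers "unexpected end of JSON input".
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// driverStatus mirrors the JSON shape FlexVolume drivers print:
// a status string plus optional message and capabilities.
type driverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	if len(os.Args) < 2 {
		// No command given: report failure in the expected JSON form.
		emit(driverStatus{Status: "Failure", Message: "no command"})
		os.Exit(1)
	}
	switch os.Args[1] {
	case "init":
		// This is the call the log shows failing because nothing is printed.
		emit(driverStatus{
			Status:       "Success",
			Capabilities: map[string]bool{"attach": false}, // assumption: no attach/detach support
		})
	default:
		// Commands the stub does not implement are reported as "Not supported".
		emit(driverStatus{Status: "Not supported", Message: os.Args[1]})
	}
}

func emit(s driverStatus) {
	out, _ := json.Marshal(s)
	fmt.Println(string(out))
}

With a binary like this in place, the init probe should return {"status":"Success","capabilities":{"attach":false}} and the driver-call errors above should stop; the csi-node-driver-nhvjr entries that follow are a separate condition reported by the kubelet itself (the Calico CNI plugin has not finished initializing yet).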
Jul 9 13:02:22.463087 containerd[1624]: time="2025-07-09T13:02:22.462633741Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:02:22.463478 containerd[1624]: time="2025-07-09T13:02:22.463238454Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.2: active requests=0, bytes read=35233364" Jul 9 13:02:22.469436 containerd[1624]: time="2025-07-09T13:02:22.469381142Z" level=info msg="ImageCreate event name:\"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:02:22.470573 containerd[1624]: time="2025-07-09T13:02:22.470548017Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:02:22.471696 containerd[1624]: time="2025-07-09T13:02:22.471036958Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.2\" with image id \"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\", size \"35233218\" in 4.218383032s" Jul 9 13:02:22.471696 containerd[1624]: time="2025-07-09T13:02:22.471062815Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\" returns image reference \"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\"" Jul 9 13:02:22.475152 containerd[1624]: time="2025-07-09T13:02:22.475065926Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\"" Jul 9 13:02:22.487215 containerd[1624]: time="2025-07-09T13:02:22.487189127Z" level=info msg="CreateContainer within sandbox \"15152d91e4140fd5db1c79d647e587e3b2d38b396ce6842cbd9d6d43325c145e\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jul 9 13:02:22.511415 containerd[1624]: time="2025-07-09T13:02:22.510786681Z" level=info msg="Container d596943f4bb2ddc1ab9989ffc582c5521eea3692c28d3c3a2530b1c2cc9aa937: CDI devices from CRI Config.CDIDevices: []" Jul 9 13:02:22.526916 containerd[1624]: time="2025-07-09T13:02:22.526894128Z" level=info msg="CreateContainer within sandbox \"15152d91e4140fd5db1c79d647e587e3b2d38b396ce6842cbd9d6d43325c145e\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"d596943f4bb2ddc1ab9989ffc582c5521eea3692c28d3c3a2530b1c2cc9aa937\"" Jul 9 13:02:22.527614 containerd[1624]: time="2025-07-09T13:02:22.527602868Z" level=info msg="StartContainer for \"d596943f4bb2ddc1ab9989ffc582c5521eea3692c28d3c3a2530b1c2cc9aa937\"" Jul 9 13:02:22.528441 containerd[1624]: time="2025-07-09T13:02:22.528415963Z" level=info msg="connecting to shim d596943f4bb2ddc1ab9989ffc582c5521eea3692c28d3c3a2530b1c2cc9aa937" address="unix:///run/containerd/s/b0fcd0eb53968c04170f604e3886dc858505fc519a608290401deb579ea4273a" protocol=ttrpc version=3 Jul 9 13:02:22.552803 systemd[1]: Started cri-containerd-d596943f4bb2ddc1ab9989ffc582c5521eea3692c28d3c3a2530b1c2cc9aa937.scope - libcontainer container d596943f4bb2ddc1ab9989ffc582c5521eea3692c28d3c3a2530b1c2cc9aa937. 
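The containerd entries above show the CRI flow for the calico-typha container: the pull of ghcr.io/flatcar/calico/typha:v3.30.2 completes, CreateContainer is issued against the existing sandbox, and a ttrpc shim connection is opened before systemd starts the cri-containerd-… scope. Purely as a hypothetical sketch, assuming the default containerd CRI endpoint at unix:///run/containerd/containerd.sock and the k8s.io/cri-api Go client (neither of which appears in this log), the runtime could be queried for that container's state like so:

// cri_list.go - hypothetical helper that lists the "calico-typha"
// container over the containerd CRI socket (socket path is an
// assumption, not taken from this log).
package main

import (
	"context"
	"fmt"
	"log"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	// The kubelet talks CRI over this socket by default; adjust if the
	// runtime endpoint is configured differently.
	conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatalf("dial CRI socket: %v", err)
	}
	defer conn.Close()

	client := runtimeapi.NewRuntimeServiceClient(conn)
	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
	defer cancel()

	// io.kubernetes.container.name is the standard label the kubelet sets
	// on CRI containers, so it can be used to locate calico-typha.
	resp, err := client.ListContainers(ctx, &runtimeapi.ListContainersRequest{
		Filter: &runtimeapi.ContainerFilter{
			LabelSelector: map[string]string{"io.kubernetes.container.name": "calico-typha"},
		},
	})
	if err != nil {
		log.Fatalf("ListContainers: %v", err)
	}
	for _, c := range resp.Containers {
		fmt.Printf("%s  %s  %s\n", c.Id, c.Metadata.Name, c.State.String())
	}
}

Once the StartContainer call logged just below returns successfully, such a query would be expected to show container id d596943f4bb2… in CONTAINER_RUNNING state.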
Jul 9 13:02:22.597906 containerd[1624]: time="2025-07-09T13:02:22.597872377Z" level=info msg="StartContainer for \"d596943f4bb2ddc1ab9989ffc582c5521eea3692c28d3c3a2530b1c2cc9aa937\" returns successfully" Jul 9 13:02:22.990935 kubelet[2942]: I0709 13:02:22.990900 2942 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-578d79bf54-zdvbv" podStartSLOduration=1.768767423 podStartE2EDuration="5.990884391s" podCreationTimestamp="2025-07-09 13:02:17 +0000 UTC" firstStartedPulling="2025-07-09 13:02:18.252407096 +0000 UTC m=+21.516172606" lastFinishedPulling="2025-07-09 13:02:22.47452406 +0000 UTC m=+25.738289574" observedRunningTime="2025-07-09 13:02:22.990078932 +0000 UTC m=+26.253844444" watchObservedRunningTime="2025-07-09 13:02:22.990884391 +0000 UTC m=+26.254649910" Jul 9 13:02:23.009319 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1225260182.mount: Deactivated successfully. Jul 9 13:02:23.078898 kubelet[2942]: E0709 13:02:23.078865 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.078898 kubelet[2942]: W0709 13:02:23.078895 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.079028 kubelet[2942]: E0709 13:02:23.078918 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.079095 kubelet[2942]: E0709 13:02:23.079084 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.079095 kubelet[2942]: W0709 13:02:23.079094 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.079140 kubelet[2942]: E0709 13:02:23.079103 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.079206 kubelet[2942]: E0709 13:02:23.079195 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.079231 kubelet[2942]: W0709 13:02:23.079205 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.079231 kubelet[2942]: E0709 13:02:23.079214 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 13:02:23.079369 kubelet[2942]: E0709 13:02:23.079359 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.079369 kubelet[2942]: W0709 13:02:23.079368 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.079409 kubelet[2942]: E0709 13:02:23.079376 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.079536 kubelet[2942]: E0709 13:02:23.079523 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.079536 kubelet[2942]: W0709 13:02:23.079534 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.079585 kubelet[2942]: E0709 13:02:23.079543 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.079666 kubelet[2942]: E0709 13:02:23.079654 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.079666 kubelet[2942]: W0709 13:02:23.079664 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.079737 kubelet[2942]: E0709 13:02:23.079723 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.079851 kubelet[2942]: E0709 13:02:23.079840 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.079851 kubelet[2942]: W0709 13:02:23.079850 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.079900 kubelet[2942]: E0709 13:02:23.079859 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.079970 kubelet[2942]: E0709 13:02:23.079957 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.079970 kubelet[2942]: W0709 13:02:23.079967 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.080011 kubelet[2942]: E0709 13:02:23.079974 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 13:02:23.080084 kubelet[2942]: E0709 13:02:23.080076 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.080104 kubelet[2942]: W0709 13:02:23.080085 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.080104 kubelet[2942]: E0709 13:02:23.080092 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.080189 kubelet[2942]: E0709 13:02:23.080179 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.080214 kubelet[2942]: W0709 13:02:23.080189 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.080214 kubelet[2942]: E0709 13:02:23.080197 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.080300 kubelet[2942]: E0709 13:02:23.080291 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.080300 kubelet[2942]: W0709 13:02:23.080299 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.080348 kubelet[2942]: E0709 13:02:23.080306 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.080403 kubelet[2942]: E0709 13:02:23.080392 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.080431 kubelet[2942]: W0709 13:02:23.080403 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.080431 kubelet[2942]: E0709 13:02:23.080410 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.080566 kubelet[2942]: E0709 13:02:23.080555 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.080587 kubelet[2942]: W0709 13:02:23.080565 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.080587 kubelet[2942]: E0709 13:02:23.080572 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 13:02:23.080687 kubelet[2942]: E0709 13:02:23.080664 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.080713 kubelet[2942]: W0709 13:02:23.080694 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.080713 kubelet[2942]: E0709 13:02:23.080703 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.081516 kubelet[2942]: E0709 13:02:23.080807 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.081516 kubelet[2942]: W0709 13:02:23.080813 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.081516 kubelet[2942]: E0709 13:02:23.080819 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.081516 kubelet[2942]: E0709 13:02:23.080974 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.081516 kubelet[2942]: W0709 13:02:23.080981 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.081516 kubelet[2942]: E0709 13:02:23.080988 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.081516 kubelet[2942]: E0709 13:02:23.081101 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.081516 kubelet[2942]: W0709 13:02:23.081109 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.081516 kubelet[2942]: E0709 13:02:23.081116 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.081516 kubelet[2942]: E0709 13:02:23.081251 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.082219 kubelet[2942]: W0709 13:02:23.081257 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.082219 kubelet[2942]: E0709 13:02:23.081267 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 13:02:23.082219 kubelet[2942]: E0709 13:02:23.081395 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.082219 kubelet[2942]: W0709 13:02:23.081403 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.082219 kubelet[2942]: E0709 13:02:23.081419 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.082219 kubelet[2942]: E0709 13:02:23.082019 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.082219 kubelet[2942]: W0709 13:02:23.082025 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.082219 kubelet[2942]: E0709 13:02:23.082034 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.082219 kubelet[2942]: E0709 13:02:23.082128 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.082219 kubelet[2942]: W0709 13:02:23.082132 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.082407 kubelet[2942]: E0709 13:02:23.082141 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.082568 kubelet[2942]: E0709 13:02:23.082449 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.082568 kubelet[2942]: W0709 13:02:23.082455 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.082568 kubelet[2942]: E0709 13:02:23.082463 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.082656 kubelet[2942]: E0709 13:02:23.082642 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.082656 kubelet[2942]: W0709 13:02:23.082653 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.082745 kubelet[2942]: E0709 13:02:23.082664 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 13:02:23.082799 kubelet[2942]: E0709 13:02:23.082788 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.082820 kubelet[2942]: W0709 13:02:23.082798 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.082896 kubelet[2942]: E0709 13:02:23.082883 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.082928 kubelet[2942]: E0709 13:02:23.082917 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.082928 kubelet[2942]: W0709 13:02:23.082926 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.083003 kubelet[2942]: E0709 13:02:23.082981 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.083030 kubelet[2942]: E0709 13:02:23.083024 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.083056 kubelet[2942]: W0709 13:02:23.083031 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.083056 kubelet[2942]: E0709 13:02:23.083041 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.083163 kubelet[2942]: E0709 13:02:23.083152 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.083163 kubelet[2942]: W0709 13:02:23.083162 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.083210 kubelet[2942]: E0709 13:02:23.083178 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.083325 kubelet[2942]: E0709 13:02:23.083314 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.083325 kubelet[2942]: W0709 13:02:23.083323 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.083371 kubelet[2942]: E0709 13:02:23.083333 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 13:02:23.084169 kubelet[2942]: E0709 13:02:23.083915 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.084169 kubelet[2942]: W0709 13:02:23.083930 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.084169 kubelet[2942]: E0709 13:02:23.083938 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.084287 kubelet[2942]: E0709 13:02:23.084281 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.084391 kubelet[2942]: W0709 13:02:23.084319 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.084391 kubelet[2942]: E0709 13:02:23.084329 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.084474 kubelet[2942]: E0709 13:02:23.084469 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.084657 kubelet[2942]: W0709 13:02:23.084506 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.084657 kubelet[2942]: E0709 13:02:23.084514 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.084747 kubelet[2942]: E0709 13:02:23.084741 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.084783 kubelet[2942]: W0709 13:02:23.084778 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.084818 kubelet[2942]: E0709 13:02:23.084813 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.089719 kubelet[2942]: E0709 13:02:23.089707 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.090720 kubelet[2942]: W0709 13:02:23.089882 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.090720 kubelet[2942]: E0709 13:02:23.089896 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 13:02:23.901508 kubelet[2942]: E0709 13:02:23.901474 2942 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-nhvjr" podUID="4941ee09-024b-44fc-8ad7-71d776271987" Jul 9 13:02:23.979410 kubelet[2942]: I0709 13:02:23.979383 2942 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 9 13:02:23.987062 kubelet[2942]: E0709 13:02:23.986993 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.987062 kubelet[2942]: W0709 13:02:23.987007 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.987062 kubelet[2942]: E0709 13:02:23.987019 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.987255 kubelet[2942]: E0709 13:02:23.987201 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.987255 kubelet[2942]: W0709 13:02:23.987208 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.987255 kubelet[2942]: E0709 13:02:23.987213 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.987560 kubelet[2942]: E0709 13:02:23.987388 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.987560 kubelet[2942]: W0709 13:02:23.987394 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.987560 kubelet[2942]: E0709 13:02:23.987401 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.988413 kubelet[2942]: E0709 13:02:23.988402 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.988488 kubelet[2942]: W0709 13:02:23.988463 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.988488 kubelet[2942]: E0709 13:02:23.988475 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 13:02:23.988746 kubelet[2942]: E0709 13:02:23.988713 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.988746 kubelet[2942]: W0709 13:02:23.988720 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.988746 kubelet[2942]: E0709 13:02:23.988727 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.988932 kubelet[2942]: E0709 13:02:23.988896 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.988932 kubelet[2942]: W0709 13:02:23.988905 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.988932 kubelet[2942]: E0709 13:02:23.988913 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.989113 kubelet[2942]: E0709 13:02:23.989077 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.989113 kubelet[2942]: W0709 13:02:23.989083 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.989113 kubelet[2942]: E0709 13:02:23.989089 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.989341 kubelet[2942]: E0709 13:02:23.989273 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.989341 kubelet[2942]: W0709 13:02:23.989279 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.989341 kubelet[2942]: E0709 13:02:23.989284 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.989902 kubelet[2942]: E0709 13:02:23.989865 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.989902 kubelet[2942]: W0709 13:02:23.989873 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.989902 kubelet[2942]: E0709 13:02:23.989880 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 13:02:23.990113 kubelet[2942]: E0709 13:02:23.990081 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.990113 kubelet[2942]: W0709 13:02:23.990089 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.990113 kubelet[2942]: E0709 13:02:23.990095 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.990361 kubelet[2942]: E0709 13:02:23.990355 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.990434 kubelet[2942]: W0709 13:02:23.990400 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.990434 kubelet[2942]: E0709 13:02:23.990410 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.990692 kubelet[2942]: E0709 13:02:23.990599 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.990692 kubelet[2942]: W0709 13:02:23.990608 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.990692 kubelet[2942]: E0709 13:02:23.990617 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.990828 kubelet[2942]: E0709 13:02:23.990822 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.990871 kubelet[2942]: W0709 13:02:23.990863 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.991002 kubelet[2942]: E0709 13:02:23.990943 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.991320 kubelet[2942]: E0709 13:02:23.991134 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.991320 kubelet[2942]: W0709 13:02:23.991139 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.991320 kubelet[2942]: E0709 13:02:23.991144 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 13:02:23.991925 kubelet[2942]: E0709 13:02:23.991784 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.991925 kubelet[2942]: W0709 13:02:23.991793 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.991925 kubelet[2942]: E0709 13:02:23.991800 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.992088 kubelet[2942]: E0709 13:02:23.992064 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.992088 kubelet[2942]: W0709 13:02:23.992070 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.992088 kubelet[2942]: E0709 13:02:23.992078 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.992381 kubelet[2942]: E0709 13:02:23.992313 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.992381 kubelet[2942]: W0709 13:02:23.992319 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.992381 kubelet[2942]: E0709 13:02:23.992330 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.992502 kubelet[2942]: E0709 13:02:23.992488 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.992502 kubelet[2942]: W0709 13:02:23.992495 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.992609 kubelet[2942]: E0709 13:02:23.992556 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:23.992740 kubelet[2942]: E0709 13:02:23.992726 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.992740 kubelet[2942]: W0709 13:02:23.992733 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.992852 kubelet[2942]: E0709 13:02:23.992796 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 13:02:23.992911 kubelet[2942]: E0709 13:02:23.992906 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:23.992944 kubelet[2942]: W0709 13:02:23.992939 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:23.992983 kubelet[2942]: E0709 13:02:23.992977 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:25.901978 kubelet[2942]: E0709 13:02:25.901942 2942 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-nhvjr" podUID="4941ee09-024b-44fc-8ad7-71d776271987" Jul 9 13:02:26.010702 kubelet[2942]: E0709 13:02:26.010691 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:26.010734 kubelet[2942]: W0709 13:02:26.010702 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:26.010734 kubelet[2942]: E0709 13:02:26.010713 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Jul 9 13:02:26.010850 kubelet[2942]: E0709 13:02:26.010840 2942 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 13:02:26.010850 kubelet[2942]: W0709 13:02:26.010848 2942 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 13:02:26.010893 kubelet[2942]: E0709 13:02:26.010856 2942 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 13:02:27.901933 kubelet[2942]: E0709 13:02:27.901900 2942 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-nhvjr" podUID="4941ee09-024b-44fc-8ad7-71d776271987" Jul 9 13:02:29.277575 containerd[1624]: time="2025-07-09T13:02:29.277227739Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:02:29.278055 containerd[1624]: time="2025-07-09T13:02:29.278042970Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2: active requests=0, bytes read=4446956" Jul 9 13:02:29.278580 containerd[1624]: time="2025-07-09T13:02:29.278568095Z" level=info msg="ImageCreate event name:\"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:02:29.279724 containerd[1624]: time="2025-07-09T13:02:29.279711307Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:02:29.280236 containerd[1624]: time="2025-07-09T13:02:29.280189832Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" with image id \"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\", size \"5939619\" in 6.805086834s" Jul 9 13:02:29.280236 containerd[1624]: time="2025-07-09T13:02:29.280207187Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" returns image reference \"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\"" Jul 9 13:02:29.282714 containerd[1624]: time="2025-07-09T13:02:29.282638945Z" level=info msg="CreateContainer within sandbox \"b87d8a326b1791202cdf041daca2fb66884f6815a4232899099c72d00e15040d\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jul 9 13:02:29.286941 containerd[1624]: time="2025-07-09T13:02:29.286854311Z" level=info msg="Container a304c41e1a17bd31f910cc6d6ece1b57c408cbba5a3c72a710908310f5341e83: CDI devices from CRI Config.CDIDevices: []" Jul 9 13:02:29.291371 containerd[1624]: time="2025-07-09T13:02:29.291263158Z" level=info msg="CreateContainer within sandbox \"b87d8a326b1791202cdf041daca2fb66884f6815a4232899099c72d00e15040d\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns 
container id \"a304c41e1a17bd31f910cc6d6ece1b57c408cbba5a3c72a710908310f5341e83\"" Jul 9 13:02:29.291742 containerd[1624]: time="2025-07-09T13:02:29.291724921Z" level=info msg="StartContainer for \"a304c41e1a17bd31f910cc6d6ece1b57c408cbba5a3c72a710908310f5341e83\"" Jul 9 13:02:29.292578 containerd[1624]: time="2025-07-09T13:02:29.292550998Z" level=info msg="connecting to shim a304c41e1a17bd31f910cc6d6ece1b57c408cbba5a3c72a710908310f5341e83" address="unix:///run/containerd/s/b088aaf54cba39f0c3369a7d1cb6ff43992596cccd0b181b5208be8be740f28c" protocol=ttrpc version=3 Jul 9 13:02:29.311812 systemd[1]: Started cri-containerd-a304c41e1a17bd31f910cc6d6ece1b57c408cbba5a3c72a710908310f5341e83.scope - libcontainer container a304c41e1a17bd31f910cc6d6ece1b57c408cbba5a3c72a710908310f5341e83. Jul 9 13:02:29.351585 systemd[1]: cri-containerd-a304c41e1a17bd31f910cc6d6ece1b57c408cbba5a3c72a710908310f5341e83.scope: Deactivated successfully. Jul 9 13:02:29.358142 containerd[1624]: time="2025-07-09T13:02:29.358111505Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a304c41e1a17bd31f910cc6d6ece1b57c408cbba5a3c72a710908310f5341e83\" id:\"a304c41e1a17bd31f910cc6d6ece1b57c408cbba5a3c72a710908310f5341e83\" pid:3711 exited_at:{seconds:1752066149 nanos:353551997}" Jul 9 13:02:29.365124 containerd[1624]: time="2025-07-09T13:02:29.364881139Z" level=info msg="StartContainer for \"a304c41e1a17bd31f910cc6d6ece1b57c408cbba5a3c72a710908310f5341e83\" returns successfully" Jul 9 13:02:29.370482 containerd[1624]: time="2025-07-09T13:02:29.370362756Z" level=info msg="received exit event container_id:\"a304c41e1a17bd31f910cc6d6ece1b57c408cbba5a3c72a710908310f5341e83\" id:\"a304c41e1a17bd31f910cc6d6ece1b57c408cbba5a3c72a710908310f5341e83\" pid:3711 exited_at:{seconds:1752066149 nanos:353551997}" Jul 9 13:02:29.387723 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a304c41e1a17bd31f910cc6d6ece1b57c408cbba5a3c72a710908310f5341e83-rootfs.mount: Deactivated successfully. 
Jul 9 13:02:29.902056 kubelet[2942]: E0709 13:02:29.901815 2942 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-nhvjr" podUID="4941ee09-024b-44fc-8ad7-71d776271987" Jul 9 13:02:29.989460 containerd[1624]: time="2025-07-09T13:02:29.989414149Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\"" Jul 9 13:02:31.901911 kubelet[2942]: E0709 13:02:31.901715 2942 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-nhvjr" podUID="4941ee09-024b-44fc-8ad7-71d776271987" Jul 9 13:02:33.901066 kubelet[2942]: E0709 13:02:33.901027 2942 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-nhvjr" podUID="4941ee09-024b-44fc-8ad7-71d776271987" Jul 9 13:02:35.823703 containerd[1624]: time="2025-07-09T13:02:35.823436979Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:02:35.824165 containerd[1624]: time="2025-07-09T13:02:35.824140174Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.2: active requests=0, bytes read=70436221" Jul 9 13:02:35.824421 containerd[1624]: time="2025-07-09T13:02:35.824406256Z" level=info msg="ImageCreate event name:\"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:02:35.825839 containerd[1624]: time="2025-07-09T13:02:35.825819913Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:02:35.826254 containerd[1624]: time="2025-07-09T13:02:35.826232672Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.2\" with image id \"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\", size \"71928924\" in 5.836794557s" Jul 9 13:02:35.826291 containerd[1624]: time="2025-07-09T13:02:35.826257115Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\" returns image reference \"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\"" Jul 9 13:02:35.831425 containerd[1624]: time="2025-07-09T13:02:35.831398456Z" level=info msg="CreateContainer within sandbox \"b87d8a326b1791202cdf041daca2fb66884f6815a4232899099c72d00e15040d\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jul 9 13:02:35.837099 containerd[1624]: time="2025-07-09T13:02:35.835758180Z" level=info msg="Container 80f613c0895232cf717e440402beab99aea0b70a8e9670778c8ceaebb47d08a6: CDI devices from CRI Config.CDIDevices: []" Jul 9 13:02:35.855920 containerd[1624]: time="2025-07-09T13:02:35.855869648Z" level=info msg="CreateContainer within sandbox 
\"b87d8a326b1791202cdf041daca2fb66884f6815a4232899099c72d00e15040d\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"80f613c0895232cf717e440402beab99aea0b70a8e9670778c8ceaebb47d08a6\"" Jul 9 13:02:35.856576 containerd[1624]: time="2025-07-09T13:02:35.856285621Z" level=info msg="StartContainer for \"80f613c0895232cf717e440402beab99aea0b70a8e9670778c8ceaebb47d08a6\"" Jul 9 13:02:35.858063 containerd[1624]: time="2025-07-09T13:02:35.858021637Z" level=info msg="connecting to shim 80f613c0895232cf717e440402beab99aea0b70a8e9670778c8ceaebb47d08a6" address="unix:///run/containerd/s/b088aaf54cba39f0c3369a7d1cb6ff43992596cccd0b181b5208be8be740f28c" protocol=ttrpc version=3 Jul 9 13:02:35.881849 systemd[1]: Started cri-containerd-80f613c0895232cf717e440402beab99aea0b70a8e9670778c8ceaebb47d08a6.scope - libcontainer container 80f613c0895232cf717e440402beab99aea0b70a8e9670778c8ceaebb47d08a6. Jul 9 13:02:35.910348 kubelet[2942]: E0709 13:02:35.910291 2942 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-nhvjr" podUID="4941ee09-024b-44fc-8ad7-71d776271987" Jul 9 13:02:35.928758 containerd[1624]: time="2025-07-09T13:02:35.928725934Z" level=info msg="StartContainer for \"80f613c0895232cf717e440402beab99aea0b70a8e9670778c8ceaebb47d08a6\" returns successfully" Jul 9 13:02:37.901727 kubelet[2942]: E0709 13:02:37.901685 2942 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-nhvjr" podUID="4941ee09-024b-44fc-8ad7-71d776271987" Jul 9 13:02:37.928096 systemd[1]: cri-containerd-80f613c0895232cf717e440402beab99aea0b70a8e9670778c8ceaebb47d08a6.scope: Deactivated successfully. Jul 9 13:02:37.928563 systemd[1]: cri-containerd-80f613c0895232cf717e440402beab99aea0b70a8e9670778c8ceaebb47d08a6.scope: Consumed 324ms CPU time, 163.4M memory peak, 28K read from disk, 171.2M written to disk. Jul 9 13:02:37.961536 containerd[1624]: time="2025-07-09T13:02:37.961512731Z" level=info msg="received exit event container_id:\"80f613c0895232cf717e440402beab99aea0b70a8e9670778c8ceaebb47d08a6\" id:\"80f613c0895232cf717e440402beab99aea0b70a8e9670778c8ceaebb47d08a6\" pid:3773 exited_at:{seconds:1752066157 nanos:961325540}" Jul 9 13:02:37.962123 containerd[1624]: time="2025-07-09T13:02:37.962106268Z" level=info msg="TaskExit event in podsandbox handler container_id:\"80f613c0895232cf717e440402beab99aea0b70a8e9670778c8ceaebb47d08a6\" id:\"80f613c0895232cf717e440402beab99aea0b70a8e9670778c8ceaebb47d08a6\" pid:3773 exited_at:{seconds:1752066157 nanos:961325540}" Jul 9 13:02:37.996039 kubelet[2942]: I0709 13:02:37.995626 2942 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Jul 9 13:02:38.013208 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-80f613c0895232cf717e440402beab99aea0b70a8e9670778c8ceaebb47d08a6-rootfs.mount: Deactivated successfully. Jul 9 13:02:38.333049 systemd[1]: Created slice kubepods-besteffort-pode2d6563d_91df_4d1a_83da_42d96468fe10.slice - libcontainer container kubepods-besteffort-pode2d6563d_91df_4d1a_83da_42d96468fe10.slice. 
Jul 9 13:02:38.334388 kubelet[2942]: W0709 13:02:38.334018 2942 reflector.go:569] object-"calico-system"/"whisker-ca-bundle": failed to list *v1.ConfigMap: configmaps "whisker-ca-bundle" is forbidden: User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'localhost' and this object Jul 9 13:02:38.335771 kubelet[2942]: E0709 13:02:38.335715 2942 reflector.go:166] "Unhandled Error" err="object-\"calico-system\"/\"whisker-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"whisker-ca-bundle\" is forbidden: User \"system:node:localhost\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'localhost' and this object" logger="UnhandledError" Jul 9 13:02:38.342307 systemd[1]: Created slice kubepods-besteffort-podf89946af_f0dd_4143_a764_f7e14a7dc9ea.slice - libcontainer container kubepods-besteffort-podf89946af_f0dd_4143_a764_f7e14a7dc9ea.slice. Jul 9 13:02:38.348915 systemd[1]: Created slice kubepods-burstable-pode7f4ec8f_6937_4878_a69c_40587e0b905d.slice - libcontainer container kubepods-burstable-pode7f4ec8f_6937_4878_a69c_40587e0b905d.slice. Jul 9 13:02:38.358625 systemd[1]: Created slice kubepods-besteffort-pod8e21c81c_3cf2_4c7c_a915_e968854971d4.slice - libcontainer container kubepods-besteffort-pod8e21c81c_3cf2_4c7c_a915_e968854971d4.slice. Jul 9 13:02:38.364214 systemd[1]: Created slice kubepods-burstable-pod852b8fad_39a8_4dd8_ae90_8162593e4973.slice - libcontainer container kubepods-burstable-pod852b8fad_39a8_4dd8_ae90_8162593e4973.slice. Jul 9 13:02:38.371588 systemd[1]: Created slice kubepods-besteffort-pod385d03ba_5657_4ba7_b837_dbc26e99cf88.slice - libcontainer container kubepods-besteffort-pod385d03ba_5657_4ba7_b837_dbc26e99cf88.slice. Jul 9 13:02:38.377047 systemd[1]: Created slice kubepods-besteffort-pod0eabfdb8_41ba_4268_b70a_42c305153461.slice - libcontainer container kubepods-besteffort-pod0eabfdb8_41ba_4268_b70a_42c305153461.slice. 
Jul 9 13:02:38.388918 kubelet[2942]: I0709 13:02:38.388888 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/385d03ba-5657-4ba7-b837-dbc26e99cf88-calico-apiserver-certs\") pod \"calico-apiserver-697d57c56-9g4vj\" (UID: \"385d03ba-5657-4ba7-b837-dbc26e99cf88\") " pod="calico-apiserver/calico-apiserver-697d57c56-9g4vj" Jul 9 13:02:38.388918 kubelet[2942]: I0709 13:02:38.388917 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hf94\" (UniqueName: \"kubernetes.io/projected/e7f4ec8f-6937-4878-a69c-40587e0b905d-kube-api-access-5hf94\") pod \"coredns-668d6bf9bc-bpjdp\" (UID: \"e7f4ec8f-6937-4878-a69c-40587e0b905d\") " pod="kube-system/coredns-668d6bf9bc-bpjdp" Jul 9 13:02:38.389036 kubelet[2942]: I0709 13:02:38.388932 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc2cx\" (UniqueName: \"kubernetes.io/projected/0eabfdb8-41ba-4268-b70a-42c305153461-kube-api-access-nc2cx\") pod \"calico-kube-controllers-6b449fff5d-5tg2l\" (UID: \"0eabfdb8-41ba-4268-b70a-42c305153461\") " pod="calico-system/calico-kube-controllers-6b449fff5d-5tg2l" Jul 9 13:02:38.389036 kubelet[2942]: I0709 13:02:38.388944 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctmcq\" (UniqueName: \"kubernetes.io/projected/8e21c81c-3cf2-4c7c-a915-e968854971d4-kube-api-access-ctmcq\") pod \"goldmane-768f4c5c69-2brrp\" (UID: \"8e21c81c-3cf2-4c7c-a915-e968854971d4\") " pod="calico-system/goldmane-768f4c5c69-2brrp" Jul 9 13:02:38.389036 kubelet[2942]: I0709 13:02:38.388955 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/f89946af-f0dd-4143-a764-f7e14a7dc9ea-whisker-backend-key-pair\") pod \"whisker-5dbc8bf55b-2ptzt\" (UID: \"f89946af-f0dd-4143-a764-f7e14a7dc9ea\") " pod="calico-system/whisker-5dbc8bf55b-2ptzt" Jul 9 13:02:38.389036 kubelet[2942]: I0709 13:02:38.388963 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2jxw\" (UniqueName: \"kubernetes.io/projected/385d03ba-5657-4ba7-b837-dbc26e99cf88-kube-api-access-h2jxw\") pod \"calico-apiserver-697d57c56-9g4vj\" (UID: \"385d03ba-5657-4ba7-b837-dbc26e99cf88\") " pod="calico-apiserver/calico-apiserver-697d57c56-9g4vj" Jul 9 13:02:38.389036 kubelet[2942]: I0709 13:02:38.388975 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e21c81c-3cf2-4c7c-a915-e968854971d4-config\") pod \"goldmane-768f4c5c69-2brrp\" (UID: \"8e21c81c-3cf2-4c7c-a915-e968854971d4\") " pod="calico-system/goldmane-768f4c5c69-2brrp" Jul 9 13:02:38.389139 kubelet[2942]: I0709 13:02:38.388984 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64s5h\" (UniqueName: \"kubernetes.io/projected/f89946af-f0dd-4143-a764-f7e14a7dc9ea-kube-api-access-64s5h\") pod \"whisker-5dbc8bf55b-2ptzt\" (UID: \"f89946af-f0dd-4143-a764-f7e14a7dc9ea\") " pod="calico-system/whisker-5dbc8bf55b-2ptzt" Jul 9 13:02:38.389139 kubelet[2942]: I0709 13:02:38.388995 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: 
\"kubernetes.io/secret/e2d6563d-91df-4d1a-83da-42d96468fe10-calico-apiserver-certs\") pod \"calico-apiserver-697d57c56-499xq\" (UID: \"e2d6563d-91df-4d1a-83da-42d96468fe10\") " pod="calico-apiserver/calico-apiserver-697d57c56-499xq" Jul 9 13:02:38.389139 kubelet[2942]: I0709 13:02:38.389004 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e21c81c-3cf2-4c7c-a915-e968854971d4-goldmane-ca-bundle\") pod \"goldmane-768f4c5c69-2brrp\" (UID: \"8e21c81c-3cf2-4c7c-a915-e968854971d4\") " pod="calico-system/goldmane-768f4c5c69-2brrp" Jul 9 13:02:38.389139 kubelet[2942]: I0709 13:02:38.389020 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f89946af-f0dd-4143-a764-f7e14a7dc9ea-whisker-ca-bundle\") pod \"whisker-5dbc8bf55b-2ptzt\" (UID: \"f89946af-f0dd-4143-a764-f7e14a7dc9ea\") " pod="calico-system/whisker-5dbc8bf55b-2ptzt" Jul 9 13:02:38.389139 kubelet[2942]: I0709 13:02:38.389033 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/852b8fad-39a8-4dd8-ae90-8162593e4973-config-volume\") pod \"coredns-668d6bf9bc-4c2t7\" (UID: \"852b8fad-39a8-4dd8-ae90-8162593e4973\") " pod="kube-system/coredns-668d6bf9bc-4c2t7" Jul 9 13:02:38.389244 kubelet[2942]: I0709 13:02:38.389043 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7f4ec8f-6937-4878-a69c-40587e0b905d-config-volume\") pod \"coredns-668d6bf9bc-bpjdp\" (UID: \"e7f4ec8f-6937-4878-a69c-40587e0b905d\") " pod="kube-system/coredns-668d6bf9bc-bpjdp" Jul 9 13:02:38.389244 kubelet[2942]: I0709 13:02:38.389054 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/8e21c81c-3cf2-4c7c-a915-e968854971d4-goldmane-key-pair\") pod \"goldmane-768f4c5c69-2brrp\" (UID: \"8e21c81c-3cf2-4c7c-a915-e968854971d4\") " pod="calico-system/goldmane-768f4c5c69-2brrp" Jul 9 13:02:38.389244 kubelet[2942]: I0709 13:02:38.389065 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkjgr\" (UniqueName: \"kubernetes.io/projected/852b8fad-39a8-4dd8-ae90-8162593e4973-kube-api-access-gkjgr\") pod \"coredns-668d6bf9bc-4c2t7\" (UID: \"852b8fad-39a8-4dd8-ae90-8162593e4973\") " pod="kube-system/coredns-668d6bf9bc-4c2t7" Jul 9 13:02:38.389244 kubelet[2942]: I0709 13:02:38.389078 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eabfdb8-41ba-4268-b70a-42c305153461-tigera-ca-bundle\") pod \"calico-kube-controllers-6b449fff5d-5tg2l\" (UID: \"0eabfdb8-41ba-4268-b70a-42c305153461\") " pod="calico-system/calico-kube-controllers-6b449fff5d-5tg2l" Jul 9 13:02:38.389244 kubelet[2942]: I0709 13:02:38.389093 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f526s\" (UniqueName: \"kubernetes.io/projected/e2d6563d-91df-4d1a-83da-42d96468fe10-kube-api-access-f526s\") pod \"calico-apiserver-697d57c56-499xq\" (UID: \"e2d6563d-91df-4d1a-83da-42d96468fe10\") " pod="calico-apiserver/calico-apiserver-697d57c56-499xq" Jul 9 13:02:38.644176 
containerd[1624]: time="2025-07-09T13:02:38.644101609Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-697d57c56-499xq,Uid:e2d6563d-91df-4d1a-83da-42d96468fe10,Namespace:calico-apiserver,Attempt:0,}" Jul 9 13:02:38.654046 containerd[1624]: time="2025-07-09T13:02:38.654013525Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-bpjdp,Uid:e7f4ec8f-6937-4878-a69c-40587e0b905d,Namespace:kube-system,Attempt:0,}" Jul 9 13:02:38.682265 containerd[1624]: time="2025-07-09T13:02:38.682213631Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-2brrp,Uid:8e21c81c-3cf2-4c7c-a915-e968854971d4,Namespace:calico-system,Attempt:0,}" Jul 9 13:02:38.682340 containerd[1624]: time="2025-07-09T13:02:38.682293658Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6b449fff5d-5tg2l,Uid:0eabfdb8-41ba-4268-b70a-42c305153461,Namespace:calico-system,Attempt:0,}" Jul 9 13:02:38.689687 containerd[1624]: time="2025-07-09T13:02:38.689515508Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-4c2t7,Uid:852b8fad-39a8-4dd8-ae90-8162593e4973,Namespace:kube-system,Attempt:0,}" Jul 9 13:02:38.689687 containerd[1624]: time="2025-07-09T13:02:38.689634472Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-697d57c56-9g4vj,Uid:385d03ba-5657-4ba7-b837-dbc26e99cf88,Namespace:calico-apiserver,Attempt:0,}" Jul 9 13:02:39.234149 containerd[1624]: time="2025-07-09T13:02:39.234042933Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\"" Jul 9 13:02:39.502721 kubelet[2942]: E0709 13:02:39.502554 2942 configmap.go:193] Couldn't get configMap calico-system/whisker-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Jul 9 13:02:39.524119 kubelet[2942]: E0709 13:02:39.524095 2942 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f89946af-f0dd-4143-a764-f7e14a7dc9ea-whisker-ca-bundle podName:f89946af-f0dd-4143-a764-f7e14a7dc9ea nodeName:}" failed. No retries permitted until 2025-07-09 13:02:40.00260532 +0000 UTC m=+43.266370831 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "whisker-ca-bundle" (UniqueName: "kubernetes.io/configmap/f89946af-f0dd-4143-a764-f7e14a7dc9ea-whisker-ca-bundle") pod "whisker-5dbc8bf55b-2ptzt" (UID: "f89946af-f0dd-4143-a764-f7e14a7dc9ea") : failed to sync configmap cache: timed out waiting for the condition Jul 9 13:02:39.753688 containerd[1624]: time="2025-07-09T13:02:39.753535895Z" level=error msg="Failed to destroy network for sandbox \"404d857ddcfbfe6033b38705ab577078944786e6616cb05ab9e64bafc3178742\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 13:02:39.759544 containerd[1624]: time="2025-07-09T13:02:39.756966783Z" level=error msg="Failed to destroy network for sandbox \"2672215f69ad06f2666efd4ca63052d03262d04b7d0cf68787d9e6d7bc121c43\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 13:02:39.759544 containerd[1624]: time="2025-07-09T13:02:39.759179599Z" level=error msg="Failed to destroy network for sandbox \"2d1aa2d6846dd43142a9a6b3892db3a66fcbfb676018be358acd5240a230f6c1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 13:02:39.755427 systemd[1]: run-netns-cni\x2df22ac268\x2d84cd\x2d792b\x2d0ff2\x2da846d1624a8c.mount: Deactivated successfully. Jul 9 13:02:39.758206 systemd[1]: run-netns-cni\x2da6176bb5\x2dbe84\x2d5aca\x2de7e8\x2df10bfa416de1.mount: Deactivated successfully. Jul 9 13:02:39.760577 systemd[1]: run-netns-cni\x2dd686934a\x2d5072\x2d4fe6\x2d9f3b\x2dcaddafbb9ea3.mount: Deactivated successfully. 
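
Editor's note: the nestedpendingoperations entry above schedules the next MountVolume.SetUp attempt 500 ms out ("durationBeforeRetry 500ms"); repeated failures are retried with a growing delay. A minimal sketch of such a doubling schedule, where only the 500 ms start comes from the log; the factor of 2 and the roughly two-minute cap are assumptions for illustration, not values shown here:

    from datetime import timedelta

    # Sketch of an exponential retry schedule like the one implied by
    # "(durationBeforeRetry 500ms)". Initial delay from the log; the doubling
    # factor and the cap are assumed for illustration.
    def backoff_delays(initial=timedelta(milliseconds=500),
                       factor=2,
                       cap=timedelta(minutes=2, seconds=2),
                       attempts=8):
        delay = initial
        for _ in range(attempts):
            yield delay
            delay = min(delay * factor, cap)

    print([str(d) for d in backoff_delays()])
    # delays double: 0.5 s, 1 s, 2 s, 4 s, ... until the assumed cap
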
Jul 9 13:02:39.761094 containerd[1624]: time="2025-07-09T13:02:39.760985747Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-4c2t7,Uid:852b8fad-39a8-4dd8-ae90-8162593e4973,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"404d857ddcfbfe6033b38705ab577078944786e6616cb05ab9e64bafc3178742\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 13:02:39.764799 containerd[1624]: time="2025-07-09T13:02:39.764779209Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-697d57c56-499xq,Uid:e2d6563d-91df-4d1a-83da-42d96468fe10,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2672215f69ad06f2666efd4ca63052d03262d04b7d0cf68787d9e6d7bc121c43\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 13:02:39.768768 kubelet[2942]: E0709 13:02:39.767782 2942 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"404d857ddcfbfe6033b38705ab577078944786e6616cb05ab9e64bafc3178742\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 13:02:39.768768 kubelet[2942]: E0709 13:02:39.767839 2942 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"404d857ddcfbfe6033b38705ab577078944786e6616cb05ab9e64bafc3178742\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-4c2t7" Jul 9 13:02:39.768768 kubelet[2942]: E0709 13:02:39.767853 2942 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"404d857ddcfbfe6033b38705ab577078944786e6616cb05ab9e64bafc3178742\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-4c2t7" Jul 9 13:02:39.768873 containerd[1624]: time="2025-07-09T13:02:39.768126083Z" level=error msg="Failed to destroy network for sandbox \"6e64b20dbe2b259266fa049af6167df6f10de3d6074c7fbc87d46a0b136d3d07\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 13:02:39.768896 kubelet[2942]: E0709 13:02:39.767887 2942 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-4c2t7_kube-system(852b8fad-39a8-4dd8-ae90-8162593e4973)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-4c2t7_kube-system(852b8fad-39a8-4dd8-ae90-8162593e4973)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"404d857ddcfbfe6033b38705ab577078944786e6616cb05ab9e64bafc3178742\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-4c2t7" podUID="852b8fad-39a8-4dd8-ae90-8162593e4973" Jul 9 13:02:39.768896 kubelet[2942]: E0709 13:02:39.768065 2942 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2672215f69ad06f2666efd4ca63052d03262d04b7d0cf68787d9e6d7bc121c43\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 13:02:39.768896 kubelet[2942]: E0709 13:02:39.768709 2942 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2672215f69ad06f2666efd4ca63052d03262d04b7d0cf68787d9e6d7bc121c43\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-697d57c56-499xq" Jul 9 13:02:39.768980 kubelet[2942]: E0709 13:02:39.768725 2942 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2672215f69ad06f2666efd4ca63052d03262d04b7d0cf68787d9e6d7bc121c43\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-697d57c56-499xq" Jul 9 13:02:39.768980 kubelet[2942]: E0709 13:02:39.768746 2942 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-697d57c56-499xq_calico-apiserver(e2d6563d-91df-4d1a-83da-42d96468fe10)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-697d57c56-499xq_calico-apiserver(e2d6563d-91df-4d1a-83da-42d96468fe10)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2672215f69ad06f2666efd4ca63052d03262d04b7d0cf68787d9e6d7bc121c43\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-697d57c56-499xq" podUID="e2d6563d-91df-4d1a-83da-42d96468fe10" Jul 9 13:02:39.769807 containerd[1624]: time="2025-07-09T13:02:39.769786926Z" level=error msg="Failed to destroy network for sandbox \"f489d9d7be728b0839ae302c9e3b8b4bd80ee306f8edaad4348da7acbfdf186a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 13:02:39.770511 systemd[1]: run-netns-cni\x2dbf5b7a75\x2d526c\x2d42b6\x2df8ab\x2d35b1106ebbd6.mount: Deactivated successfully. 
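
Editor's note: every sandbox failure in this stretch traces back to the same condition: the CNI plugin stats /var/lib/calico/nodename, a file that calico/node writes once it is running, and the file is not there yet. A minimal host-side check of that same path, using only the location quoted in the errors:

    import os

    # Reproduce the check the Calico CNI plugin is failing on above:
    # it stats /var/lib/calico/nodename, which calico/node creates at startup.
    NODENAME = "/var/lib/calico/nodename"

    if os.path.exists(NODENAME):
        with open(NODENAME) as f:
            print("calico/node has registered this node as:", f.read().strip())
    else:
        # Matches the "(add): stat /var/lib/calico/nodename: no such file or
        # directory" errors: calico/node is not up (or the hostPath mount is
        # missing), so every sandbox add/delete fails until it is.
        print("missing", NODENAME, "- calico/node not ready on this host")
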
Jul 9 13:02:39.772003 containerd[1624]: time="2025-07-09T13:02:39.771981930Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-697d57c56-9g4vj,Uid:385d03ba-5657-4ba7-b837-dbc26e99cf88,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2d1aa2d6846dd43142a9a6b3892db3a66fcbfb676018be358acd5240a230f6c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 13:02:39.772861 kubelet[2942]: E0709 13:02:39.772743 2942 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2d1aa2d6846dd43142a9a6b3892db3a66fcbfb676018be358acd5240a230f6c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 13:02:39.772861 kubelet[2942]: E0709 13:02:39.772790 2942 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2d1aa2d6846dd43142a9a6b3892db3a66fcbfb676018be358acd5240a230f6c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-697d57c56-9g4vj" Jul 9 13:02:39.772861 kubelet[2942]: E0709 13:02:39.772805 2942 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2d1aa2d6846dd43142a9a6b3892db3a66fcbfb676018be358acd5240a230f6c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-697d57c56-9g4vj" Jul 9 13:02:39.772943 kubelet[2942]: E0709 13:02:39.772838 2942 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-697d57c56-9g4vj_calico-apiserver(385d03ba-5657-4ba7-b837-dbc26e99cf88)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-697d57c56-9g4vj_calico-apiserver(385d03ba-5657-4ba7-b837-dbc26e99cf88)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2d1aa2d6846dd43142a9a6b3892db3a66fcbfb676018be358acd5240a230f6c1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-697d57c56-9g4vj" podUID="385d03ba-5657-4ba7-b837-dbc26e99cf88" Jul 9 13:02:39.773629 containerd[1624]: time="2025-07-09T13:02:39.773611276Z" level=error msg="Failed to destroy network for sandbox \"4aded1eef238b389373a3975b6ca3df6e6cea4aaca3c004a069d690490baa16b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 13:02:39.776103 containerd[1624]: time="2025-07-09T13:02:39.776079636Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6b449fff5d-5tg2l,Uid:0eabfdb8-41ba-4268-b70a-42c305153461,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code 
= Unknown desc = failed to setup network for sandbox \"6e64b20dbe2b259266fa049af6167df6f10de3d6074c7fbc87d46a0b136d3d07\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 13:02:39.776313 kubelet[2942]: E0709 13:02:39.776178 2942 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6e64b20dbe2b259266fa049af6167df6f10de3d6074c7fbc87d46a0b136d3d07\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 13:02:39.776313 kubelet[2942]: E0709 13:02:39.776205 2942 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6e64b20dbe2b259266fa049af6167df6f10de3d6074c7fbc87d46a0b136d3d07\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6b449fff5d-5tg2l" Jul 9 13:02:39.776313 kubelet[2942]: E0709 13:02:39.776216 2942 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6e64b20dbe2b259266fa049af6167df6f10de3d6074c7fbc87d46a0b136d3d07\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6b449fff5d-5tg2l" Jul 9 13:02:39.776381 kubelet[2942]: E0709 13:02:39.776239 2942 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6b449fff5d-5tg2l_calico-system(0eabfdb8-41ba-4268-b70a-42c305153461)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6b449fff5d-5tg2l_calico-system(0eabfdb8-41ba-4268-b70a-42c305153461)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6e64b20dbe2b259266fa049af6167df6f10de3d6074c7fbc87d46a0b136d3d07\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6b449fff5d-5tg2l" podUID="0eabfdb8-41ba-4268-b70a-42c305153461" Jul 9 13:02:39.781750 containerd[1624]: time="2025-07-09T13:02:39.780924507Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-2brrp,Uid:8e21c81c-3cf2-4c7c-a915-e968854971d4,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f489d9d7be728b0839ae302c9e3b8b4bd80ee306f8edaad4348da7acbfdf186a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 13:02:39.781816 kubelet[2942]: E0709 13:02:39.781420 2942 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f489d9d7be728b0839ae302c9e3b8b4bd80ee306f8edaad4348da7acbfdf186a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that 
the calico/node container is running and has mounted /var/lib/calico/" Jul 9 13:02:39.781816 kubelet[2942]: E0709 13:02:39.781535 2942 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f489d9d7be728b0839ae302c9e3b8b4bd80ee306f8edaad4348da7acbfdf186a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-768f4c5c69-2brrp" Jul 9 13:02:39.781816 kubelet[2942]: E0709 13:02:39.781548 2942 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f489d9d7be728b0839ae302c9e3b8b4bd80ee306f8edaad4348da7acbfdf186a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-768f4c5c69-2brrp" Jul 9 13:02:39.781884 kubelet[2942]: E0709 13:02:39.781582 2942 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-768f4c5c69-2brrp_calico-system(8e21c81c-3cf2-4c7c-a915-e968854971d4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-768f4c5c69-2brrp_calico-system(8e21c81c-3cf2-4c7c-a915-e968854971d4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f489d9d7be728b0839ae302c9e3b8b4bd80ee306f8edaad4348da7acbfdf186a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-768f4c5c69-2brrp" podUID="8e21c81c-3cf2-4c7c-a915-e968854971d4" Jul 9 13:02:39.784990 containerd[1624]: time="2025-07-09T13:02:39.784938113Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-bpjdp,Uid:e7f4ec8f-6937-4878-a69c-40587e0b905d,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4aded1eef238b389373a3975b6ca3df6e6cea4aaca3c004a069d690490baa16b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 13:02:39.785148 kubelet[2942]: E0709 13:02:39.785129 2942 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4aded1eef238b389373a3975b6ca3df6e6cea4aaca3c004a069d690490baa16b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 13:02:39.785216 kubelet[2942]: E0709 13:02:39.785184 2942 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4aded1eef238b389373a3975b6ca3df6e6cea4aaca3c004a069d690490baa16b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-bpjdp" Jul 9 13:02:39.785216 kubelet[2942]: E0709 13:02:39.785210 2942 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"4aded1eef238b389373a3975b6ca3df6e6cea4aaca3c004a069d690490baa16b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-bpjdp" Jul 9 13:02:39.785266 kubelet[2942]: E0709 13:02:39.785233 2942 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-bpjdp_kube-system(e7f4ec8f-6937-4878-a69c-40587e0b905d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-bpjdp_kube-system(e7f4ec8f-6937-4878-a69c-40587e0b905d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4aded1eef238b389373a3975b6ca3df6e6cea4aaca3c004a069d690490baa16b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-bpjdp" podUID="e7f4ec8f-6937-4878-a69c-40587e0b905d" Jul 9 13:02:39.904924 systemd[1]: Created slice kubepods-besteffort-pod4941ee09_024b_44fc_8ad7_71d776271987.slice - libcontainer container kubepods-besteffort-pod4941ee09_024b_44fc_8ad7_71d776271987.slice. Jul 9 13:02:39.906665 containerd[1624]: time="2025-07-09T13:02:39.906645139Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-nhvjr,Uid:4941ee09-024b-44fc-8ad7-71d776271987,Namespace:calico-system,Attempt:0,}" Jul 9 13:02:39.948453 containerd[1624]: time="2025-07-09T13:02:39.948417511Z" level=error msg="Failed to destroy network for sandbox \"7ae551bc5f959ac4eebb2de0af10be7f5102c69e11120bef7cd60aa3e4c18698\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 13:02:39.956042 containerd[1624]: time="2025-07-09T13:02:39.956012794Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-nhvjr,Uid:4941ee09-024b-44fc-8ad7-71d776271987,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7ae551bc5f959ac4eebb2de0af10be7f5102c69e11120bef7cd60aa3e4c18698\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 13:02:39.956216 kubelet[2942]: E0709 13:02:39.956198 2942 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7ae551bc5f959ac4eebb2de0af10be7f5102c69e11120bef7cd60aa3e4c18698\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 13:02:39.956469 kubelet[2942]: E0709 13:02:39.956278 2942 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7ae551bc5f959ac4eebb2de0af10be7f5102c69e11120bef7cd60aa3e4c18698\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-nhvjr" Jul 9 13:02:39.956469 kubelet[2942]: E0709 13:02:39.956293 2942 kuberuntime_manager.go:1237] "CreatePodSandbox 
for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7ae551bc5f959ac4eebb2de0af10be7f5102c69e11120bef7cd60aa3e4c18698\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-nhvjr" Jul 9 13:02:39.956469 kubelet[2942]: E0709 13:02:39.956321 2942 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-nhvjr_calico-system(4941ee09-024b-44fc-8ad7-71d776271987)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-nhvjr_calico-system(4941ee09-024b-44fc-8ad7-71d776271987)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7ae551bc5f959ac4eebb2de0af10be7f5102c69e11120bef7cd60aa3e4c18698\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-nhvjr" podUID="4941ee09-024b-44fc-8ad7-71d776271987" Jul 9 13:02:40.013434 systemd[1]: run-netns-cni\x2d13d9f668\x2d2029\x2dfdd1\x2d8937\x2d4f91b3662b95.mount: Deactivated successfully. Jul 9 13:02:40.013492 systemd[1]: run-netns-cni\x2d365ed19c\x2d0dcf\x2dace6\x2d7ad9\x2d9cf2f6316eff.mount: Deactivated successfully. Jul 9 13:02:40.145283 containerd[1624]: time="2025-07-09T13:02:40.145240394Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5dbc8bf55b-2ptzt,Uid:f89946af-f0dd-4143-a764-f7e14a7dc9ea,Namespace:calico-system,Attempt:0,}" Jul 9 13:02:40.185563 containerd[1624]: time="2025-07-09T13:02:40.185530041Z" level=error msg="Failed to destroy network for sandbox \"7fef4f239e894f1f4b32f25edf9a15a6da6d528ba002084556fcd3b64a04a833\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 13:02:40.186922 systemd[1]: run-netns-cni\x2d9a1bc07d\x2d684d\x2dc0ba\x2d3806\x2d85ec4b785e11.mount: Deactivated successfully. 
Jul 9 13:02:40.190928 containerd[1624]: time="2025-07-09T13:02:40.190897822Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5dbc8bf55b-2ptzt,Uid:f89946af-f0dd-4143-a764-f7e14a7dc9ea,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7fef4f239e894f1f4b32f25edf9a15a6da6d528ba002084556fcd3b64a04a833\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 13:02:40.191158 kubelet[2942]: E0709 13:02:40.191132 2942 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7fef4f239e894f1f4b32f25edf9a15a6da6d528ba002084556fcd3b64a04a833\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 13:02:40.191277 kubelet[2942]: E0709 13:02:40.191199 2942 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7fef4f239e894f1f4b32f25edf9a15a6da6d528ba002084556fcd3b64a04a833\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5dbc8bf55b-2ptzt" Jul 9 13:02:40.191277 kubelet[2942]: E0709 13:02:40.191214 2942 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7fef4f239e894f1f4b32f25edf9a15a6da6d528ba002084556fcd3b64a04a833\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5dbc8bf55b-2ptzt" Jul 9 13:02:40.191346 kubelet[2942]: E0709 13:02:40.191333 2942 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-5dbc8bf55b-2ptzt_calico-system(f89946af-f0dd-4143-a764-f7e14a7dc9ea)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-5dbc8bf55b-2ptzt_calico-system(f89946af-f0dd-4143-a764-f7e14a7dc9ea)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7fef4f239e894f1f4b32f25edf9a15a6da6d528ba002084556fcd3b64a04a833\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5dbc8bf55b-2ptzt" podUID="f89946af-f0dd-4143-a764-f7e14a7dc9ea" Jul 9 13:02:50.106886 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1747832826.mount: Deactivated successfully. 
Jul 9 13:02:50.361947 containerd[1624]: time="2025-07-09T13:02:50.361768777Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.2: active requests=0, bytes read=158500163" Jul 9 13:02:50.362587 containerd[1624]: time="2025-07-09T13:02:50.351848580Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:02:50.375219 containerd[1624]: time="2025-07-09T13:02:50.375161407Z" level=info msg="ImageCreate event name:\"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:02:50.415028 containerd[1624]: time="2025-07-09T13:02:50.414858760Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:02:50.416495 containerd[1624]: time="2025-07-09T13:02:50.416472310Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.2\" with image id \"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\", size \"158500025\" in 11.18055093s" Jul 9 13:02:50.416534 containerd[1624]: time="2025-07-09T13:02:50.416495378Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\" returns image reference \"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\"" Jul 9 13:02:50.462076 containerd[1624]: time="2025-07-09T13:02:50.461982210Z" level=info msg="CreateContainer within sandbox \"b87d8a326b1791202cdf041daca2fb66884f6815a4232899099c72d00e15040d\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jul 9 13:02:50.551663 containerd[1624]: time="2025-07-09T13:02:50.550901474Z" level=info msg="Container e77ceae0c93cfcb1c385d466afd051672a8ee5655ed0c189ca239303d947007f: CDI devices from CRI Config.CDIDevices: []" Jul 9 13:02:50.551943 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount936854984.mount: Deactivated successfully. Jul 9 13:02:50.577089 containerd[1624]: time="2025-07-09T13:02:50.577061915Z" level=info msg="CreateContainer within sandbox \"b87d8a326b1791202cdf041daca2fb66884f6815a4232899099c72d00e15040d\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"e77ceae0c93cfcb1c385d466afd051672a8ee5655ed0c189ca239303d947007f\"" Jul 9 13:02:50.577893 containerd[1624]: time="2025-07-09T13:02:50.577877837Z" level=info msg="StartContainer for \"e77ceae0c93cfcb1c385d466afd051672a8ee5655ed0c189ca239303d947007f\"" Jul 9 13:02:50.585371 containerd[1624]: time="2025-07-09T13:02:50.585352407Z" level=info msg="connecting to shim e77ceae0c93cfcb1c385d466afd051672a8ee5655ed0c189ca239303d947007f" address="unix:///run/containerd/s/b088aaf54cba39f0c3369a7d1cb6ff43992596cccd0b181b5208be8be740f28c" protocol=ttrpc version=3 Jul 9 13:02:50.618872 systemd[1]: Started cri-containerd-e77ceae0c93cfcb1c385d466afd051672a8ee5655ed0c189ca239303d947007f.scope - libcontainer container e77ceae0c93cfcb1c385d466afd051672a8ee5655ed0c189ca239303d947007f. Jul 9 13:02:50.652830 containerd[1624]: time="2025-07-09T13:02:50.652780526Z" level=info msg="StartContainer for \"e77ceae0c93cfcb1c385d466afd051672a8ee5655ed0c189ca239303d947007f\" returns successfully" Jul 9 13:02:50.730715 kernel: wireguard: WireGuard 1.0.0 loaded. 
See www.wireguard.com for information. Jul 9 13:02:50.743284 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Jul 9 13:02:50.906589 containerd[1624]: time="2025-07-09T13:02:50.906516745Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-nhvjr,Uid:4941ee09-024b-44fc-8ad7-71d776271987,Namespace:calico-system,Attempt:0,}" Jul 9 13:02:51.074974 kubelet[2942]: I0709 13:02:51.074944 2942 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/f89946af-f0dd-4143-a764-f7e14a7dc9ea-whisker-backend-key-pair\") pod \"f89946af-f0dd-4143-a764-f7e14a7dc9ea\" (UID: \"f89946af-f0dd-4143-a764-f7e14a7dc9ea\") " Jul 9 13:02:51.074974 kubelet[2942]: I0709 13:02:51.074974 2942 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64s5h\" (UniqueName: \"kubernetes.io/projected/f89946af-f0dd-4143-a764-f7e14a7dc9ea-kube-api-access-64s5h\") pod \"f89946af-f0dd-4143-a764-f7e14a7dc9ea\" (UID: \"f89946af-f0dd-4143-a764-f7e14a7dc9ea\") " Jul 9 13:02:51.075572 kubelet[2942]: I0709 13:02:51.074996 2942 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f89946af-f0dd-4143-a764-f7e14a7dc9ea-whisker-ca-bundle\") pod \"f89946af-f0dd-4143-a764-f7e14a7dc9ea\" (UID: \"f89946af-f0dd-4143-a764-f7e14a7dc9ea\") " Jul 9 13:02:51.075572 kubelet[2942]: I0709 13:02:51.075236 2942 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f89946af-f0dd-4143-a764-f7e14a7dc9ea-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "f89946af-f0dd-4143-a764-f7e14a7dc9ea" (UID: "f89946af-f0dd-4143-a764-f7e14a7dc9ea"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jul 9 13:02:51.079634 kubelet[2942]: I0709 13:02:51.079588 2942 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f89946af-f0dd-4143-a764-f7e14a7dc9ea-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "f89946af-f0dd-4143-a764-f7e14a7dc9ea" (UID: "f89946af-f0dd-4143-a764-f7e14a7dc9ea"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Jul 9 13:02:51.079763 kubelet[2942]: I0709 13:02:51.079746 2942 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f89946af-f0dd-4143-a764-f7e14a7dc9ea-kube-api-access-64s5h" (OuterVolumeSpecName: "kube-api-access-64s5h") pod "f89946af-f0dd-4143-a764-f7e14a7dc9ea" (UID: "f89946af-f0dd-4143-a764-f7e14a7dc9ea"). InnerVolumeSpecName "kube-api-access-64s5h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Jul 9 13:02:51.108537 systemd[1]: var-lib-kubelet-pods-f89946af\x2df0dd\x2d4143\x2da764\x2df7e14a7dc9ea-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d64s5h.mount: Deactivated successfully. Jul 9 13:02:51.108594 systemd[1]: var-lib-kubelet-pods-f89946af\x2df0dd\x2d4143\x2da764\x2df7e14a7dc9ea-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Jul 9 13:02:51.176180 kubelet[2942]: I0709 13:02:51.176088 2942 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/f89946af-f0dd-4143-a764-f7e14a7dc9ea-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Jul 9 13:02:51.176180 kubelet[2942]: I0709 13:02:51.176111 2942 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-64s5h\" (UniqueName: \"kubernetes.io/projected/f89946af-f0dd-4143-a764-f7e14a7dc9ea-kube-api-access-64s5h\") on node \"localhost\" DevicePath \"\"" Jul 9 13:02:51.176180 kubelet[2942]: I0709 13:02:51.176116 2942 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f89946af-f0dd-4143-a764-f7e14a7dc9ea-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Jul 9 13:02:51.264910 systemd[1]: Removed slice kubepods-besteffort-podf89946af_f0dd_4143_a764_f7e14a7dc9ea.slice - libcontainer container kubepods-besteffort-podf89946af_f0dd_4143_a764_f7e14a7dc9ea.slice. Jul 9 13:02:51.281302 kubelet[2942]: I0709 13:02:51.281153 2942 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-8gbzx" podStartSLOduration=1.5010955419999998 podStartE2EDuration="33.275369815s" podCreationTimestamp="2025-07-09 13:02:18 +0000 UTC" firstStartedPulling="2025-07-09 13:02:18.642704744 +0000 UTC m=+21.906470254" lastFinishedPulling="2025-07-09 13:02:50.416979017 +0000 UTC m=+53.680744527" observedRunningTime="2025-07-09 13:02:51.274723061 +0000 UTC m=+54.538488580" watchObservedRunningTime="2025-07-09 13:02:51.275369815 +0000 UTC m=+54.539135329" Jul 9 13:02:51.349179 systemd-networkd[1509]: calic7941e52bb6: Link UP Jul 9 13:02:51.350519 systemd-networkd[1509]: calic7941e52bb6: Gained carrier Jul 9 13:02:51.396471 systemd[1]: Created slice kubepods-besteffort-pod8347ca20_b4e7_48ae_b374_3c0bb8a248a9.slice - libcontainer container kubepods-besteffort-pod8347ca20_b4e7_48ae_b374_3c0bb8a248a9.slice. 
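
Editor's note: the pod_startup_latency_tracker entry above for calico-node-8gbzx reports two durations that can be recomputed from the timestamps quoted in the same line; for these numbers, the SLO figure appears to be the end-to-end time minus the image-pull window (that interpretation is an inference from the arithmetic, not something the log states). Timestamps below are written as seconds past 13:02 for brevity:

    # Recompute the two durations in the pod_startup_latency_tracker entry.
    created          = 18.0            # podCreationTimestamp   13:02:18
    first_pull_start = 18.642704744    # firstStartedPulling
    last_pull_end    = 50.416979017    # lastFinishedPulling
    observed_running = 51.275369815    # observedRunningTime

    e2e = observed_running - created                  # 33.275369815 s
    slo = e2e - (last_pull_end - first_pull_start)    # ~1.501095542 s
    print(f"E2E ~ {e2e:.9f}s, SLO ~ {slo:.9f}s")
    # matches podStartE2EDuration="33.275369815s" and
    # podStartSLOduration=1.5010955419999998 in the log
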
Jul 9 13:02:51.398326 containerd[1624]: 2025-07-09 13:02:50.936 [INFO][4082] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 9 13:02:51.398326 containerd[1624]: 2025-07-09 13:02:50.981 [INFO][4082] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--nhvjr-eth0 csi-node-driver- calico-system 4941ee09-024b-44fc-8ad7-71d776271987 726 0 2025-07-09 13:02:18 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:8967bcb6f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-nhvjr eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calic7941e52bb6 [] [] }} ContainerID="250da739b1c00333fdb2894d229749ae4aed8b4def92a0f69826accf7dd61740" Namespace="calico-system" Pod="csi-node-driver-nhvjr" WorkloadEndpoint="localhost-k8s-csi--node--driver--nhvjr-" Jul 9 13:02:51.398326 containerd[1624]: 2025-07-09 13:02:50.981 [INFO][4082] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="250da739b1c00333fdb2894d229749ae4aed8b4def92a0f69826accf7dd61740" Namespace="calico-system" Pod="csi-node-driver-nhvjr" WorkloadEndpoint="localhost-k8s-csi--node--driver--nhvjr-eth0" Jul 9 13:02:51.398326 containerd[1624]: 2025-07-09 13:02:51.230 [INFO][4093] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="250da739b1c00333fdb2894d229749ae4aed8b4def92a0f69826accf7dd61740" HandleID="k8s-pod-network.250da739b1c00333fdb2894d229749ae4aed8b4def92a0f69826accf7dd61740" Workload="localhost-k8s-csi--node--driver--nhvjr-eth0" Jul 9 13:02:51.398559 containerd[1624]: 2025-07-09 13:02:51.244 [INFO][4093] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="250da739b1c00333fdb2894d229749ae4aed8b4def92a0f69826accf7dd61740" HandleID="k8s-pod-network.250da739b1c00333fdb2894d229749ae4aed8b4def92a0f69826accf7dd61740" Workload="localhost-k8s-csi--node--driver--nhvjr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000028160), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-nhvjr", "timestamp":"2025-07-09 13:02:51.230329406 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 9 13:02:51.398559 containerd[1624]: 2025-07-09 13:02:51.244 [INFO][4093] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 9 13:02:51.398559 containerd[1624]: 2025-07-09 13:02:51.245 [INFO][4093] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 9 13:02:51.398559 containerd[1624]: 2025-07-09 13:02:51.246 [INFO][4093] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jul 9 13:02:51.398559 containerd[1624]: 2025-07-09 13:02:51.269 [INFO][4093] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.250da739b1c00333fdb2894d229749ae4aed8b4def92a0f69826accf7dd61740" host="localhost" Jul 9 13:02:51.398559 containerd[1624]: 2025-07-09 13:02:51.294 [INFO][4093] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Jul 9 13:02:51.398559 containerd[1624]: 2025-07-09 13:02:51.300 [INFO][4093] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Jul 9 13:02:51.398559 containerd[1624]: 2025-07-09 13:02:51.301 [INFO][4093] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jul 9 13:02:51.398559 containerd[1624]: 2025-07-09 13:02:51.305 [INFO][4093] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jul 9 13:02:51.398559 containerd[1624]: 2025-07-09 13:02:51.305 [INFO][4093] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.250da739b1c00333fdb2894d229749ae4aed8b4def92a0f69826accf7dd61740" host="localhost" Jul 9 13:02:51.398981 containerd[1624]: 2025-07-09 13:02:51.308 [INFO][4093] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.250da739b1c00333fdb2894d229749ae4aed8b4def92a0f69826accf7dd61740 Jul 9 13:02:51.398981 containerd[1624]: 2025-07-09 13:02:51.313 [INFO][4093] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.250da739b1c00333fdb2894d229749ae4aed8b4def92a0f69826accf7dd61740" host="localhost" Jul 9 13:02:51.398981 containerd[1624]: 2025-07-09 13:02:51.324 [INFO][4093] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.250da739b1c00333fdb2894d229749ae4aed8b4def92a0f69826accf7dd61740" host="localhost" Jul 9 13:02:51.398981 containerd[1624]: 2025-07-09 13:02:51.324 [INFO][4093] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.250da739b1c00333fdb2894d229749ae4aed8b4def92a0f69826accf7dd61740" host="localhost" Jul 9 13:02:51.398981 containerd[1624]: 2025-07-09 13:02:51.324 [INFO][4093] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
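
Editor's note: the IPAM exchange above claims affinity for the block 192.168.88.128/26 and hands out 192.168.88.129 as the first address. A quick check with Python's ipaddress module of what that block contains; the observation that .128 is the network address and .129 the first usable host is ordinary CIDR arithmetic, not something stated in the log:

    import ipaddress

    # The block Calico IPAM claims affinity for in the entries above.
    block = ipaddress.ip_network("192.168.88.128/26")
    assigned = ipaddress.ip_address("192.168.88.129")

    print(block.num_addresses)      # 64 addresses: 192.168.88.128 - 192.168.88.191
    print(assigned in block)        # True
    print(next(block.hosts()))      # 192.168.88.129, the first usable host,
                                    # matching the address given to csi-node-driver-nhvjr
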
Jul 9 13:02:51.398981 containerd[1624]: 2025-07-09 13:02:51.324 [INFO][4093] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="250da739b1c00333fdb2894d229749ae4aed8b4def92a0f69826accf7dd61740" HandleID="k8s-pod-network.250da739b1c00333fdb2894d229749ae4aed8b4def92a0f69826accf7dd61740" Workload="localhost-k8s-csi--node--driver--nhvjr-eth0" Jul 9 13:02:51.399080 containerd[1624]: 2025-07-09 13:02:51.328 [INFO][4082] cni-plugin/k8s.go 418: Populated endpoint ContainerID="250da739b1c00333fdb2894d229749ae4aed8b4def92a0f69826accf7dd61740" Namespace="calico-system" Pod="csi-node-driver-nhvjr" WorkloadEndpoint="localhost-k8s-csi--node--driver--nhvjr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--nhvjr-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"4941ee09-024b-44fc-8ad7-71d776271987", ResourceVersion:"726", Generation:0, CreationTimestamp:time.Date(2025, time.July, 9, 13, 2, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"8967bcb6f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-nhvjr", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calic7941e52bb6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 9 13:02:51.399242 containerd[1624]: 2025-07-09 13:02:51.328 [INFO][4082] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="250da739b1c00333fdb2894d229749ae4aed8b4def92a0f69826accf7dd61740" Namespace="calico-system" Pod="csi-node-driver-nhvjr" WorkloadEndpoint="localhost-k8s-csi--node--driver--nhvjr-eth0" Jul 9 13:02:51.399242 containerd[1624]: 2025-07-09 13:02:51.328 [INFO][4082] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic7941e52bb6 ContainerID="250da739b1c00333fdb2894d229749ae4aed8b4def92a0f69826accf7dd61740" Namespace="calico-system" Pod="csi-node-driver-nhvjr" WorkloadEndpoint="localhost-k8s-csi--node--driver--nhvjr-eth0" Jul 9 13:02:51.399242 containerd[1624]: 2025-07-09 13:02:51.353 [INFO][4082] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="250da739b1c00333fdb2894d229749ae4aed8b4def92a0f69826accf7dd61740" Namespace="calico-system" Pod="csi-node-driver-nhvjr" WorkloadEndpoint="localhost-k8s-csi--node--driver--nhvjr-eth0" Jul 9 13:02:51.399296 containerd[1624]: 2025-07-09 13:02:51.354 [INFO][4082] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="250da739b1c00333fdb2894d229749ae4aed8b4def92a0f69826accf7dd61740" Namespace="calico-system" Pod="csi-node-driver-nhvjr" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--nhvjr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--nhvjr-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"4941ee09-024b-44fc-8ad7-71d776271987", ResourceVersion:"726", Generation:0, CreationTimestamp:time.Date(2025, time.July, 9, 13, 2, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"8967bcb6f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"250da739b1c00333fdb2894d229749ae4aed8b4def92a0f69826accf7dd61740", Pod:"csi-node-driver-nhvjr", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calic7941e52bb6", MAC:"72:b4:c6:dc:88:e2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 9 13:02:51.399348 containerd[1624]: 2025-07-09 13:02:51.374 [INFO][4082] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="250da739b1c00333fdb2894d229749ae4aed8b4def92a0f69826accf7dd61740" Namespace="calico-system" Pod="csi-node-driver-nhvjr" WorkloadEndpoint="localhost-k8s-csi--node--driver--nhvjr-eth0" Jul 9 13:02:51.478994 kubelet[2942]: I0709 13:02:51.478958 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8347ca20-b4e7-48ae-b374-3c0bb8a248a9-whisker-ca-bundle\") pod \"whisker-55c7b9bd44-qgllz\" (UID: \"8347ca20-b4e7-48ae-b374-3c0bb8a248a9\") " pod="calico-system/whisker-55c7b9bd44-qgllz" Jul 9 13:02:51.478994 kubelet[2942]: I0709 13:02:51.478998 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw8hq\" (UniqueName: \"kubernetes.io/projected/8347ca20-b4e7-48ae-b374-3c0bb8a248a9-kube-api-access-jw8hq\") pod \"whisker-55c7b9bd44-qgllz\" (UID: \"8347ca20-b4e7-48ae-b374-3c0bb8a248a9\") " pod="calico-system/whisker-55c7b9bd44-qgllz" Jul 9 13:02:51.479108 kubelet[2942]: I0709 13:02:51.479011 2942 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/8347ca20-b4e7-48ae-b374-3c0bb8a248a9-whisker-backend-key-pair\") pod \"whisker-55c7b9bd44-qgllz\" (UID: \"8347ca20-b4e7-48ae-b374-3c0bb8a248a9\") " pod="calico-system/whisker-55c7b9bd44-qgllz" Jul 9 13:02:51.599511 containerd[1624]: time="2025-07-09T13:02:51.599000020Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e77ceae0c93cfcb1c385d466afd051672a8ee5655ed0c189ca239303d947007f\" id:\"3577bb920fac2585c92afe04649d4678a434bf4a90910298d51e39edd0b38b12\" pid:4132 exit_status:1 exited_at:{seconds:1752066171 nanos:598663055}" Jul 9 
13:02:51.628772 containerd[1624]: time="2025-07-09T13:02:51.628739311Z" level=info msg="connecting to shim 250da739b1c00333fdb2894d229749ae4aed8b4def92a0f69826accf7dd61740" address="unix:///run/containerd/s/3a910947b21a77117dd186f194ead8f8e1e8626f6c4e9f079e8decd152c71caa" namespace=k8s.io protocol=ttrpc version=3 Jul 9 13:02:51.648764 systemd[1]: Started cri-containerd-250da739b1c00333fdb2894d229749ae4aed8b4def92a0f69826accf7dd61740.scope - libcontainer container 250da739b1c00333fdb2894d229749ae4aed8b4def92a0f69826accf7dd61740. Jul 9 13:02:51.656466 systemd-resolved[1511]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jul 9 13:02:51.668366 containerd[1624]: time="2025-07-09T13:02:51.668346728Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-nhvjr,Uid:4941ee09-024b-44fc-8ad7-71d776271987,Namespace:calico-system,Attempt:0,} returns sandbox id \"250da739b1c00333fdb2894d229749ae4aed8b4def92a0f69826accf7dd61740\"" Jul 9 13:02:51.669496 containerd[1624]: time="2025-07-09T13:02:51.669481734Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\"" Jul 9 13:02:51.709775 containerd[1624]: time="2025-07-09T13:02:51.709730940Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-55c7b9bd44-qgllz,Uid:8347ca20-b4e7-48ae-b374-3c0bb8a248a9,Namespace:calico-system,Attempt:0,}" Jul 9 13:02:51.775809 systemd-networkd[1509]: cali93659657596: Link UP Jul 9 13:02:51.775915 systemd-networkd[1509]: cali93659657596: Gained carrier Jul 9 13:02:51.784728 containerd[1624]: 2025-07-09 13:02:51.735 [INFO][4191] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 9 13:02:51.784728 containerd[1624]: 2025-07-09 13:02:51.742 [INFO][4191] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--55c7b9bd44--qgllz-eth0 whisker-55c7b9bd44- calico-system 8347ca20-b4e7-48ae-b374-3c0bb8a248a9 942 0 2025-07-09 13:02:51 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:55c7b9bd44 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-55c7b9bd44-qgllz eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali93659657596 [] [] }} ContainerID="e9e6499833712897826778f96af0b4fd5c1b77b97bac2127f1a9b5a21428ee71" Namespace="calico-system" Pod="whisker-55c7b9bd44-qgllz" WorkloadEndpoint="localhost-k8s-whisker--55c7b9bd44--qgllz-" Jul 9 13:02:51.784728 containerd[1624]: 2025-07-09 13:02:51.742 [INFO][4191] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e9e6499833712897826778f96af0b4fd5c1b77b97bac2127f1a9b5a21428ee71" Namespace="calico-system" Pod="whisker-55c7b9bd44-qgllz" WorkloadEndpoint="localhost-k8s-whisker--55c7b9bd44--qgllz-eth0" Jul 9 13:02:51.784728 containerd[1624]: 2025-07-09 13:02:51.755 [INFO][4204] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e9e6499833712897826778f96af0b4fd5c1b77b97bac2127f1a9b5a21428ee71" HandleID="k8s-pod-network.e9e6499833712897826778f96af0b4fd5c1b77b97bac2127f1a9b5a21428ee71" Workload="localhost-k8s-whisker--55c7b9bd44--qgllz-eth0" Jul 9 13:02:51.784879 containerd[1624]: 2025-07-09 13:02:51.755 [INFO][4204] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e9e6499833712897826778f96af0b4fd5c1b77b97bac2127f1a9b5a21428ee71" HandleID="k8s-pod-network.e9e6499833712897826778f96af0b4fd5c1b77b97bac2127f1a9b5a21428ee71" 
Workload="localhost-k8s-whisker--55c7b9bd44--qgllz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024eff0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-55c7b9bd44-qgllz", "timestamp":"2025-07-09 13:02:51.755449279 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 9 13:02:51.784879 containerd[1624]: 2025-07-09 13:02:51.755 [INFO][4204] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 9 13:02:51.784879 containerd[1624]: 2025-07-09 13:02:51.755 [INFO][4204] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 9 13:02:51.784879 containerd[1624]: 2025-07-09 13:02:51.755 [INFO][4204] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jul 9 13:02:51.784879 containerd[1624]: 2025-07-09 13:02:51.760 [INFO][4204] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e9e6499833712897826778f96af0b4fd5c1b77b97bac2127f1a9b5a21428ee71" host="localhost" Jul 9 13:02:51.784879 containerd[1624]: 2025-07-09 13:02:51.762 [INFO][4204] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Jul 9 13:02:51.784879 containerd[1624]: 2025-07-09 13:02:51.765 [INFO][4204] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Jul 9 13:02:51.784879 containerd[1624]: 2025-07-09 13:02:51.766 [INFO][4204] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jul 9 13:02:51.784879 containerd[1624]: 2025-07-09 13:02:51.767 [INFO][4204] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jul 9 13:02:51.784879 containerd[1624]: 2025-07-09 13:02:51.767 [INFO][4204] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.e9e6499833712897826778f96af0b4fd5c1b77b97bac2127f1a9b5a21428ee71" host="localhost" Jul 9 13:02:51.785943 containerd[1624]: 2025-07-09 13:02:51.768 [INFO][4204] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e9e6499833712897826778f96af0b4fd5c1b77b97bac2127f1a9b5a21428ee71 Jul 9 13:02:51.785943 containerd[1624]: 2025-07-09 13:02:51.770 [INFO][4204] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.e9e6499833712897826778f96af0b4fd5c1b77b97bac2127f1a9b5a21428ee71" host="localhost" Jul 9 13:02:51.785943 containerd[1624]: 2025-07-09 13:02:51.773 [INFO][4204] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.e9e6499833712897826778f96af0b4fd5c1b77b97bac2127f1a9b5a21428ee71" host="localhost" Jul 9 13:02:51.785943 containerd[1624]: 2025-07-09 13:02:51.773 [INFO][4204] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.e9e6499833712897826778f96af0b4fd5c1b77b97bac2127f1a9b5a21428ee71" host="localhost" Jul 9 13:02:51.785943 containerd[1624]: 2025-07-09 13:02:51.773 [INFO][4204] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 9 13:02:51.785943 containerd[1624]: 2025-07-09 13:02:51.773 [INFO][4204] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="e9e6499833712897826778f96af0b4fd5c1b77b97bac2127f1a9b5a21428ee71" HandleID="k8s-pod-network.e9e6499833712897826778f96af0b4fd5c1b77b97bac2127f1a9b5a21428ee71" Workload="localhost-k8s-whisker--55c7b9bd44--qgllz-eth0" Jul 9 13:02:51.786208 containerd[1624]: 2025-07-09 13:02:51.775 [INFO][4191] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e9e6499833712897826778f96af0b4fd5c1b77b97bac2127f1a9b5a21428ee71" Namespace="calico-system" Pod="whisker-55c7b9bd44-qgllz" WorkloadEndpoint="localhost-k8s-whisker--55c7b9bd44--qgllz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--55c7b9bd44--qgllz-eth0", GenerateName:"whisker-55c7b9bd44-", Namespace:"calico-system", SelfLink:"", UID:"8347ca20-b4e7-48ae-b374-3c0bb8a248a9", ResourceVersion:"942", Generation:0, CreationTimestamp:time.Date(2025, time.July, 9, 13, 2, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"55c7b9bd44", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-55c7b9bd44-qgllz", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali93659657596", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 9 13:02:51.786208 containerd[1624]: 2025-07-09 13:02:51.775 [INFO][4191] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="e9e6499833712897826778f96af0b4fd5c1b77b97bac2127f1a9b5a21428ee71" Namespace="calico-system" Pod="whisker-55c7b9bd44-qgllz" WorkloadEndpoint="localhost-k8s-whisker--55c7b9bd44--qgllz-eth0" Jul 9 13:02:51.786264 containerd[1624]: 2025-07-09 13:02:51.775 [INFO][4191] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali93659657596 ContainerID="e9e6499833712897826778f96af0b4fd5c1b77b97bac2127f1a9b5a21428ee71" Namespace="calico-system" Pod="whisker-55c7b9bd44-qgllz" WorkloadEndpoint="localhost-k8s-whisker--55c7b9bd44--qgllz-eth0" Jul 9 13:02:51.786264 containerd[1624]: 2025-07-09 13:02:51.776 [INFO][4191] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e9e6499833712897826778f96af0b4fd5c1b77b97bac2127f1a9b5a21428ee71" Namespace="calico-system" Pod="whisker-55c7b9bd44-qgllz" WorkloadEndpoint="localhost-k8s-whisker--55c7b9bd44--qgllz-eth0" Jul 9 13:02:51.786316 containerd[1624]: 2025-07-09 13:02:51.776 [INFO][4191] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e9e6499833712897826778f96af0b4fd5c1b77b97bac2127f1a9b5a21428ee71" Namespace="calico-system" Pod="whisker-55c7b9bd44-qgllz" WorkloadEndpoint="localhost-k8s-whisker--55c7b9bd44--qgllz-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--55c7b9bd44--qgllz-eth0", GenerateName:"whisker-55c7b9bd44-", Namespace:"calico-system", SelfLink:"", UID:"8347ca20-b4e7-48ae-b374-3c0bb8a248a9", ResourceVersion:"942", Generation:0, CreationTimestamp:time.Date(2025, time.July, 9, 13, 2, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"55c7b9bd44", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"e9e6499833712897826778f96af0b4fd5c1b77b97bac2127f1a9b5a21428ee71", Pod:"whisker-55c7b9bd44-qgllz", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali93659657596", MAC:"de:08:8e:6e:26:7e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 9 13:02:51.786354 containerd[1624]: 2025-07-09 13:02:51.780 [INFO][4191] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e9e6499833712897826778f96af0b4fd5c1b77b97bac2127f1a9b5a21428ee71" Namespace="calico-system" Pod="whisker-55c7b9bd44-qgllz" WorkloadEndpoint="localhost-k8s-whisker--55c7b9bd44--qgllz-eth0" Jul 9 13:02:51.794779 containerd[1624]: time="2025-07-09T13:02:51.794636243Z" level=info msg="connecting to shim e9e6499833712897826778f96af0b4fd5c1b77b97bac2127f1a9b5a21428ee71" address="unix:///run/containerd/s/9fcb92d4052e6c5a8523efae7a892a4dc08b540cbb6b3423838347dc165b0afd" namespace=k8s.io protocol=ttrpc version=3 Jul 9 13:02:51.812763 systemd[1]: Started cri-containerd-e9e6499833712897826778f96af0b4fd5c1b77b97bac2127f1a9b5a21428ee71.scope - libcontainer container e9e6499833712897826778f96af0b4fd5c1b77b97bac2127f1a9b5a21428ee71. 
Jul 9 13:02:51.819521 systemd-resolved[1511]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jul 9 13:02:51.843237 containerd[1624]: time="2025-07-09T13:02:51.843207546Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-55c7b9bd44-qgllz,Uid:8347ca20-b4e7-48ae-b374-3c0bb8a248a9,Namespace:calico-system,Attempt:0,} returns sandbox id \"e9e6499833712897826778f96af0b4fd5c1b77b97bac2127f1a9b5a21428ee71\"" Jul 9 13:02:51.901808 containerd[1624]: time="2025-07-09T13:02:51.901782529Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-2brrp,Uid:8e21c81c-3cf2-4c7c-a915-e968854971d4,Namespace:calico-system,Attempt:0,}" Jul 9 13:02:51.959492 systemd-networkd[1509]: calid97f043feea: Link UP Jul 9 13:02:51.960100 systemd-networkd[1509]: calid97f043feea: Gained carrier Jul 9 13:02:51.968979 containerd[1624]: 2025-07-09 13:02:51.916 [INFO][4261] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 9 13:02:51.968979 containerd[1624]: 2025-07-09 13:02:51.922 [INFO][4261] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--768f4c5c69--2brrp-eth0 goldmane-768f4c5c69- calico-system 8e21c81c-3cf2-4c7c-a915-e968854971d4 865 0 2025-07-09 13:02:17 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:768f4c5c69 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-768f4c5c69-2brrp eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calid97f043feea [] [] }} ContainerID="803e0577c26910cc76c1cba0bc8ca49527622bb067bf5369c166e091f8fc02fc" Namespace="calico-system" Pod="goldmane-768f4c5c69-2brrp" WorkloadEndpoint="localhost-k8s-goldmane--768f4c5c69--2brrp-" Jul 9 13:02:51.968979 containerd[1624]: 2025-07-09 13:02:51.922 [INFO][4261] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="803e0577c26910cc76c1cba0bc8ca49527622bb067bf5369c166e091f8fc02fc" Namespace="calico-system" Pod="goldmane-768f4c5c69-2brrp" WorkloadEndpoint="localhost-k8s-goldmane--768f4c5c69--2brrp-eth0" Jul 9 13:02:51.968979 containerd[1624]: 2025-07-09 13:02:51.938 [INFO][4274] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="803e0577c26910cc76c1cba0bc8ca49527622bb067bf5369c166e091f8fc02fc" HandleID="k8s-pod-network.803e0577c26910cc76c1cba0bc8ca49527622bb067bf5369c166e091f8fc02fc" Workload="localhost-k8s-goldmane--768f4c5c69--2brrp-eth0" Jul 9 13:02:51.969206 containerd[1624]: 2025-07-09 13:02:51.938 [INFO][4274] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="803e0577c26910cc76c1cba0bc8ca49527622bb067bf5369c166e091f8fc02fc" HandleID="k8s-pod-network.803e0577c26910cc76c1cba0bc8ca49527622bb067bf5369c166e091f8fc02fc" Workload="localhost-k8s-goldmane--768f4c5c69--2brrp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024eff0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-768f4c5c69-2brrp", "timestamp":"2025-07-09 13:02:51.938281957 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 9 13:02:51.969206 containerd[1624]: 2025-07-09 13:02:51.938 [INFO][4274] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM 
lock. Jul 9 13:02:51.969206 containerd[1624]: 2025-07-09 13:02:51.938 [INFO][4274] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 9 13:02:51.969206 containerd[1624]: 2025-07-09 13:02:51.938 [INFO][4274] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jul 9 13:02:51.969206 containerd[1624]: 2025-07-09 13:02:51.942 [INFO][4274] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.803e0577c26910cc76c1cba0bc8ca49527622bb067bf5369c166e091f8fc02fc" host="localhost" Jul 9 13:02:51.969206 containerd[1624]: 2025-07-09 13:02:51.945 [INFO][4274] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Jul 9 13:02:51.969206 containerd[1624]: 2025-07-09 13:02:51.947 [INFO][4274] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Jul 9 13:02:51.969206 containerd[1624]: 2025-07-09 13:02:51.948 [INFO][4274] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jul 9 13:02:51.969206 containerd[1624]: 2025-07-09 13:02:51.949 [INFO][4274] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jul 9 13:02:51.969206 containerd[1624]: 2025-07-09 13:02:51.949 [INFO][4274] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.803e0577c26910cc76c1cba0bc8ca49527622bb067bf5369c166e091f8fc02fc" host="localhost" Jul 9 13:02:51.970105 containerd[1624]: 2025-07-09 13:02:51.949 [INFO][4274] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.803e0577c26910cc76c1cba0bc8ca49527622bb067bf5369c166e091f8fc02fc Jul 9 13:02:51.970105 containerd[1624]: 2025-07-09 13:02:51.952 [INFO][4274] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.803e0577c26910cc76c1cba0bc8ca49527622bb067bf5369c166e091f8fc02fc" host="localhost" Jul 9 13:02:51.970105 containerd[1624]: 2025-07-09 13:02:51.954 [INFO][4274] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.803e0577c26910cc76c1cba0bc8ca49527622bb067bf5369c166e091f8fc02fc" host="localhost" Jul 9 13:02:51.970105 containerd[1624]: 2025-07-09 13:02:51.955 [INFO][4274] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.803e0577c26910cc76c1cba0bc8ca49527622bb067bf5369c166e091f8fc02fc" host="localhost" Jul 9 13:02:51.970105 containerd[1624]: 2025-07-09 13:02:51.955 [INFO][4274] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 9 13:02:51.970105 containerd[1624]: 2025-07-09 13:02:51.955 [INFO][4274] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="803e0577c26910cc76c1cba0bc8ca49527622bb067bf5369c166e091f8fc02fc" HandleID="k8s-pod-network.803e0577c26910cc76c1cba0bc8ca49527622bb067bf5369c166e091f8fc02fc" Workload="localhost-k8s-goldmane--768f4c5c69--2brrp-eth0" Jul 9 13:02:51.970497 containerd[1624]: 2025-07-09 13:02:51.956 [INFO][4261] cni-plugin/k8s.go 418: Populated endpoint ContainerID="803e0577c26910cc76c1cba0bc8ca49527622bb067bf5369c166e091f8fc02fc" Namespace="calico-system" Pod="goldmane-768f4c5c69-2brrp" WorkloadEndpoint="localhost-k8s-goldmane--768f4c5c69--2brrp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--768f4c5c69--2brrp-eth0", GenerateName:"goldmane-768f4c5c69-", Namespace:"calico-system", SelfLink:"", UID:"8e21c81c-3cf2-4c7c-a915-e968854971d4", ResourceVersion:"865", Generation:0, CreationTimestamp:time.Date(2025, time.July, 9, 13, 2, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"768f4c5c69", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-768f4c5c69-2brrp", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calid97f043feea", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 9 13:02:51.970497 containerd[1624]: 2025-07-09 13:02:51.956 [INFO][4261] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="803e0577c26910cc76c1cba0bc8ca49527622bb067bf5369c166e091f8fc02fc" Namespace="calico-system" Pod="goldmane-768f4c5c69-2brrp" WorkloadEndpoint="localhost-k8s-goldmane--768f4c5c69--2brrp-eth0" Jul 9 13:02:51.970582 containerd[1624]: 2025-07-09 13:02:51.956 [INFO][4261] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid97f043feea ContainerID="803e0577c26910cc76c1cba0bc8ca49527622bb067bf5369c166e091f8fc02fc" Namespace="calico-system" Pod="goldmane-768f4c5c69-2brrp" WorkloadEndpoint="localhost-k8s-goldmane--768f4c5c69--2brrp-eth0" Jul 9 13:02:51.970582 containerd[1624]: 2025-07-09 13:02:51.960 [INFO][4261] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="803e0577c26910cc76c1cba0bc8ca49527622bb067bf5369c166e091f8fc02fc" Namespace="calico-system" Pod="goldmane-768f4c5c69-2brrp" WorkloadEndpoint="localhost-k8s-goldmane--768f4c5c69--2brrp-eth0" Jul 9 13:02:51.970637 containerd[1624]: 2025-07-09 13:02:51.960 [INFO][4261] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="803e0577c26910cc76c1cba0bc8ca49527622bb067bf5369c166e091f8fc02fc" Namespace="calico-system" Pod="goldmane-768f4c5c69-2brrp" WorkloadEndpoint="localhost-k8s-goldmane--768f4c5c69--2brrp-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--768f4c5c69--2brrp-eth0", GenerateName:"goldmane-768f4c5c69-", Namespace:"calico-system", SelfLink:"", UID:"8e21c81c-3cf2-4c7c-a915-e968854971d4", ResourceVersion:"865", Generation:0, CreationTimestamp:time.Date(2025, time.July, 9, 13, 2, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"768f4c5c69", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"803e0577c26910cc76c1cba0bc8ca49527622bb067bf5369c166e091f8fc02fc", Pod:"goldmane-768f4c5c69-2brrp", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calid97f043feea", MAC:"6e:34:8d:00:34:b3", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 9 13:02:51.970709 containerd[1624]: 2025-07-09 13:02:51.966 [INFO][4261] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="803e0577c26910cc76c1cba0bc8ca49527622bb067bf5369c166e091f8fc02fc" Namespace="calico-system" Pod="goldmane-768f4c5c69-2brrp" WorkloadEndpoint="localhost-k8s-goldmane--768f4c5c69--2brrp-eth0" Jul 9 13:02:52.008710 containerd[1624]: time="2025-07-09T13:02:52.008623695Z" level=info msg="connecting to shim 803e0577c26910cc76c1cba0bc8ca49527622bb067bf5369c166e091f8fc02fc" address="unix:///run/containerd/s/db40ece796bf3da76bb06d714bfd41395d976e70bc26a3cc3b51f858a93d3229" namespace=k8s.io protocol=ttrpc version=3 Jul 9 13:02:52.031771 systemd[1]: Started cri-containerd-803e0577c26910cc76c1cba0bc8ca49527622bb067bf5369c166e091f8fc02fc.scope - libcontainer container 803e0577c26910cc76c1cba0bc8ca49527622bb067bf5369c166e091f8fc02fc. 
Jul 9 13:02:52.046216 systemd-resolved[1511]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jul 9 13:02:52.092689 containerd[1624]: time="2025-07-09T13:02:52.092237986Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-2brrp,Uid:8e21c81c-3cf2-4c7c-a915-e968854971d4,Namespace:calico-system,Attempt:0,} returns sandbox id \"803e0577c26910cc76c1cba0bc8ca49527622bb067bf5369c166e091f8fc02fc\"" Jul 9 13:02:52.339730 containerd[1624]: time="2025-07-09T13:02:52.337866789Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e77ceae0c93cfcb1c385d466afd051672a8ee5655ed0c189ca239303d947007f\" id:\"d389adc2abe12c538aead968fec0fe4ce60b689f19a63d554361cdba605dcb8b\" pid:4441 exit_status:1 exited_at:{seconds:1752066172 nanos:337637225}" Jul 9 13:02:52.552635 systemd-networkd[1509]: vxlan.calico: Link UP Jul 9 13:02:52.552639 systemd-networkd[1509]: vxlan.calico: Gained carrier Jul 9 13:02:52.902912 containerd[1624]: time="2025-07-09T13:02:52.902559178Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-697d57c56-9g4vj,Uid:385d03ba-5657-4ba7-b837-dbc26e99cf88,Namespace:calico-apiserver,Attempt:0,}" Jul 9 13:02:52.903975 containerd[1624]: time="2025-07-09T13:02:52.902936436Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6b449fff5d-5tg2l,Uid:0eabfdb8-41ba-4268-b70a-42c305153461,Namespace:calico-system,Attempt:0,}" Jul 9 13:02:52.903975 containerd[1624]: time="2025-07-09T13:02:52.903076366Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-697d57c56-499xq,Uid:e2d6563d-91df-4d1a-83da-42d96468fe10,Namespace:calico-apiserver,Attempt:0,}" Jul 9 13:02:52.907624 kubelet[2942]: I0709 13:02:52.904019 2942 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f89946af-f0dd-4143-a764-f7e14a7dc9ea" path="/var/lib/kubelet/pods/f89946af-f0dd-4143-a764-f7e14a7dc9ea/volumes" Jul 9 13:02:53.012065 systemd-networkd[1509]: cali24e85249a01: Link UP Jul 9 13:02:53.012794 systemd-networkd[1509]: cali24e85249a01: Gained carrier Jul 9 13:02:53.023022 containerd[1624]: 2025-07-09 13:02:52.951 [INFO][4579] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--697d57c56--499xq-eth0 calico-apiserver-697d57c56- calico-apiserver e2d6563d-91df-4d1a-83da-42d96468fe10 856 0 2025-07-09 13:02:14 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:697d57c56 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-697d57c56-499xq eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali24e85249a01 [] [] }} ContainerID="6ebc7defbad936b49bda3b40957c4e31b41c0a471e7d12479f017f51b1670560" Namespace="calico-apiserver" Pod="calico-apiserver-697d57c56-499xq" WorkloadEndpoint="localhost-k8s-calico--apiserver--697d57c56--499xq-" Jul 9 13:02:53.023022 containerd[1624]: 2025-07-09 13:02:52.951 [INFO][4579] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6ebc7defbad936b49bda3b40957c4e31b41c0a471e7d12479f017f51b1670560" Namespace="calico-apiserver" Pod="calico-apiserver-697d57c56-499xq" WorkloadEndpoint="localhost-k8s-calico--apiserver--697d57c56--499xq-eth0" Jul 9 13:02:53.023022 containerd[1624]: 2025-07-09 13:02:52.981 [INFO][4606] 
ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6ebc7defbad936b49bda3b40957c4e31b41c0a471e7d12479f017f51b1670560" HandleID="k8s-pod-network.6ebc7defbad936b49bda3b40957c4e31b41c0a471e7d12479f017f51b1670560" Workload="localhost-k8s-calico--apiserver--697d57c56--499xq-eth0" Jul 9 13:02:53.023360 containerd[1624]: 2025-07-09 13:02:52.981 [INFO][4606] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6ebc7defbad936b49bda3b40957c4e31b41c0a471e7d12479f017f51b1670560" HandleID="k8s-pod-network.6ebc7defbad936b49bda3b40957c4e31b41c0a471e7d12479f017f51b1670560" Workload="localhost-k8s-calico--apiserver--697d57c56--499xq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f8a0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-697d57c56-499xq", "timestamp":"2025-07-09 13:02:52.981498935 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 9 13:02:53.023360 containerd[1624]: 2025-07-09 13:02:52.981 [INFO][4606] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 9 13:02:53.023360 containerd[1624]: 2025-07-09 13:02:52.981 [INFO][4606] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 9 13:02:53.023360 containerd[1624]: 2025-07-09 13:02:52.981 [INFO][4606] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jul 9 13:02:53.023360 containerd[1624]: 2025-07-09 13:02:52.987 [INFO][4606] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6ebc7defbad936b49bda3b40957c4e31b41c0a471e7d12479f017f51b1670560" host="localhost" Jul 9 13:02:53.023360 containerd[1624]: 2025-07-09 13:02:52.991 [INFO][4606] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Jul 9 13:02:53.023360 containerd[1624]: 2025-07-09 13:02:52.995 [INFO][4606] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Jul 9 13:02:53.023360 containerd[1624]: 2025-07-09 13:02:52.997 [INFO][4606] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jul 9 13:02:53.023360 containerd[1624]: 2025-07-09 13:02:52.999 [INFO][4606] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jul 9 13:02:53.023360 containerd[1624]: 2025-07-09 13:02:52.999 [INFO][4606] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.6ebc7defbad936b49bda3b40957c4e31b41c0a471e7d12479f017f51b1670560" host="localhost" Jul 9 13:02:53.025429 containerd[1624]: 2025-07-09 13:02:53.001 [INFO][4606] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.6ebc7defbad936b49bda3b40957c4e31b41c0a471e7d12479f017f51b1670560 Jul 9 13:02:53.025429 containerd[1624]: 2025-07-09 13:02:53.003 [INFO][4606] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.6ebc7defbad936b49bda3b40957c4e31b41c0a471e7d12479f017f51b1670560" host="localhost" Jul 9 13:02:53.025429 containerd[1624]: 2025-07-09 13:02:53.007 [INFO][4606] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.6ebc7defbad936b49bda3b40957c4e31b41c0a471e7d12479f017f51b1670560" host="localhost" Jul 9 13:02:53.025429 containerd[1624]: 2025-07-09 13:02:53.007 [INFO][4606] ipam/ipam.go 878: 
Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.6ebc7defbad936b49bda3b40957c4e31b41c0a471e7d12479f017f51b1670560" host="localhost" Jul 9 13:02:53.025429 containerd[1624]: 2025-07-09 13:02:53.007 [INFO][4606] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 9 13:02:53.025429 containerd[1624]: 2025-07-09 13:02:53.007 [INFO][4606] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="6ebc7defbad936b49bda3b40957c4e31b41c0a471e7d12479f017f51b1670560" HandleID="k8s-pod-network.6ebc7defbad936b49bda3b40957c4e31b41c0a471e7d12479f017f51b1670560" Workload="localhost-k8s-calico--apiserver--697d57c56--499xq-eth0" Jul 9 13:02:53.025744 containerd[1624]: 2025-07-09 13:02:53.009 [INFO][4579] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6ebc7defbad936b49bda3b40957c4e31b41c0a471e7d12479f017f51b1670560" Namespace="calico-apiserver" Pod="calico-apiserver-697d57c56-499xq" WorkloadEndpoint="localhost-k8s-calico--apiserver--697d57c56--499xq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--697d57c56--499xq-eth0", GenerateName:"calico-apiserver-697d57c56-", Namespace:"calico-apiserver", SelfLink:"", UID:"e2d6563d-91df-4d1a-83da-42d96468fe10", ResourceVersion:"856", Generation:0, CreationTimestamp:time.Date(2025, time.July, 9, 13, 2, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"697d57c56", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-697d57c56-499xq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali24e85249a01", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 9 13:02:53.025834 containerd[1624]: 2025-07-09 13:02:53.010 [INFO][4579] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="6ebc7defbad936b49bda3b40957c4e31b41c0a471e7d12479f017f51b1670560" Namespace="calico-apiserver" Pod="calico-apiserver-697d57c56-499xq" WorkloadEndpoint="localhost-k8s-calico--apiserver--697d57c56--499xq-eth0" Jul 9 13:02:53.025834 containerd[1624]: 2025-07-09 13:02:53.010 [INFO][4579] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali24e85249a01 ContainerID="6ebc7defbad936b49bda3b40957c4e31b41c0a471e7d12479f017f51b1670560" Namespace="calico-apiserver" Pod="calico-apiserver-697d57c56-499xq" WorkloadEndpoint="localhost-k8s-calico--apiserver--697d57c56--499xq-eth0" Jul 9 13:02:53.025834 containerd[1624]: 2025-07-09 13:02:53.013 [INFO][4579] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6ebc7defbad936b49bda3b40957c4e31b41c0a471e7d12479f017f51b1670560" Namespace="calico-apiserver" Pod="calico-apiserver-697d57c56-499xq" 
WorkloadEndpoint="localhost-k8s-calico--apiserver--697d57c56--499xq-eth0" Jul 9 13:02:53.025905 containerd[1624]: 2025-07-09 13:02:53.013 [INFO][4579] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6ebc7defbad936b49bda3b40957c4e31b41c0a471e7d12479f017f51b1670560" Namespace="calico-apiserver" Pod="calico-apiserver-697d57c56-499xq" WorkloadEndpoint="localhost-k8s-calico--apiserver--697d57c56--499xq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--697d57c56--499xq-eth0", GenerateName:"calico-apiserver-697d57c56-", Namespace:"calico-apiserver", SelfLink:"", UID:"e2d6563d-91df-4d1a-83da-42d96468fe10", ResourceVersion:"856", Generation:0, CreationTimestamp:time.Date(2025, time.July, 9, 13, 2, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"697d57c56", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"6ebc7defbad936b49bda3b40957c4e31b41c0a471e7d12479f017f51b1670560", Pod:"calico-apiserver-697d57c56-499xq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali24e85249a01", MAC:"f2:3a:c0:54:3b:0e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 9 13:02:53.025959 containerd[1624]: 2025-07-09 13:02:53.019 [INFO][4579] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6ebc7defbad936b49bda3b40957c4e31b41c0a471e7d12479f017f51b1670560" Namespace="calico-apiserver" Pod="calico-apiserver-697d57c56-499xq" WorkloadEndpoint="localhost-k8s-calico--apiserver--697d57c56--499xq-eth0" Jul 9 13:02:53.037281 containerd[1624]: time="2025-07-09T13:02:53.037235581Z" level=info msg="connecting to shim 6ebc7defbad936b49bda3b40957c4e31b41c0a471e7d12479f017f51b1670560" address="unix:///run/containerd/s/b1c1efba5a728f71bbfb8b833f9c2f917377afce40192230b4131b52a0c04cea" namespace=k8s.io protocol=ttrpc version=3 Jul 9 13:02:53.059821 systemd[1]: Started cri-containerd-6ebc7defbad936b49bda3b40957c4e31b41c0a471e7d12479f017f51b1670560.scope - libcontainer container 6ebc7defbad936b49bda3b40957c4e31b41c0a471e7d12479f017f51b1670560. 
Jul 9 13:02:53.067361 systemd-resolved[1511]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jul 9 13:02:53.068141 systemd-networkd[1509]: cali93659657596: Gained IPv6LL Jul 9 13:02:53.093920 containerd[1624]: time="2025-07-09T13:02:53.093853297Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-697d57c56-499xq,Uid:e2d6563d-91df-4d1a-83da-42d96468fe10,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"6ebc7defbad936b49bda3b40957c4e31b41c0a471e7d12479f017f51b1670560\"" Jul 9 13:02:53.116938 systemd-networkd[1509]: cali965d39b4d82: Link UP Jul 9 13:02:53.117453 systemd-networkd[1509]: cali965d39b4d82: Gained carrier Jul 9 13:02:53.128939 containerd[1624]: 2025-07-09 13:02:52.940 [INFO][4564] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--697d57c56--9g4vj-eth0 calico-apiserver-697d57c56- calico-apiserver 385d03ba-5657-4ba7-b837-dbc26e99cf88 863 0 2025-07-09 13:02:14 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:697d57c56 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-697d57c56-9g4vj eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali965d39b4d82 [] [] }} ContainerID="3df2e21d2cd0df8f5a6a38ee768dfd7235809fbb3cbba878d91632958553e253" Namespace="calico-apiserver" Pod="calico-apiserver-697d57c56-9g4vj" WorkloadEndpoint="localhost-k8s-calico--apiserver--697d57c56--9g4vj-" Jul 9 13:02:53.128939 containerd[1624]: 2025-07-09 13:02:52.940 [INFO][4564] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3df2e21d2cd0df8f5a6a38ee768dfd7235809fbb3cbba878d91632958553e253" Namespace="calico-apiserver" Pod="calico-apiserver-697d57c56-9g4vj" WorkloadEndpoint="localhost-k8s-calico--apiserver--697d57c56--9g4vj-eth0" Jul 9 13:02:53.128939 containerd[1624]: 2025-07-09 13:02:52.987 [INFO][4598] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3df2e21d2cd0df8f5a6a38ee768dfd7235809fbb3cbba878d91632958553e253" HandleID="k8s-pod-network.3df2e21d2cd0df8f5a6a38ee768dfd7235809fbb3cbba878d91632958553e253" Workload="localhost-k8s-calico--apiserver--697d57c56--9g4vj-eth0" Jul 9 13:02:53.129110 containerd[1624]: 2025-07-09 13:02:52.989 [INFO][4598] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3df2e21d2cd0df8f5a6a38ee768dfd7235809fbb3cbba878d91632958553e253" HandleID="k8s-pod-network.3df2e21d2cd0df8f5a6a38ee768dfd7235809fbb3cbba878d91632958553e253" Workload="localhost-k8s-calico--apiserver--697d57c56--9g4vj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cd740), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-697d57c56-9g4vj", "timestamp":"2025-07-09 13:02:52.987821633 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 9 13:02:53.129110 containerd[1624]: 2025-07-09 13:02:52.989 [INFO][4598] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 9 13:02:53.129110 containerd[1624]: 2025-07-09 13:02:53.007 [INFO][4598] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 9 13:02:53.129110 containerd[1624]: 2025-07-09 13:02:53.007 [INFO][4598] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jul 9 13:02:53.129110 containerd[1624]: 2025-07-09 13:02:53.087 [INFO][4598] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3df2e21d2cd0df8f5a6a38ee768dfd7235809fbb3cbba878d91632958553e253" host="localhost" Jul 9 13:02:53.129110 containerd[1624]: 2025-07-09 13:02:53.090 [INFO][4598] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Jul 9 13:02:53.129110 containerd[1624]: 2025-07-09 13:02:53.098 [INFO][4598] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Jul 9 13:02:53.129110 containerd[1624]: 2025-07-09 13:02:53.099 [INFO][4598] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jul 9 13:02:53.129110 containerd[1624]: 2025-07-09 13:02:53.100 [INFO][4598] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jul 9 13:02:53.129110 containerd[1624]: 2025-07-09 13:02:53.100 [INFO][4598] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.3df2e21d2cd0df8f5a6a38ee768dfd7235809fbb3cbba878d91632958553e253" host="localhost" Jul 9 13:02:53.135888 containerd[1624]: 2025-07-09 13:02:53.101 [INFO][4598] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.3df2e21d2cd0df8f5a6a38ee768dfd7235809fbb3cbba878d91632958553e253 Jul 9 13:02:53.135888 containerd[1624]: 2025-07-09 13:02:53.106 [INFO][4598] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.3df2e21d2cd0df8f5a6a38ee768dfd7235809fbb3cbba878d91632958553e253" host="localhost" Jul 9 13:02:53.135888 containerd[1624]: 2025-07-09 13:02:53.112 [INFO][4598] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.3df2e21d2cd0df8f5a6a38ee768dfd7235809fbb3cbba878d91632958553e253" host="localhost" Jul 9 13:02:53.135888 containerd[1624]: 2025-07-09 13:02:53.112 [INFO][4598] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.3df2e21d2cd0df8f5a6a38ee768dfd7235809fbb3cbba878d91632958553e253" host="localhost" Jul 9 13:02:53.135888 containerd[1624]: 2025-07-09 13:02:53.112 [INFO][4598] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
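The two calico-apiserver pods (calico-apiserver-697d57c56-499xq and calico-apiserver-697d57c56-9g4vj) run through the same "About to acquire / Acquired / Released host-wide IPAM lock" sequence within milliseconds of each other, and that lock is what keeps their concurrent CNI ADD calls from claiming the same address: 499xq ends up with 192.168.88.132 and 9g4vj with 192.168.88.133. A toy analogy with threads standing in for the two CNI invocations (an illustration of the locking idea, not Calico's implementation; again, .128 being in use on the node is assumed):

import ipaddress
import threading

block = ipaddress.ip_network("192.168.88.128/26")
assigned = set(map(ipaddress.ip_address,
                   ["192.168.88.128",                    # assumed in use on the node itself
                    "192.168.88.129", "192.168.88.130", "192.168.88.131"]))  # pods seen earlier
host_wide_lock = threading.Lock()
results = {}

def cni_add(pod_name):
    # Serialise the read-modify-write of the block, as the "Acquired host-wide
    # IPAM lock" lines do for the real CNI ADD calls.
    with host_wide_lock:
        addr = next(a for a in block if a not in assigned)
        assigned.add(addr)
        results[pod_name] = addr

threads = [threading.Thread(target=cni_add, args=(name,))
           for name in ("calico-apiserver-697d57c56-499xq", "calico-apiserver-697d57c56-9g4vj")]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results)   # each pod gets its own address (.132 and .133, in either order)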
Jul 9 13:02:53.135888 containerd[1624]: 2025-07-09 13:02:53.112 [INFO][4598] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="3df2e21d2cd0df8f5a6a38ee768dfd7235809fbb3cbba878d91632958553e253" HandleID="k8s-pod-network.3df2e21d2cd0df8f5a6a38ee768dfd7235809fbb3cbba878d91632958553e253" Workload="localhost-k8s-calico--apiserver--697d57c56--9g4vj-eth0" Jul 9 13:02:53.135991 containerd[1624]: 2025-07-09 13:02:53.115 [INFO][4564] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3df2e21d2cd0df8f5a6a38ee768dfd7235809fbb3cbba878d91632958553e253" Namespace="calico-apiserver" Pod="calico-apiserver-697d57c56-9g4vj" WorkloadEndpoint="localhost-k8s-calico--apiserver--697d57c56--9g4vj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--697d57c56--9g4vj-eth0", GenerateName:"calico-apiserver-697d57c56-", Namespace:"calico-apiserver", SelfLink:"", UID:"385d03ba-5657-4ba7-b837-dbc26e99cf88", ResourceVersion:"863", Generation:0, CreationTimestamp:time.Date(2025, time.July, 9, 13, 2, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"697d57c56", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-697d57c56-9g4vj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali965d39b4d82", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 9 13:02:53.136042 containerd[1624]: 2025-07-09 13:02:53.115 [INFO][4564] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="3df2e21d2cd0df8f5a6a38ee768dfd7235809fbb3cbba878d91632958553e253" Namespace="calico-apiserver" Pod="calico-apiserver-697d57c56-9g4vj" WorkloadEndpoint="localhost-k8s-calico--apiserver--697d57c56--9g4vj-eth0" Jul 9 13:02:53.136042 containerd[1624]: 2025-07-09 13:02:53.115 [INFO][4564] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali965d39b4d82 ContainerID="3df2e21d2cd0df8f5a6a38ee768dfd7235809fbb3cbba878d91632958553e253" Namespace="calico-apiserver" Pod="calico-apiserver-697d57c56-9g4vj" WorkloadEndpoint="localhost-k8s-calico--apiserver--697d57c56--9g4vj-eth0" Jul 9 13:02:53.136042 containerd[1624]: 2025-07-09 13:02:53.117 [INFO][4564] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3df2e21d2cd0df8f5a6a38ee768dfd7235809fbb3cbba878d91632958553e253" Namespace="calico-apiserver" Pod="calico-apiserver-697d57c56-9g4vj" WorkloadEndpoint="localhost-k8s-calico--apiserver--697d57c56--9g4vj-eth0" Jul 9 13:02:53.136092 containerd[1624]: 2025-07-09 13:02:53.118 [INFO][4564] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="3df2e21d2cd0df8f5a6a38ee768dfd7235809fbb3cbba878d91632958553e253" Namespace="calico-apiserver" Pod="calico-apiserver-697d57c56-9g4vj" WorkloadEndpoint="localhost-k8s-calico--apiserver--697d57c56--9g4vj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--697d57c56--9g4vj-eth0", GenerateName:"calico-apiserver-697d57c56-", Namespace:"calico-apiserver", SelfLink:"", UID:"385d03ba-5657-4ba7-b837-dbc26e99cf88", ResourceVersion:"863", Generation:0, CreationTimestamp:time.Date(2025, time.July, 9, 13, 2, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"697d57c56", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"3df2e21d2cd0df8f5a6a38ee768dfd7235809fbb3cbba878d91632958553e253", Pod:"calico-apiserver-697d57c56-9g4vj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali965d39b4d82", MAC:"76:df:5f:ec:f0:a0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 9 13:02:53.136134 containerd[1624]: 2025-07-09 13:02:53.123 [INFO][4564] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3df2e21d2cd0df8f5a6a38ee768dfd7235809fbb3cbba878d91632958553e253" Namespace="calico-apiserver" Pod="calico-apiserver-697d57c56-9g4vj" WorkloadEndpoint="localhost-k8s-calico--apiserver--697d57c56--9g4vj-eth0" Jul 9 13:02:53.187698 containerd[1624]: time="2025-07-09T13:02:53.186436625Z" level=info msg="connecting to shim 3df2e21d2cd0df8f5a6a38ee768dfd7235809fbb3cbba878d91632958553e253" address="unix:///run/containerd/s/9c387cfbac0d6e63cd3cdf7baa5c58dbf0a17c7c89cdad9f11bfec1ed11acbd8" namespace=k8s.io protocol=ttrpc version=3 Jul 9 13:02:53.196784 systemd-networkd[1509]: calic7941e52bb6: Gained IPv6LL Jul 9 13:02:53.208954 systemd[1]: Started cri-containerd-3df2e21d2cd0df8f5a6a38ee768dfd7235809fbb3cbba878d91632958553e253.scope - libcontainer container 3df2e21d2cd0df8f5a6a38ee768dfd7235809fbb3cbba878d91632958553e253. 
Jul 9 13:02:53.223145 systemd-resolved[1511]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jul 9 13:02:53.232552 systemd-networkd[1509]: calic4f9e17e3ac: Link UP Jul 9 13:02:53.234938 systemd-networkd[1509]: calic4f9e17e3ac: Gained carrier Jul 9 13:02:53.254363 containerd[1624]: 2025-07-09 13:02:52.965 [INFO][4588] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--6b449fff5d--5tg2l-eth0 calico-kube-controllers-6b449fff5d- calico-system 0eabfdb8-41ba-4268-b70a-42c305153461 860 0 2025-07-09 13:02:18 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:6b449fff5d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-6b449fff5d-5tg2l eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calic4f9e17e3ac [] [] }} ContainerID="1bd05729b37a993e1c7f1b72cd7ae537a28f0c63bbe2eae9d9a8a2b166980793" Namespace="calico-system" Pod="calico-kube-controllers-6b449fff5d-5tg2l" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6b449fff5d--5tg2l-" Jul 9 13:02:53.254363 containerd[1624]: 2025-07-09 13:02:52.965 [INFO][4588] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1bd05729b37a993e1c7f1b72cd7ae537a28f0c63bbe2eae9d9a8a2b166980793" Namespace="calico-system" Pod="calico-kube-controllers-6b449fff5d-5tg2l" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6b449fff5d--5tg2l-eth0" Jul 9 13:02:53.254363 containerd[1624]: 2025-07-09 13:02:53.005 [INFO][4612] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1bd05729b37a993e1c7f1b72cd7ae537a28f0c63bbe2eae9d9a8a2b166980793" HandleID="k8s-pod-network.1bd05729b37a993e1c7f1b72cd7ae537a28f0c63bbe2eae9d9a8a2b166980793" Workload="localhost-k8s-calico--kube--controllers--6b449fff5d--5tg2l-eth0" Jul 9 13:02:53.254511 containerd[1624]: 2025-07-09 13:02:53.006 [INFO][4612] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1bd05729b37a993e1c7f1b72cd7ae537a28f0c63bbe2eae9d9a8a2b166980793" HandleID="k8s-pod-network.1bd05729b37a993e1c7f1b72cd7ae537a28f0c63bbe2eae9d9a8a2b166980793" Workload="localhost-k8s-calico--kube--controllers--6b449fff5d--5tg2l-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d4f20), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-6b449fff5d-5tg2l", "timestamp":"2025-07-09 13:02:53.005975155 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 9 13:02:53.254511 containerd[1624]: 2025-07-09 13:02:53.006 [INFO][4612] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 9 13:02:53.254511 containerd[1624]: 2025-07-09 13:02:53.112 [INFO][4612] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 9 13:02:53.254511 containerd[1624]: 2025-07-09 13:02:53.112 [INFO][4612] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jul 9 13:02:53.254511 containerd[1624]: 2025-07-09 13:02:53.189 [INFO][4612] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1bd05729b37a993e1c7f1b72cd7ae537a28f0c63bbe2eae9d9a8a2b166980793" host="localhost" Jul 9 13:02:53.254511 containerd[1624]: 2025-07-09 13:02:53.199 [INFO][4612] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Jul 9 13:02:53.254511 containerd[1624]: 2025-07-09 13:02:53.206 [INFO][4612] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Jul 9 13:02:53.254511 containerd[1624]: 2025-07-09 13:02:53.208 [INFO][4612] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jul 9 13:02:53.254511 containerd[1624]: 2025-07-09 13:02:53.209 [INFO][4612] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jul 9 13:02:53.254511 containerd[1624]: 2025-07-09 13:02:53.209 [INFO][4612] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.1bd05729b37a993e1c7f1b72cd7ae537a28f0c63bbe2eae9d9a8a2b166980793" host="localhost" Jul 9 13:02:53.258795 containerd[1624]: 2025-07-09 13:02:53.210 [INFO][4612] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1bd05729b37a993e1c7f1b72cd7ae537a28f0c63bbe2eae9d9a8a2b166980793 Jul 9 13:02:53.258795 containerd[1624]: 2025-07-09 13:02:53.217 [INFO][4612] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.1bd05729b37a993e1c7f1b72cd7ae537a28f0c63bbe2eae9d9a8a2b166980793" host="localhost" Jul 9 13:02:53.258795 containerd[1624]: 2025-07-09 13:02:53.228 [INFO][4612] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.1bd05729b37a993e1c7f1b72cd7ae537a28f0c63bbe2eae9d9a8a2b166980793" host="localhost" Jul 9 13:02:53.258795 containerd[1624]: 2025-07-09 13:02:53.228 [INFO][4612] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.1bd05729b37a993e1c7f1b72cd7ae537a28f0c63bbe2eae9d9a8a2b166980793" host="localhost" Jul 9 13:02:53.258795 containerd[1624]: 2025-07-09 13:02:53.228 [INFO][4612] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 9 13:02:53.258795 containerd[1624]: 2025-07-09 13:02:53.228 [INFO][4612] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="1bd05729b37a993e1c7f1b72cd7ae537a28f0c63bbe2eae9d9a8a2b166980793" HandleID="k8s-pod-network.1bd05729b37a993e1c7f1b72cd7ae537a28f0c63bbe2eae9d9a8a2b166980793" Workload="localhost-k8s-calico--kube--controllers--6b449fff5d--5tg2l-eth0" Jul 9 13:02:53.258957 containerd[1624]: 2025-07-09 13:02:53.230 [INFO][4588] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1bd05729b37a993e1c7f1b72cd7ae537a28f0c63bbe2eae9d9a8a2b166980793" Namespace="calico-system" Pod="calico-kube-controllers-6b449fff5d-5tg2l" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6b449fff5d--5tg2l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--6b449fff5d--5tg2l-eth0", GenerateName:"calico-kube-controllers-6b449fff5d-", Namespace:"calico-system", SelfLink:"", UID:"0eabfdb8-41ba-4268-b70a-42c305153461", ResourceVersion:"860", Generation:0, CreationTimestamp:time.Date(2025, time.July, 9, 13, 2, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6b449fff5d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-6b449fff5d-5tg2l", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic4f9e17e3ac", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 9 13:02:53.263970 containerd[1624]: 2025-07-09 13:02:53.230 [INFO][4588] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="1bd05729b37a993e1c7f1b72cd7ae537a28f0c63bbe2eae9d9a8a2b166980793" Namespace="calico-system" Pod="calico-kube-controllers-6b449fff5d-5tg2l" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6b449fff5d--5tg2l-eth0" Jul 9 13:02:53.263970 containerd[1624]: 2025-07-09 13:02:53.230 [INFO][4588] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic4f9e17e3ac ContainerID="1bd05729b37a993e1c7f1b72cd7ae537a28f0c63bbe2eae9d9a8a2b166980793" Namespace="calico-system" Pod="calico-kube-controllers-6b449fff5d-5tg2l" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6b449fff5d--5tg2l-eth0" Jul 9 13:02:53.263970 containerd[1624]: 2025-07-09 13:02:53.236 [INFO][4588] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1bd05729b37a993e1c7f1b72cd7ae537a28f0c63bbe2eae9d9a8a2b166980793" Namespace="calico-system" Pod="calico-kube-controllers-6b449fff5d-5tg2l" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6b449fff5d--5tg2l-eth0" Jul 9 13:02:53.268805 containerd[1624]: 2025-07-09 13:02:53.238 [INFO][4588] cni-plugin/k8s.go 446: Added Mac, interface 
name, and active container ID to endpoint ContainerID="1bd05729b37a993e1c7f1b72cd7ae537a28f0c63bbe2eae9d9a8a2b166980793" Namespace="calico-system" Pod="calico-kube-controllers-6b449fff5d-5tg2l" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6b449fff5d--5tg2l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--6b449fff5d--5tg2l-eth0", GenerateName:"calico-kube-controllers-6b449fff5d-", Namespace:"calico-system", SelfLink:"", UID:"0eabfdb8-41ba-4268-b70a-42c305153461", ResourceVersion:"860", Generation:0, CreationTimestamp:time.Date(2025, time.July, 9, 13, 2, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6b449fff5d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"1bd05729b37a993e1c7f1b72cd7ae537a28f0c63bbe2eae9d9a8a2b166980793", Pod:"calico-kube-controllers-6b449fff5d-5tg2l", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic4f9e17e3ac", MAC:"1a:d0:93:60:b0:59", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 9 13:02:53.268861 containerd[1624]: 2025-07-09 13:02:53.252 [INFO][4588] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1bd05729b37a993e1c7f1b72cd7ae537a28f0c63bbe2eae9d9a8a2b166980793" Namespace="calico-system" Pod="calico-kube-controllers-6b449fff5d-5tg2l" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6b449fff5d--5tg2l-eth0" Jul 9 13:02:53.295152 containerd[1624]: time="2025-07-09T13:02:53.295129926Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-697d57c56-9g4vj,Uid:385d03ba-5657-4ba7-b837-dbc26e99cf88,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"3df2e21d2cd0df8f5a6a38ee768dfd7235809fbb3cbba878d91632958553e253\"" Jul 9 13:02:53.339103 containerd[1624]: time="2025-07-09T13:02:53.339036886Z" level=info msg="connecting to shim 1bd05729b37a993e1c7f1b72cd7ae537a28f0c63bbe2eae9d9a8a2b166980793" address="unix:///run/containerd/s/9ad0037420dbce4fc11e8f71cb090e520cc10ddd671b0e351f58ee79eb35fd19" namespace=k8s.io protocol=ttrpc version=3 Jul 9 13:02:53.363770 systemd[1]: Started cri-containerd-1bd05729b37a993e1c7f1b72cd7ae537a28f0c63bbe2eae9d9a8a2b166980793.scope - libcontainer container 1bd05729b37a993e1c7f1b72cd7ae537a28f0c63bbe2eae9d9a8a2b166980793. 
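The IPAM entries above show Calico confirming block affinity for 192.168.88.128/26 on this host and then claiming 192.168.88.134 from that block for the calico-kube-controllers pod. As a minimal illustration of what that claim means (not the Calico code path itself), the following Go sketch checks that the claimed addresses fall inside the affine block, using only the standard library:

```go
package main

import (
	"fmt"
	"net/netip"
)

func main() {
	// Block for which the host "localhost" holds an affinity, per the log.
	block := netip.MustParsePrefix("192.168.88.128/26")

	// Addresses the log shows being claimed from that block.
	claimed := []string{"192.168.88.134", "192.168.88.135", "192.168.88.136"}

	for _, s := range claimed {
		addr := netip.MustParseAddr(s)
		fmt.Printf("%s in %s: %v\n", addr, block, block.Contains(addr))
	}

	// A /26 holds 64 addresses, so this block covers .128 through .191.
	fmt.Println("block size:", 1<<(32-block.Bits()))
}
```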
Jul 9 13:02:53.375288 systemd-resolved[1511]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jul 9 13:02:53.402965 containerd[1624]: time="2025-07-09T13:02:53.402943898Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6b449fff5d-5tg2l,Uid:0eabfdb8-41ba-4268-b70a-42c305153461,Namespace:calico-system,Attempt:0,} returns sandbox id \"1bd05729b37a993e1c7f1b72cd7ae537a28f0c63bbe2eae9d9a8a2b166980793\"" Jul 9 13:02:53.707801 systemd-networkd[1509]: calid97f043feea: Gained IPv6LL Jul 9 13:02:53.771983 systemd-networkd[1509]: vxlan.calico: Gained IPv6LL Jul 9 13:02:54.303934 containerd[1624]: time="2025-07-09T13:02:54.303474365Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:02:54.303934 containerd[1624]: time="2025-07-09T13:02:54.303888143Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.2: active requests=0, bytes read=8759190" Jul 9 13:02:54.303934 containerd[1624]: time="2025-07-09T13:02:54.303909729Z" level=info msg="ImageCreate event name:\"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:02:54.307957 containerd[1624]: time="2025-07-09T13:02:54.307942277Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:02:54.308160 containerd[1624]: time="2025-07-09T13:02:54.308143534Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.2\" with image id \"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\", size \"10251893\" in 2.638644289s" Jul 9 13:02:54.308188 containerd[1624]: time="2025-07-09T13:02:54.308162459Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\" returns image reference \"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\"" Jul 9 13:02:54.308949 containerd[1624]: time="2025-07-09T13:02:54.308931274Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\"" Jul 9 13:02:54.310129 containerd[1624]: time="2025-07-09T13:02:54.309591073Z" level=info msg="CreateContainer within sandbox \"250da739b1c00333fdb2894d229749ae4aed8b4def92a0f69826accf7dd61740\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Jul 9 13:02:54.318813 containerd[1624]: time="2025-07-09T13:02:54.318784996Z" level=info msg="Container 9fec4aa8261d1b1ca1fb47cb801dac9d53e386211b8fde3ed97309a78316a1d8: CDI devices from CRI Config.CDIDevices: []" Jul 9 13:02:54.321217 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount252558529.mount: Deactivated successfully. 
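Several entries here record systemd-networkd marking the Calico interfaces (calid97f043feea, vxlan.calico, and shortly afterwards calic4f9e17e3ac and cali24e85249a01) as having gained an IPv6 link-local address ("Gained IPv6LL"). A small stdlib-only sketch that spot-checks the same condition from Go; the interface-name prefixes are simply the ones appearing in this log:

```go
package main

import (
	"fmt"
	"net"
	"strings"
)

func main() {
	ifaces, err := net.Interfaces()
	if err != nil {
		panic(err)
	}
	for _, ifc := range ifaces {
		// Only look at Calico-managed interfaces, as named in the log.
		if !strings.HasPrefix(ifc.Name, "cali") && ifc.Name != "vxlan.calico" {
			continue
		}
		addrs, err := ifc.Addrs()
		if err != nil {
			continue
		}
		for _, a := range addrs {
			ipnet, ok := a.(*net.IPNet)
			if !ok {
				continue
			}
			// fe80::/10 link-local addresses are what "Gained IPv6LL" refers to.
			if ipnet.IP.To4() == nil && ipnet.IP.IsLinkLocalUnicast() {
				fmt.Printf("%s has IPv6LL %s\n", ifc.Name, ipnet.IP)
			}
		}
	}
}
```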
Jul 9 13:02:54.337282 containerd[1624]: time="2025-07-09T13:02:54.337217968Z" level=info msg="CreateContainer within sandbox \"250da739b1c00333fdb2894d229749ae4aed8b4def92a0f69826accf7dd61740\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"9fec4aa8261d1b1ca1fb47cb801dac9d53e386211b8fde3ed97309a78316a1d8\"" Jul 9 13:02:54.337915 containerd[1624]: time="2025-07-09T13:02:54.337801854Z" level=info msg="StartContainer for \"9fec4aa8261d1b1ca1fb47cb801dac9d53e386211b8fde3ed97309a78316a1d8\"" Jul 9 13:02:54.339193 containerd[1624]: time="2025-07-09T13:02:54.339096674Z" level=info msg="connecting to shim 9fec4aa8261d1b1ca1fb47cb801dac9d53e386211b8fde3ed97309a78316a1d8" address="unix:///run/containerd/s/3a910947b21a77117dd186f194ead8f8e1e8626f6c4e9f079e8decd152c71caa" protocol=ttrpc version=3 Jul 9 13:02:54.356774 systemd[1]: Started cri-containerd-9fec4aa8261d1b1ca1fb47cb801dac9d53e386211b8fde3ed97309a78316a1d8.scope - libcontainer container 9fec4aa8261d1b1ca1fb47cb801dac9d53e386211b8fde3ed97309a78316a1d8. Jul 9 13:02:54.406339 containerd[1624]: time="2025-07-09T13:02:54.406318481Z" level=info msg="StartContainer for \"9fec4aa8261d1b1ca1fb47cb801dac9d53e386211b8fde3ed97309a78316a1d8\" returns successfully" Jul 9 13:02:54.539838 systemd-networkd[1509]: calic4f9e17e3ac: Gained IPv6LL Jul 9 13:02:54.795773 systemd-networkd[1509]: cali24e85249a01: Gained IPv6LL Jul 9 13:02:54.902591 containerd[1624]: time="2025-07-09T13:02:54.902240719Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-bpjdp,Uid:e7f4ec8f-6937-4878-a69c-40587e0b905d,Namespace:kube-system,Attempt:0,}" Jul 9 13:02:54.902591 containerd[1624]: time="2025-07-09T13:02:54.902240851Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-4c2t7,Uid:852b8fad-39a8-4dd8-ae90-8162593e4973,Namespace:kube-system,Attempt:0,}" Jul 9 13:02:54.924138 systemd-networkd[1509]: cali965d39b4d82: Gained IPv6LL Jul 9 13:02:54.984953 systemd-networkd[1509]: caliea4a757de68: Link UP Jul 9 13:02:54.986772 systemd-networkd[1509]: caliea4a757de68: Gained carrier Jul 9 13:02:54.998499 containerd[1624]: 2025-07-09 13:02:54.937 [INFO][4818] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--bpjdp-eth0 coredns-668d6bf9bc- kube-system e7f4ec8f-6937-4878-a69c-40587e0b905d 864 0 2025-07-09 13:02:04 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-bpjdp eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] caliea4a757de68 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="f5e1d4f08bd04e26628cb66fa7a4ac4261fc0d26912b2e259e4825bf125659dd" Namespace="kube-system" Pod="coredns-668d6bf9bc-bpjdp" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--bpjdp-" Jul 9 13:02:54.998499 containerd[1624]: 2025-07-09 13:02:54.937 [INFO][4818] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f5e1d4f08bd04e26628cb66fa7a4ac4261fc0d26912b2e259e4825bf125659dd" Namespace="kube-system" Pod="coredns-668d6bf9bc-bpjdp" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--bpjdp-eth0" Jul 9 13:02:54.998499 containerd[1624]: 2025-07-09 13:02:54.958 [INFO][4846] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f5e1d4f08bd04e26628cb66fa7a4ac4261fc0d26912b2e259e4825bf125659dd" 
HandleID="k8s-pod-network.f5e1d4f08bd04e26628cb66fa7a4ac4261fc0d26912b2e259e4825bf125659dd" Workload="localhost-k8s-coredns--668d6bf9bc--bpjdp-eth0" Jul 9 13:02:54.998637 containerd[1624]: 2025-07-09 13:02:54.958 [INFO][4846] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f5e1d4f08bd04e26628cb66fa7a4ac4261fc0d26912b2e259e4825bf125659dd" HandleID="k8s-pod-network.f5e1d4f08bd04e26628cb66fa7a4ac4261fc0d26912b2e259e4825bf125659dd" Workload="localhost-k8s-coredns--668d6bf9bc--bpjdp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ad4a0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-bpjdp", "timestamp":"2025-07-09 13:02:54.958604464 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 9 13:02:54.998637 containerd[1624]: 2025-07-09 13:02:54.958 [INFO][4846] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 9 13:02:54.998637 containerd[1624]: 2025-07-09 13:02:54.958 [INFO][4846] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 9 13:02:54.998637 containerd[1624]: 2025-07-09 13:02:54.958 [INFO][4846] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jul 9 13:02:54.998637 containerd[1624]: 2025-07-09 13:02:54.963 [INFO][4846] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f5e1d4f08bd04e26628cb66fa7a4ac4261fc0d26912b2e259e4825bf125659dd" host="localhost" Jul 9 13:02:54.998637 containerd[1624]: 2025-07-09 13:02:54.966 [INFO][4846] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Jul 9 13:02:54.998637 containerd[1624]: 2025-07-09 13:02:54.968 [INFO][4846] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Jul 9 13:02:54.998637 containerd[1624]: 2025-07-09 13:02:54.969 [INFO][4846] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jul 9 13:02:54.998637 containerd[1624]: 2025-07-09 13:02:54.971 [INFO][4846] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jul 9 13:02:54.998637 containerd[1624]: 2025-07-09 13:02:54.971 [INFO][4846] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.f5e1d4f08bd04e26628cb66fa7a4ac4261fc0d26912b2e259e4825bf125659dd" host="localhost" Jul 9 13:02:54.998966 containerd[1624]: 2025-07-09 13:02:54.972 [INFO][4846] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f5e1d4f08bd04e26628cb66fa7a4ac4261fc0d26912b2e259e4825bf125659dd Jul 9 13:02:54.998966 containerd[1624]: 2025-07-09 13:02:54.975 [INFO][4846] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.f5e1d4f08bd04e26628cb66fa7a4ac4261fc0d26912b2e259e4825bf125659dd" host="localhost" Jul 9 13:02:54.998966 containerd[1624]: 2025-07-09 13:02:54.978 [INFO][4846] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.f5e1d4f08bd04e26628cb66fa7a4ac4261fc0d26912b2e259e4825bf125659dd" host="localhost" Jul 9 13:02:54.998966 containerd[1624]: 2025-07-09 13:02:54.978 [INFO][4846] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.f5e1d4f08bd04e26628cb66fa7a4ac4261fc0d26912b2e259e4825bf125659dd" host="localhost" Jul 9 13:02:54.998966 
containerd[1624]: 2025-07-09 13:02:54.978 [INFO][4846] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 9 13:02:54.998966 containerd[1624]: 2025-07-09 13:02:54.978 [INFO][4846] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="f5e1d4f08bd04e26628cb66fa7a4ac4261fc0d26912b2e259e4825bf125659dd" HandleID="k8s-pod-network.f5e1d4f08bd04e26628cb66fa7a4ac4261fc0d26912b2e259e4825bf125659dd" Workload="localhost-k8s-coredns--668d6bf9bc--bpjdp-eth0" Jul 9 13:02:54.999081 containerd[1624]: 2025-07-09 13:02:54.980 [INFO][4818] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f5e1d4f08bd04e26628cb66fa7a4ac4261fc0d26912b2e259e4825bf125659dd" Namespace="kube-system" Pod="coredns-668d6bf9bc-bpjdp" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--bpjdp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--bpjdp-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"e7f4ec8f-6937-4878-a69c-40587e0b905d", ResourceVersion:"864", Generation:0, CreationTimestamp:time.Date(2025, time.July, 9, 13, 2, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-bpjdp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliea4a757de68", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 9 13:02:54.999227 containerd[1624]: 2025-07-09 13:02:54.980 [INFO][4818] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="f5e1d4f08bd04e26628cb66fa7a4ac4261fc0d26912b2e259e4825bf125659dd" Namespace="kube-system" Pod="coredns-668d6bf9bc-bpjdp" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--bpjdp-eth0" Jul 9 13:02:54.999227 containerd[1624]: 2025-07-09 13:02:54.980 [INFO][4818] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliea4a757de68 ContainerID="f5e1d4f08bd04e26628cb66fa7a4ac4261fc0d26912b2e259e4825bf125659dd" Namespace="kube-system" Pod="coredns-668d6bf9bc-bpjdp" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--bpjdp-eth0" Jul 9 13:02:54.999227 containerd[1624]: 2025-07-09 13:02:54.986 [INFO][4818] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f5e1d4f08bd04e26628cb66fa7a4ac4261fc0d26912b2e259e4825bf125659dd" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-bpjdp" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--bpjdp-eth0" Jul 9 13:02:54.999428 containerd[1624]: 2025-07-09 13:02:54.987 [INFO][4818] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f5e1d4f08bd04e26628cb66fa7a4ac4261fc0d26912b2e259e4825bf125659dd" Namespace="kube-system" Pod="coredns-668d6bf9bc-bpjdp" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--bpjdp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--bpjdp-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"e7f4ec8f-6937-4878-a69c-40587e0b905d", ResourceVersion:"864", Generation:0, CreationTimestamp:time.Date(2025, time.July, 9, 13, 2, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"f5e1d4f08bd04e26628cb66fa7a4ac4261fc0d26912b2e259e4825bf125659dd", Pod:"coredns-668d6bf9bc-bpjdp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliea4a757de68", MAC:"8a:59:08:cb:8a:2c", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 9 13:02:54.999428 containerd[1624]: 2025-07-09 13:02:54.991 [INFO][4818] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f5e1d4f08bd04e26628cb66fa7a4ac4261fc0d26912b2e259e4825bf125659dd" Namespace="kube-system" Pod="coredns-668d6bf9bc-bpjdp" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--bpjdp-eth0" Jul 9 13:02:55.018509 containerd[1624]: time="2025-07-09T13:02:55.018242262Z" level=info msg="connecting to shim f5e1d4f08bd04e26628cb66fa7a4ac4261fc0d26912b2e259e4825bf125659dd" address="unix:///run/containerd/s/2a4ff8cd163c8dddc826e43ee60382551e2f3401e516ac99b427a91609479fa4" namespace=k8s.io protocol=ttrpc version=3 Jul 9 13:02:55.043769 systemd[1]: Started cri-containerd-f5e1d4f08bd04e26628cb66fa7a4ac4261fc0d26912b2e259e4825bf125659dd.scope - libcontainer container f5e1d4f08bd04e26628cb66fa7a4ac4261fc0d26912b2e259e4825bf125659dd. 
Jul 9 13:02:55.056031 systemd-resolved[1511]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jul 9 13:02:55.091619 systemd-networkd[1509]: cali915a1944849: Link UP Jul 9 13:02:55.092984 systemd-networkd[1509]: cali915a1944849: Gained carrier Jul 9 13:02:55.095509 containerd[1624]: time="2025-07-09T13:02:55.095490135Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-bpjdp,Uid:e7f4ec8f-6937-4878-a69c-40587e0b905d,Namespace:kube-system,Attempt:0,} returns sandbox id \"f5e1d4f08bd04e26628cb66fa7a4ac4261fc0d26912b2e259e4825bf125659dd\"" Jul 9 13:02:55.098169 containerd[1624]: time="2025-07-09T13:02:55.098102682Z" level=info msg="CreateContainer within sandbox \"f5e1d4f08bd04e26628cb66fa7a4ac4261fc0d26912b2e259e4825bf125659dd\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jul 9 13:02:55.107768 containerd[1624]: 2025-07-09 13:02:54.936 [INFO][4824] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--4c2t7-eth0 coredns-668d6bf9bc- kube-system 852b8fad-39a8-4dd8-ae90-8162593e4973 862 0 2025-07-09 13:02:04 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-4c2t7 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali915a1944849 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="e9e374c5d1af0ef369a8ecd6b824cbe1a2e8dd63fed693125f2c8fb874388b0c" Namespace="kube-system" Pod="coredns-668d6bf9bc-4c2t7" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--4c2t7-" Jul 9 13:02:55.107768 containerd[1624]: 2025-07-09 13:02:54.936 [INFO][4824] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e9e374c5d1af0ef369a8ecd6b824cbe1a2e8dd63fed693125f2c8fb874388b0c" Namespace="kube-system" Pod="coredns-668d6bf9bc-4c2t7" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--4c2t7-eth0" Jul 9 13:02:55.107768 containerd[1624]: 2025-07-09 13:02:54.965 [INFO][4844] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e9e374c5d1af0ef369a8ecd6b824cbe1a2e8dd63fed693125f2c8fb874388b0c" HandleID="k8s-pod-network.e9e374c5d1af0ef369a8ecd6b824cbe1a2e8dd63fed693125f2c8fb874388b0c" Workload="localhost-k8s-coredns--668d6bf9bc--4c2t7-eth0" Jul 9 13:02:55.107768 containerd[1624]: 2025-07-09 13:02:54.965 [INFO][4844] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e9e374c5d1af0ef369a8ecd6b824cbe1a2e8dd63fed693125f2c8fb874388b0c" HandleID="k8s-pod-network.e9e374c5d1af0ef369a8ecd6b824cbe1a2e8dd63fed693125f2c8fb874388b0c" Workload="localhost-k8s-coredns--668d6bf9bc--4c2t7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5100), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-4c2t7", "timestamp":"2025-07-09 13:02:54.965880007 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 9 13:02:55.107768 containerd[1624]: 2025-07-09 13:02:54.965 [INFO][4844] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Jul 9 13:02:55.107768 containerd[1624]: 2025-07-09 13:02:54.978 [INFO][4844] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 9 13:02:55.107768 containerd[1624]: 2025-07-09 13:02:54.978 [INFO][4844] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jul 9 13:02:55.107768 containerd[1624]: 2025-07-09 13:02:55.064 [INFO][4844] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e9e374c5d1af0ef369a8ecd6b824cbe1a2e8dd63fed693125f2c8fb874388b0c" host="localhost" Jul 9 13:02:55.107768 containerd[1624]: 2025-07-09 13:02:55.067 [INFO][4844] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Jul 9 13:02:55.107768 containerd[1624]: 2025-07-09 13:02:55.069 [INFO][4844] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Jul 9 13:02:55.107768 containerd[1624]: 2025-07-09 13:02:55.070 [INFO][4844] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jul 9 13:02:55.107768 containerd[1624]: 2025-07-09 13:02:55.071 [INFO][4844] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jul 9 13:02:55.107768 containerd[1624]: 2025-07-09 13:02:55.071 [INFO][4844] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.e9e374c5d1af0ef369a8ecd6b824cbe1a2e8dd63fed693125f2c8fb874388b0c" host="localhost" Jul 9 13:02:55.107768 containerd[1624]: 2025-07-09 13:02:55.072 [INFO][4844] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e9e374c5d1af0ef369a8ecd6b824cbe1a2e8dd63fed693125f2c8fb874388b0c Jul 9 13:02:55.107768 containerd[1624]: 2025-07-09 13:02:55.074 [INFO][4844] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.e9e374c5d1af0ef369a8ecd6b824cbe1a2e8dd63fed693125f2c8fb874388b0c" host="localhost" Jul 9 13:02:55.107768 containerd[1624]: 2025-07-09 13:02:55.078 [INFO][4844] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 handle="k8s-pod-network.e9e374c5d1af0ef369a8ecd6b824cbe1a2e8dd63fed693125f2c8fb874388b0c" host="localhost" Jul 9 13:02:55.107768 containerd[1624]: 2025-07-09 13:02:55.078 [INFO][4844] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.e9e374c5d1af0ef369a8ecd6b824cbe1a2e8dd63fed693125f2c8fb874388b0c" host="localhost" Jul 9 13:02:55.107768 containerd[1624]: 2025-07-09 13:02:55.078 [INFO][4844] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
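Each IPAM request in these entries is bracketed by "About to acquire host-wide IPAM lock" and "Released host-wide IPAM lock", so concurrent CNI ADDs on the same node assign addresses one at a time. The sketch below is only a conceptual stand-in for that pattern (a mutex-guarded allocator over a /26 block); it is not Calico's actual locking, datastore, or handle logic, and the handle strings are abbreviated placeholders:

```go
package main

import (
	"fmt"
	"net/netip"
	"sync"
)

// blockAllocator hands out addresses from one affine block, serialized by a
// lock, mirroring the acquire -> assign -> release sequence in the log.
type blockAllocator struct {
	mu    sync.Mutex // stand-in for the host-wide IPAM lock
	block netip.Prefix
	next  netip.Addr
	used  map[netip.Addr]string // address -> handle
}

func newBlockAllocator(cidr string) *blockAllocator {
	p := netip.MustParsePrefix(cidr)
	return &blockAllocator{block: p, next: p.Addr(), used: map[netip.Addr]string{}}
}

func (b *blockAllocator) assign(handle string) (netip.Addr, bool) {
	b.mu.Lock()
	defer b.mu.Unlock() // corresponds to "Released host-wide IPAM lock"
	for a := b.next; b.block.Contains(a); a = a.Next() {
		if _, taken := b.used[a]; !taken {
			b.used[a] = handle
			b.next = a.Next()
			return a, true
		}
	}
	return netip.Addr{}, false // block exhausted
}

func main() {
	alloc := newBlockAllocator("192.168.88.128/26")
	// Abbreviated stand-ins for the k8s-pod-network.<containerID> handles.
	for _, h := range []string{"k8s-pod-network.handleA", "k8s-pod-network.handleB"} {
		if a, ok := alloc.assign(h); ok {
			fmt.Printf("assigned %s to %s\n", a, h)
		}
	}
}
```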
Jul 9 13:02:55.107768 containerd[1624]: 2025-07-09 13:02:55.078 [INFO][4844] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="e9e374c5d1af0ef369a8ecd6b824cbe1a2e8dd63fed693125f2c8fb874388b0c" HandleID="k8s-pod-network.e9e374c5d1af0ef369a8ecd6b824cbe1a2e8dd63fed693125f2c8fb874388b0c" Workload="localhost-k8s-coredns--668d6bf9bc--4c2t7-eth0" Jul 9 13:02:55.108966 containerd[1624]: 2025-07-09 13:02:55.081 [INFO][4824] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e9e374c5d1af0ef369a8ecd6b824cbe1a2e8dd63fed693125f2c8fb874388b0c" Namespace="kube-system" Pod="coredns-668d6bf9bc-4c2t7" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--4c2t7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--4c2t7-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"852b8fad-39a8-4dd8-ae90-8162593e4973", ResourceVersion:"862", Generation:0, CreationTimestamp:time.Date(2025, time.July, 9, 13, 2, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-4c2t7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali915a1944849", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 9 13:02:55.108966 containerd[1624]: 2025-07-09 13:02:55.082 [INFO][4824] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="e9e374c5d1af0ef369a8ecd6b824cbe1a2e8dd63fed693125f2c8fb874388b0c" Namespace="kube-system" Pod="coredns-668d6bf9bc-4c2t7" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--4c2t7-eth0" Jul 9 13:02:55.108966 containerd[1624]: 2025-07-09 13:02:55.082 [INFO][4824] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali915a1944849 ContainerID="e9e374c5d1af0ef369a8ecd6b824cbe1a2e8dd63fed693125f2c8fb874388b0c" Namespace="kube-system" Pod="coredns-668d6bf9bc-4c2t7" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--4c2t7-eth0" Jul 9 13:02:55.108966 containerd[1624]: 2025-07-09 13:02:55.093 [INFO][4824] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e9e374c5d1af0ef369a8ecd6b824cbe1a2e8dd63fed693125f2c8fb874388b0c" Namespace="kube-system" Pod="coredns-668d6bf9bc-4c2t7" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--4c2t7-eth0" Jul 9 13:02:55.108966 containerd[1624]: 
2025-07-09 13:02:55.095 [INFO][4824] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e9e374c5d1af0ef369a8ecd6b824cbe1a2e8dd63fed693125f2c8fb874388b0c" Namespace="kube-system" Pod="coredns-668d6bf9bc-4c2t7" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--4c2t7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--4c2t7-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"852b8fad-39a8-4dd8-ae90-8162593e4973", ResourceVersion:"862", Generation:0, CreationTimestamp:time.Date(2025, time.July, 9, 13, 2, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"e9e374c5d1af0ef369a8ecd6b824cbe1a2e8dd63fed693125f2c8fb874388b0c", Pod:"coredns-668d6bf9bc-4c2t7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali915a1944849", MAC:"5e:89:0e:aa:93:dd", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 9 13:02:55.108966 containerd[1624]: 2025-07-09 13:02:55.102 [INFO][4824] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e9e374c5d1af0ef369a8ecd6b824cbe1a2e8dd63fed693125f2c8fb874388b0c" Namespace="kube-system" Pod="coredns-668d6bf9bc-4c2t7" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--4c2t7-eth0" Jul 9 13:02:55.109873 containerd[1624]: time="2025-07-09T13:02:55.109840035Z" level=info msg="Container 5db85f17e2325e1b9e7081b2436e0345156944d97e7474aad7f5a417afbb2885: CDI devices from CRI Config.CDIDevices: []" Jul 9 13:02:55.117006 containerd[1624]: time="2025-07-09T13:02:55.116971186Z" level=info msg="CreateContainer within sandbox \"f5e1d4f08bd04e26628cb66fa7a4ac4261fc0d26912b2e259e4825bf125659dd\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"5db85f17e2325e1b9e7081b2436e0345156944d97e7474aad7f5a417afbb2885\"" Jul 9 13:02:55.117731 containerd[1624]: time="2025-07-09T13:02:55.117580496Z" level=info msg="StartContainer for \"5db85f17e2325e1b9e7081b2436e0345156944d97e7474aad7f5a417afbb2885\"" Jul 9 13:02:55.118593 containerd[1624]: time="2025-07-09T13:02:55.118323109Z" level=info msg="connecting to shim 5db85f17e2325e1b9e7081b2436e0345156944d97e7474aad7f5a417afbb2885" address="unix:///run/containerd/s/2a4ff8cd163c8dddc826e43ee60382551e2f3401e516ac99b427a91609479fa4" protocol=ttrpc version=3 Jul 9 13:02:55.130099 
containerd[1624]: time="2025-07-09T13:02:55.130044562Z" level=info msg="connecting to shim e9e374c5d1af0ef369a8ecd6b824cbe1a2e8dd63fed693125f2c8fb874388b0c" address="unix:///run/containerd/s/e42d2ae98179ba8d98496579a2d1bfa2e04b59322ac59e0956e3f3ec44833154" namespace=k8s.io protocol=ttrpc version=3 Jul 9 13:02:55.135158 systemd[1]: Started cri-containerd-5db85f17e2325e1b9e7081b2436e0345156944d97e7474aad7f5a417afbb2885.scope - libcontainer container 5db85f17e2325e1b9e7081b2436e0345156944d97e7474aad7f5a417afbb2885. Jul 9 13:02:55.150822 systemd[1]: Started cri-containerd-e9e374c5d1af0ef369a8ecd6b824cbe1a2e8dd63fed693125f2c8fb874388b0c.scope - libcontainer container e9e374c5d1af0ef369a8ecd6b824cbe1a2e8dd63fed693125f2c8fb874388b0c. Jul 9 13:02:55.165514 systemd-resolved[1511]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jul 9 13:02:55.175202 containerd[1624]: time="2025-07-09T13:02:55.175179347Z" level=info msg="StartContainer for \"5db85f17e2325e1b9e7081b2436e0345156944d97e7474aad7f5a417afbb2885\" returns successfully" Jul 9 13:02:55.201748 containerd[1624]: time="2025-07-09T13:02:55.201715984Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-4c2t7,Uid:852b8fad-39a8-4dd8-ae90-8162593e4973,Namespace:kube-system,Attempt:0,} returns sandbox id \"e9e374c5d1af0ef369a8ecd6b824cbe1a2e8dd63fed693125f2c8fb874388b0c\"" Jul 9 13:02:55.203764 containerd[1624]: time="2025-07-09T13:02:55.203737124Z" level=info msg="CreateContainer within sandbox \"e9e374c5d1af0ef369a8ecd6b824cbe1a2e8dd63fed693125f2c8fb874388b0c\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jul 9 13:02:55.210403 containerd[1624]: time="2025-07-09T13:02:55.210383115Z" level=info msg="Container 9fd51e3835f1a1c276aa6fe5ed8137b23ea71fc2a45010aea7903f3f37bd89e7: CDI devices from CRI Config.CDIDevices: []" Jul 9 13:02:55.219927 containerd[1624]: time="2025-07-09T13:02:55.219906279Z" level=info msg="CreateContainer within sandbox \"e9e374c5d1af0ef369a8ecd6b824cbe1a2e8dd63fed693125f2c8fb874388b0c\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"9fd51e3835f1a1c276aa6fe5ed8137b23ea71fc2a45010aea7903f3f37bd89e7\"" Jul 9 13:02:55.220497 containerd[1624]: time="2025-07-09T13:02:55.220222005Z" level=info msg="StartContainer for \"9fd51e3835f1a1c276aa6fe5ed8137b23ea71fc2a45010aea7903f3f37bd89e7\"" Jul 9 13:02:55.221120 containerd[1624]: time="2025-07-09T13:02:55.221108299Z" level=info msg="connecting to shim 9fd51e3835f1a1c276aa6fe5ed8137b23ea71fc2a45010aea7903f3f37bd89e7" address="unix:///run/containerd/s/e42d2ae98179ba8d98496579a2d1bfa2e04b59322ac59e0956e3f3ec44833154" protocol=ttrpc version=3 Jul 9 13:02:55.235769 systemd[1]: Started cri-containerd-9fd51e3835f1a1c276aa6fe5ed8137b23ea71fc2a45010aea7903f3f37bd89e7.scope - libcontainer container 9fd51e3835f1a1c276aa6fe5ed8137b23ea71fc2a45010aea7903f3f37bd89e7. 
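The repeated "connecting to shim ... address=unix:///run/containerd/s/<hash> protocol=ttrpc version=3" entries describe containerd attaching to each sandbox's shim over a unix socket that speaks ttrpc. The snippet below only demonstrates reaching such a socket path with the standard library; it does not implement the ttrpc protocol, and the path is copied from the log purely as an example:

```go
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	// Shim socket path as printed in the "connecting to shim" entries above.
	const sock = "/run/containerd/s/e42d2ae98179ba8d98496579a2d1bfa2e04b59322ac59e0956e3f3ec44833154"

	conn, err := net.DialTimeout("unix", sock, 2*time.Second)
	if err != nil {
		fmt.Println("shim socket not reachable:", err)
		return
	}
	defer conn.Close()
	fmt.Println("connected to shim socket at", sock)
	// A real client would now run the ttrpc protocol over this connection.
}
```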
Jul 9 13:02:55.254639 containerd[1624]: time="2025-07-09T13:02:55.254605565Z" level=info msg="StartContainer for \"9fd51e3835f1a1c276aa6fe5ed8137b23ea71fc2a45010aea7903f3f37bd89e7\" returns successfully" Jul 9 13:02:55.306612 kubelet[2942]: I0709 13:02:55.304765 2942 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-4c2t7" podStartSLOduration=51.304748995 podStartE2EDuration="51.304748995s" podCreationTimestamp="2025-07-09 13:02:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-09 13:02:55.304455259 +0000 UTC m=+58.568220771" watchObservedRunningTime="2025-07-09 13:02:55.304748995 +0000 UTC m=+58.568514507" Jul 9 13:02:55.315608 kubelet[2942]: I0709 13:02:55.315063 2942 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-bpjdp" podStartSLOduration=51.315050518 podStartE2EDuration="51.315050518s" podCreationTimestamp="2025-07-09 13:02:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-09 13:02:55.313725151 +0000 UTC m=+58.577490671" watchObservedRunningTime="2025-07-09 13:02:55.315050518 +0000 UTC m=+58.578816038" Jul 9 13:02:56.523777 systemd-networkd[1509]: caliea4a757de68: Gained IPv6LL Jul 9 13:02:56.908774 systemd-networkd[1509]: cali915a1944849: Gained IPv6LL Jul 9 13:02:57.446077 containerd[1624]: time="2025-07-09T13:02:57.445626628Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:02:57.446077 containerd[1624]: time="2025-07-09T13:02:57.445997943Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.2: active requests=0, bytes read=4661207" Jul 9 13:02:57.446077 containerd[1624]: time="2025-07-09T13:02:57.446043370Z" level=info msg="ImageCreate event name:\"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:02:57.447011 containerd[1624]: time="2025-07-09T13:02:57.446999528Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:02:57.447507 containerd[1624]: time="2025-07-09T13:02:57.447370619Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.2\" with image id \"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\", size \"6153902\" in 3.138423602s" Jul 9 13:02:57.447507 containerd[1624]: time="2025-07-09T13:02:57.447388663Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\" returns image reference \"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\"" Jul 9 13:02:57.448089 containerd[1624]: time="2025-07-09T13:02:57.448077259Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\"" Jul 9 13:02:57.450489 containerd[1624]: time="2025-07-09T13:02:57.450008155Z" level=info msg="CreateContainer within sandbox \"e9e6499833712897826778f96af0b4fd5c1b77b97bac2127f1a9b5a21428ee71\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Jul 9 13:02:57.461706 
containerd[1624]: time="2025-07-09T13:02:57.461682511Z" level=info msg="Container ed264edefeb9266a334610b332b4f65d057a24329691c8c06182aba067318e11: CDI devices from CRI Config.CDIDevices: []" Jul 9 13:02:57.490109 containerd[1624]: time="2025-07-09T13:02:57.490088783Z" level=info msg="CreateContainer within sandbox \"e9e6499833712897826778f96af0b4fd5c1b77b97bac2127f1a9b5a21428ee71\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"ed264edefeb9266a334610b332b4f65d057a24329691c8c06182aba067318e11\"" Jul 9 13:02:57.490540 containerd[1624]: time="2025-07-09T13:02:57.490523155Z" level=info msg="StartContainer for \"ed264edefeb9266a334610b332b4f65d057a24329691c8c06182aba067318e11\"" Jul 9 13:02:57.491764 containerd[1624]: time="2025-07-09T13:02:57.491225452Z" level=info msg="connecting to shim ed264edefeb9266a334610b332b4f65d057a24329691c8c06182aba067318e11" address="unix:///run/containerd/s/9fcb92d4052e6c5a8523efae7a892a4dc08b540cbb6b3423838347dc165b0afd" protocol=ttrpc version=3 Jul 9 13:02:57.511809 systemd[1]: Started cri-containerd-ed264edefeb9266a334610b332b4f65d057a24329691c8c06182aba067318e11.scope - libcontainer container ed264edefeb9266a334610b332b4f65d057a24329691c8c06182aba067318e11. Jul 9 13:02:57.547966 containerd[1624]: time="2025-07-09T13:02:57.547917557Z" level=info msg="StartContainer for \"ed264edefeb9266a334610b332b4f65d057a24329691c8c06182aba067318e11\" returns successfully" Jul 9 13:03:00.555088 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2556554881.mount: Deactivated successfully. Jul 9 13:03:01.109416 containerd[1624]: time="2025-07-09T13:03:01.109388121Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:03:01.111605 containerd[1624]: time="2025-07-09T13:03:01.111590657Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.2: active requests=0, bytes read=66352308" Jul 9 13:03:01.123804 containerd[1624]: time="2025-07-09T13:03:01.123614096Z" level=info msg="ImageCreate event name:\"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:03:01.124689 containerd[1624]: time="2025-07-09T13:03:01.124623327Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:03:01.125277 containerd[1624]: time="2025-07-09T13:03:01.125066889Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" with image id \"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\", size \"66352154\" in 3.676974383s" Jul 9 13:03:01.125277 containerd[1624]: time="2025-07-09T13:03:01.125083451Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" returns image reference \"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\"" Jul 9 13:03:01.125977 containerd[1624]: time="2025-07-09T13:03:01.125763931Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Jul 9 13:03:01.127211 containerd[1624]: time="2025-07-09T13:03:01.127199318Z" level=info msg="CreateContainer within sandbox 
\"803e0577c26910cc76c1cba0bc8ca49527622bb067bf5369c166e091f8fc02fc\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Jul 9 13:03:01.139350 containerd[1624]: time="2025-07-09T13:03:01.139323318Z" level=info msg="Container 91c43d9828a72b315bc23df3650bc48caa348d059d9504578414d166525eec7f: CDI devices from CRI Config.CDIDevices: []" Jul 9 13:03:01.165996 containerd[1624]: time="2025-07-09T13:03:01.165936301Z" level=info msg="CreateContainer within sandbox \"803e0577c26910cc76c1cba0bc8ca49527622bb067bf5369c166e091f8fc02fc\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"91c43d9828a72b315bc23df3650bc48caa348d059d9504578414d166525eec7f\"" Jul 9 13:03:01.166747 containerd[1624]: time="2025-07-09T13:03:01.166361724Z" level=info msg="StartContainer for \"91c43d9828a72b315bc23df3650bc48caa348d059d9504578414d166525eec7f\"" Jul 9 13:03:01.167131 containerd[1624]: time="2025-07-09T13:03:01.167106818Z" level=info msg="connecting to shim 91c43d9828a72b315bc23df3650bc48caa348d059d9504578414d166525eec7f" address="unix:///run/containerd/s/db40ece796bf3da76bb06d714bfd41395d976e70bc26a3cc3b51f858a93d3229" protocol=ttrpc version=3 Jul 9 13:03:01.199839 systemd[1]: Started cri-containerd-91c43d9828a72b315bc23df3650bc48caa348d059d9504578414d166525eec7f.scope - libcontainer container 91c43d9828a72b315bc23df3650bc48caa348d059d9504578414d166525eec7f. Jul 9 13:03:01.247152 containerd[1624]: time="2025-07-09T13:03:01.247117676Z" level=info msg="StartContainer for \"91c43d9828a72b315bc23df3650bc48caa348d059d9504578414d166525eec7f\" returns successfully" Jul 9 13:03:01.434474 kubelet[2942]: I0709 13:03:01.431311 2942 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-768f4c5c69-2brrp" podStartSLOduration=35.400409377 podStartE2EDuration="44.431296961s" podCreationTimestamp="2025-07-09 13:02:17 +0000 UTC" firstStartedPulling="2025-07-09 13:02:52.094728132 +0000 UTC m=+55.358493641" lastFinishedPulling="2025-07-09 13:03:01.125615712 +0000 UTC m=+64.389381225" observedRunningTime="2025-07-09 13:03:01.430764641 +0000 UTC m=+64.694530160" watchObservedRunningTime="2025-07-09 13:03:01.431296961 +0000 UTC m=+64.695062475" Jul 9 13:03:01.503102 containerd[1624]: time="2025-07-09T13:03:01.503063462Z" level=info msg="TaskExit event in podsandbox handler container_id:\"91c43d9828a72b315bc23df3650bc48caa348d059d9504578414d166525eec7f\" id:\"1edd3b0884c84544ab4a3479fc75c3aca39aaad9ebe6b9fd02b6403c97350c0c\" pid:5147 exit_status:1 exited_at:{seconds:1752066181 nanos:468467755}" Jul 9 13:03:02.394695 containerd[1624]: time="2025-07-09T13:03:02.394662105Z" level=info msg="TaskExit event in podsandbox handler container_id:\"91c43d9828a72b315bc23df3650bc48caa348d059d9504578414d166525eec7f\" id:\"342656f312dac7a5bcf5c51970a36c4497b430b9d26c68d351f2404898f7a4e8\" pid:5171 exit_status:1 exited_at:{seconds:1752066182 nanos:394480335}" Jul 9 13:03:03.429805 containerd[1624]: time="2025-07-09T13:03:03.429776180Z" level=info msg="TaskExit event in podsandbox handler container_id:\"91c43d9828a72b315bc23df3650bc48caa348d059d9504578414d166525eec7f\" id:\"a85f423f74ad2f2cc28878ad6b290ac6fdc11bc02169c8cf35809f07eac28308\" pid:5199 exit_status:1 exited_at:{seconds:1752066183 nanos:428639856}" Jul 9 13:03:04.230519 containerd[1624]: time="2025-07-09T13:03:04.230410993Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:03:04.233330 containerd[1624]: 
time="2025-07-09T13:03:04.233293082Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=47317977" Jul 9 13:03:04.234967 containerd[1624]: time="2025-07-09T13:03:04.234949280Z" level=info msg="ImageCreate event name:\"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:03:04.236237 containerd[1624]: time="2025-07-09T13:03:04.236207057Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:03:04.236941 containerd[1624]: time="2025-07-09T13:03:04.236655093Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 3.11087538s" Jul 9 13:03:04.236941 containerd[1624]: time="2025-07-09T13:03:04.236687779Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Jul 9 13:03:04.237742 containerd[1624]: time="2025-07-09T13:03:04.237725579Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Jul 9 13:03:04.241458 containerd[1624]: time="2025-07-09T13:03:04.241415590Z" level=info msg="CreateContainer within sandbox \"6ebc7defbad936b49bda3b40957c4e31b41c0a471e7d12479f017f51b1670560\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jul 9 13:03:04.246474 containerd[1624]: time="2025-07-09T13:03:04.244710889Z" level=info msg="Container b0745b55c6b6644681fd635de04fcd01355c524eeff46c270b5044052317dd30: CDI devices from CRI Config.CDIDevices: []" Jul 9 13:03:04.249162 containerd[1624]: time="2025-07-09T13:03:04.248956974Z" level=info msg="CreateContainer within sandbox \"6ebc7defbad936b49bda3b40957c4e31b41c0a471e7d12479f017f51b1670560\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"b0745b55c6b6644681fd635de04fcd01355c524eeff46c270b5044052317dd30\"" Jul 9 13:03:04.250567 containerd[1624]: time="2025-07-09T13:03:04.249752598Z" level=info msg="StartContainer for \"b0745b55c6b6644681fd635de04fcd01355c524eeff46c270b5044052317dd30\"" Jul 9 13:03:04.250928 containerd[1624]: time="2025-07-09T13:03:04.250908116Z" level=info msg="connecting to shim b0745b55c6b6644681fd635de04fcd01355c524eeff46c270b5044052317dd30" address="unix:///run/containerd/s/b1c1efba5a728f71bbfb8b833f9c2f917377afce40192230b4131b52a0c04cea" protocol=ttrpc version=3 Jul 9 13:03:04.269795 systemd[1]: Started cri-containerd-b0745b55c6b6644681fd635de04fcd01355c524eeff46c270b5044052317dd30.scope - libcontainer container b0745b55c6b6644681fd635de04fcd01355c524eeff46c270b5044052317dd30. 
Jul 9 13:03:04.323329 containerd[1624]: time="2025-07-09T13:03:04.323250114Z" level=info msg="StartContainer for \"b0745b55c6b6644681fd635de04fcd01355c524eeff46c270b5044052317dd30\" returns successfully" Jul 9 13:03:05.257410 containerd[1624]: time="2025-07-09T13:03:05.257381435Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:03:05.260061 containerd[1624]: time="2025-07-09T13:03:05.260040647Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=77" Jul 9 13:03:05.268237 containerd[1624]: time="2025-07-09T13:03:05.268204965Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 1.030460002s" Jul 9 13:03:05.268237 containerd[1624]: time="2025-07-09T13:03:05.268232869Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Jul 9 13:03:05.269570 containerd[1624]: time="2025-07-09T13:03:05.269520199Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\"" Jul 9 13:03:05.270276 containerd[1624]: time="2025-07-09T13:03:05.270181194Z" level=info msg="CreateContainer within sandbox \"3df2e21d2cd0df8f5a6a38ee768dfd7235809fbb3cbba878d91632958553e253\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jul 9 13:03:05.275683 containerd[1624]: time="2025-07-09T13:03:05.274230714Z" level=info msg="Container 3bdc64b476486d5beaf5424bf42552f1fdc723e7f622bf017f9ccf7aede00a94: CDI devices from CRI Config.CDIDevices: []" Jul 9 13:03:05.286610 containerd[1624]: time="2025-07-09T13:03:05.286582269Z" level=info msg="CreateContainer within sandbox \"3df2e21d2cd0df8f5a6a38ee768dfd7235809fbb3cbba878d91632958553e253\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"3bdc64b476486d5beaf5424bf42552f1fdc723e7f622bf017f9ccf7aede00a94\"" Jul 9 13:03:05.287897 containerd[1624]: time="2025-07-09T13:03:05.286953172Z" level=info msg="StartContainer for \"3bdc64b476486d5beaf5424bf42552f1fdc723e7f622bf017f9ccf7aede00a94\"" Jul 9 13:03:05.287897 containerd[1624]: time="2025-07-09T13:03:05.287754517Z" level=info msg="connecting to shim 3bdc64b476486d5beaf5424bf42552f1fdc723e7f622bf017f9ccf7aede00a94" address="unix:///run/containerd/s/9c387cfbac0d6e63cd3cdf7baa5c58dbf0a17c7c89cdad9f11bfec1ed11acbd8" protocol=ttrpc version=3 Jul 9 13:03:05.307774 systemd[1]: Started cri-containerd-3bdc64b476486d5beaf5424bf42552f1fdc723e7f622bf017f9ccf7aede00a94.scope - libcontainer container 3bdc64b476486d5beaf5424bf42552f1fdc723e7f622bf017f9ccf7aede00a94. 
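Each "Pulled image ... in <duration>" entry (the apiserver pull above, and the csi, whisker, goldmane and kube-controllers pulls elsewhere in this log) pairs an RFC 3339 log timestamp with a Go-style duration string, so the approximate start of the pull can be recovered by subtraction. A small sketch using the apiserver values from this part of the log:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamp of the "Pulled image" entry and the duration it reports.
	finished, err := time.Parse(time.RFC3339Nano, "2025-07-09T13:03:05.268204965Z")
	if err != nil {
		panic(err)
	}
	took, err := time.ParseDuration("1.030460002s")
	if err != nil {
		panic(err)
	}
	// Prints a time just after the preceding PullImage request at 13:03:04.237.
	fmt.Println("pull started around:", finished.Add(-took).Format(time.RFC3339Nano))
}
```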
Jul 9 13:03:05.384207 containerd[1624]: time="2025-07-09T13:03:05.384167366Z" level=info msg="StartContainer for \"3bdc64b476486d5beaf5424bf42552f1fdc723e7f622bf017f9ccf7aede00a94\" returns successfully" Jul 9 13:03:05.401991 kubelet[2942]: I0709 13:03:05.401692 2942 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 9 13:03:05.406450 kubelet[2942]: I0709 13:03:05.406253 2942 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-697d57c56-499xq" podStartSLOduration=40.263522071 podStartE2EDuration="51.406239389s" podCreationTimestamp="2025-07-09 13:02:14 +0000 UTC" firstStartedPulling="2025-07-09 13:02:53.094888295 +0000 UTC m=+56.358653805" lastFinishedPulling="2025-07-09 13:03:04.237605609 +0000 UTC m=+67.501371123" observedRunningTime="2025-07-09 13:03:04.367872029 +0000 UTC m=+67.631637548" watchObservedRunningTime="2025-07-09 13:03:05.406239389 +0000 UTC m=+68.670004910" Jul 9 13:03:06.397133 kubelet[2942]: I0709 13:03:06.397106 2942 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 9 13:03:12.983534 containerd[1624]: time="2025-07-09T13:03:12.983415594Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:03:12.984861 containerd[1624]: time="2025-07-09T13:03:12.984205692Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.2: active requests=0, bytes read=51276688" Jul 9 13:03:12.984982 containerd[1624]: time="2025-07-09T13:03:12.984971942Z" level=info msg="ImageCreate event name:\"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:03:12.986590 containerd[1624]: time="2025-07-09T13:03:12.986517371Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:03:12.987258 containerd[1624]: time="2025-07-09T13:03:12.987242896Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" with image id \"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\", size \"52769359\" in 7.717703527s" Jul 9 13:03:12.987301 containerd[1624]: time="2025-07-09T13:03:12.987261882Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" returns image reference \"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\"" Jul 9 13:03:13.019050 containerd[1624]: time="2025-07-09T13:03:13.018816096Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\"" Jul 9 13:03:13.099416 containerd[1624]: time="2025-07-09T13:03:13.099391231Z" level=info msg="CreateContainer within sandbox \"1bd05729b37a993e1c7f1b72cd7ae537a28f0c63bbe2eae9d9a8a2b166980793\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Jul 9 13:03:13.105855 containerd[1624]: time="2025-07-09T13:03:13.105824168Z" level=info msg="Container ba03167d8f6110ca9e26341c3bc304c76c5c0fd37890dc30bb7992084a7bc569: CDI devices from CRI Config.CDIDevices: []" Jul 9 13:03:13.108443 systemd[1]: 
var-lib-containerd-tmpmounts-containerd\x2dmount3804599883.mount: Deactivated successfully. Jul 9 13:03:13.115013 containerd[1624]: time="2025-07-09T13:03:13.114981012Z" level=info msg="CreateContainer within sandbox \"1bd05729b37a993e1c7f1b72cd7ae537a28f0c63bbe2eae9d9a8a2b166980793\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"ba03167d8f6110ca9e26341c3bc304c76c5c0fd37890dc30bb7992084a7bc569\"" Jul 9 13:03:13.119339 containerd[1624]: time="2025-07-09T13:03:13.119318187Z" level=info msg="StartContainer for \"ba03167d8f6110ca9e26341c3bc304c76c5c0fd37890dc30bb7992084a7bc569\"" Jul 9 13:03:13.120728 containerd[1624]: time="2025-07-09T13:03:13.120702313Z" level=info msg="connecting to shim ba03167d8f6110ca9e26341c3bc304c76c5c0fd37890dc30bb7992084a7bc569" address="unix:///run/containerd/s/9ad0037420dbce4fc11e8f71cb090e520cc10ddd671b0e351f58ee79eb35fd19" protocol=ttrpc version=3 Jul 9 13:03:13.147212 systemd[1]: Started cri-containerd-ba03167d8f6110ca9e26341c3bc304c76c5c0fd37890dc30bb7992084a7bc569.scope - libcontainer container ba03167d8f6110ca9e26341c3bc304c76c5c0fd37890dc30bb7992084a7bc569. Jul 9 13:03:13.186938 containerd[1624]: time="2025-07-09T13:03:13.186913716Z" level=info msg="StartContainer for \"ba03167d8f6110ca9e26341c3bc304c76c5c0fd37890dc30bb7992084a7bc569\" returns successfully" Jul 9 13:03:13.468767 kubelet[2942]: I0709 13:03:13.468226 2942 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-697d57c56-9g4vj" podStartSLOduration=47.495452197 podStartE2EDuration="59.468211667s" podCreationTimestamp="2025-07-09 13:02:14 +0000 UTC" firstStartedPulling="2025-07-09 13:02:53.296158971 +0000 UTC m=+56.559924482" lastFinishedPulling="2025-07-09 13:03:05.268918438 +0000 UTC m=+68.532683952" observedRunningTime="2025-07-09 13:03:05.408213036 +0000 UTC m=+68.671978555" watchObservedRunningTime="2025-07-09 13:03:13.468211667 +0000 UTC m=+76.731977186" Jul 9 13:03:13.468767 kubelet[2942]: I0709 13:03:13.468406 2942 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-6b449fff5d-5tg2l" podStartSLOduration=35.869184104 podStartE2EDuration="55.468401978s" podCreationTimestamp="2025-07-09 13:02:18 +0000 UTC" firstStartedPulling="2025-07-09 13:02:53.403887089 +0000 UTC m=+56.667652600" lastFinishedPulling="2025-07-09 13:03:13.003104964 +0000 UTC m=+76.266870474" observedRunningTime="2025-07-09 13:03:13.459556805 +0000 UTC m=+76.723322319" watchObservedRunningTime="2025-07-09 13:03:13.468401978 +0000 UTC m=+76.732167491" Jul 9 13:03:13.582731 containerd[1624]: time="2025-07-09T13:03:13.582688086Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ba03167d8f6110ca9e26341c3bc304c76c5c0fd37890dc30bb7992084a7bc569\" id:\"943a0e5e4bdb301300a2f24d39fdbdacf14e73d5018d3350743ac9ec0fbea176\" pid:5362 exited_at:{seconds:1752066193 nanos:573084360}" Jul 9 13:03:14.845183 containerd[1624]: time="2025-07-09T13:03:14.845095500Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:03:14.845649 containerd[1624]: time="2025-07-09T13:03:14.845481525Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2: active requests=0, bytes read=14703784" Jul 9 13:03:14.846222 containerd[1624]: time="2025-07-09T13:03:14.846206338Z" level=info msg="ImageCreate event 
name:\"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:03:14.848273 containerd[1624]: time="2025-07-09T13:03:14.848238726Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:03:14.848572 containerd[1624]: time="2025-07-09T13:03:14.848461535Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" with image id \"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\", size \"16196439\" in 1.829620572s" Jul 9 13:03:14.848572 containerd[1624]: time="2025-07-09T13:03:14.848481744Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" returns image reference \"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\"" Jul 9 13:03:14.849859 containerd[1624]: time="2025-07-09T13:03:14.849846420Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\"" Jul 9 13:03:14.858271 containerd[1624]: time="2025-07-09T13:03:14.858233245Z" level=info msg="CreateContainer within sandbox \"250da739b1c00333fdb2894d229749ae4aed8b4def92a0f69826accf7dd61740\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Jul 9 13:03:14.871685 containerd[1624]: time="2025-07-09T13:03:14.871171223Z" level=info msg="Container 44d7b9a28078c1c59e0bf8ccc4c8d6a60f79876de07412f996b2e1e66cf545d8: CDI devices from CRI Config.CDIDevices: []" Jul 9 13:03:14.889660 containerd[1624]: time="2025-07-09T13:03:14.889631817Z" level=info msg="CreateContainer within sandbox \"250da739b1c00333fdb2894d229749ae4aed8b4def92a0f69826accf7dd61740\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"44d7b9a28078c1c59e0bf8ccc4c8d6a60f79876de07412f996b2e1e66cf545d8\"" Jul 9 13:03:14.890692 containerd[1624]: time="2025-07-09T13:03:14.890034716Z" level=info msg="StartContainer for \"44d7b9a28078c1c59e0bf8ccc4c8d6a60f79876de07412f996b2e1e66cf545d8\"" Jul 9 13:03:14.891617 containerd[1624]: time="2025-07-09T13:03:14.891064497Z" level=info msg="connecting to shim 44d7b9a28078c1c59e0bf8ccc4c8d6a60f79876de07412f996b2e1e66cf545d8" address="unix:///run/containerd/s/3a910947b21a77117dd186f194ead8f8e1e8626f6c4e9f079e8decd152c71caa" protocol=ttrpc version=3 Jul 9 13:03:14.908850 systemd[1]: Started cri-containerd-44d7b9a28078c1c59e0bf8ccc4c8d6a60f79876de07412f996b2e1e66cf545d8.scope - libcontainer container 44d7b9a28078c1c59e0bf8ccc4c8d6a60f79876de07412f996b2e1e66cf545d8. 
Jul 9 13:03:14.944166 containerd[1624]: time="2025-07-09T13:03:14.944130290Z" level=info msg="StartContainer for \"44d7b9a28078c1c59e0bf8ccc4c8d6a60f79876de07412f996b2e1e66cf545d8\" returns successfully" Jul 9 13:03:16.529415 kubelet[2942]: I0709 13:03:16.528982 2942 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-nhvjr" podStartSLOduration=35.30120852 podStartE2EDuration="58.481565837s" podCreationTimestamp="2025-07-09 13:02:18 +0000 UTC" firstStartedPulling="2025-07-09 13:02:51.669169293 +0000 UTC m=+54.932934803" lastFinishedPulling="2025-07-09 13:03:14.849526607 +0000 UTC m=+78.113292120" observedRunningTime="2025-07-09 13:03:16.444260303 +0000 UTC m=+79.708025830" watchObservedRunningTime="2025-07-09 13:03:16.481565837 +0000 UTC m=+79.745331357" Jul 9 13:03:16.657466 kubelet[2942]: I0709 13:03:16.654833 2942 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Jul 9 13:03:16.657466 kubelet[2942]: I0709 13:03:16.657413 2942 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Jul 9 13:03:17.162403 systemd[1]: Started sshd@7-139.178.70.105:22-139.178.68.195:34886.service - OpenSSH per-connection server daemon (139.178.68.195:34886). Jul 9 13:03:17.274151 sshd[5414]: Accepted publickey for core from 139.178.68.195 port 34886 ssh2: RSA SHA256:pHehh7tc90QOyf1uGohWVF4tJIie1SMOFA2c8G1DmZI Jul 9 13:03:17.277075 sshd-session[5414]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 13:03:17.283631 systemd-logind[1599]: New session 10 of user core. Jul 9 13:03:17.288823 systemd[1]: Started session-10.scope - Session 10 of User core. Jul 9 13:03:18.177590 sshd[5417]: Connection closed by 139.178.68.195 port 34886 Jul 9 13:03:18.178078 sshd-session[5414]: pam_unix(sshd:session): session closed for user core Jul 9 13:03:18.191263 systemd[1]: sshd@7-139.178.70.105:22-139.178.68.195:34886.service: Deactivated successfully. Jul 9 13:03:18.194115 systemd[1]: session-10.scope: Deactivated successfully. Jul 9 13:03:18.195467 systemd-logind[1599]: Session 10 logged out. Waiting for processes to exit. Jul 9 13:03:18.196614 systemd-logind[1599]: Removed session 10. Jul 9 13:03:19.962339 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount567962266.mount: Deactivated successfully. 
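The pod_startup_latency_tracker figures above are internally consistent: podStartE2EDuration matches watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration matches that value minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A minimal Python check against the csi-node-driver-nhvjr entry, with the timestamps copied from the log and expressed as seconds after 13:02:00 UTC so the nanosecond fractions stay exact (the SLO formula is inferred from these numbers, not quoted from kubelet source):

from decimal import Decimal

# Timestamps from the csi-node-driver-nhvjr entry, as seconds after 13:02:00 UTC.
created        = Decimal("18")             # podCreationTimestamp 13:02:18
first_pull     = Decimal("51.669169293")   # firstStartedPulling  13:02:51.669169293
last_pull      = Decimal("74.849526607")   # lastFinishedPulling  13:03:14.849526607
observed_start = Decimal("76.481565837")   # watchObservedRunningTime 13:03:16.481565837

e2e = observed_start - created             # expected podStartE2EDuration
slo = e2e - (last_pull - first_pull)       # expected podStartSLOduration (image pull excluded)

print(e2e)  # 58.481565837 -> matches podStartE2EDuration="58.481565837s"
print(slo)  # 35.301208523 -> matches podStartSLOduration=35.30120852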
Jul 9 13:03:20.447962 containerd[1624]: time="2025-07-09T13:03:20.447807513Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:03:20.476243 containerd[1624]: time="2025-07-09T13:03:20.460870471Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.2: active requests=0, bytes read=33083477" Jul 9 13:03:20.481458 containerd[1624]: time="2025-07-09T13:03:20.481418702Z" level=info msg="ImageCreate event name:\"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:03:20.488605 containerd[1624]: time="2025-07-09T13:03:20.488581972Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 13:03:20.496457 containerd[1624]: time="2025-07-09T13:03:20.488920971Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" with image id \"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\", size \"33083307\" in 5.638846113s" Jul 9 13:03:20.496457 containerd[1624]: time="2025-07-09T13:03:20.488939460Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" returns image reference \"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\"" Jul 9 13:03:20.715008 containerd[1624]: time="2025-07-09T13:03:20.714966162Z" level=info msg="CreateContainer within sandbox \"e9e6499833712897826778f96af0b4fd5c1b77b97bac2127f1a9b5a21428ee71\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Jul 9 13:03:20.732090 containerd[1624]: time="2025-07-09T13:03:20.731800385Z" level=info msg="Container 9db829820d253fde6e90fd515db672edcd08c6da27cc273946834dcf6041a89d: CDI devices from CRI Config.CDIDevices: []" Jul 9 13:03:20.859000 containerd[1624]: time="2025-07-09T13:03:20.858977282Z" level=info msg="CreateContainer within sandbox \"e9e6499833712897826778f96af0b4fd5c1b77b97bac2127f1a9b5a21428ee71\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"9db829820d253fde6e90fd515db672edcd08c6da27cc273946834dcf6041a89d\"" Jul 9 13:03:20.859622 containerd[1624]: time="2025-07-09T13:03:20.859446675Z" level=info msg="StartContainer for \"9db829820d253fde6e90fd515db672edcd08c6da27cc273946834dcf6041a89d\"" Jul 9 13:03:20.865120 containerd[1624]: time="2025-07-09T13:03:20.865092612Z" level=info msg="connecting to shim 9db829820d253fde6e90fd515db672edcd08c6da27cc273946834dcf6041a89d" address="unix:///run/containerd/s/9fcb92d4052e6c5a8523efae7a892a4dc08b540cbb6b3423838347dc165b0afd" protocol=ttrpc version=3 Jul 9 13:03:20.995828 systemd[1]: Started cri-containerd-9db829820d253fde6e90fd515db672edcd08c6da27cc273946834dcf6041a89d.scope - libcontainer container 9db829820d253fde6e90fd515db672edcd08c6da27cc273946834dcf6041a89d. 
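The pull statistics above also give a rough effective download rate for the whisker-backend image: roughly 33 MB ("bytes read=33083477") over 5.638846113s. A quick estimate follows; note the "size" fields in the Pulled message describe image content rather than bytes on the wire, so this is only an approximation of throughput:

bytes_read   = 33_083_477      # from "stop pulling image ... bytes read=33083477"
pull_seconds = 5.638846113     # from "... in 5.638846113s"

rate = bytes_read / pull_seconds
print(f"{rate / (1 << 20):.1f} MiB/s")   # ~5.6 MiB/s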
Jul 9 13:03:21.061826 containerd[1624]: time="2025-07-09T13:03:21.061799714Z" level=info msg="StartContainer for \"9db829820d253fde6e90fd515db672edcd08c6da27cc273946834dcf6041a89d\" returns successfully" Jul 9 13:03:21.784633 kubelet[2942]: I0709 13:03:21.783285 2942 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-55c7b9bd44-qgllz" podStartSLOduration=1.977545261 podStartE2EDuration="30.771954454s" podCreationTimestamp="2025-07-09 13:02:51 +0000 UTC" firstStartedPulling="2025-07-09 13:02:51.843884728 +0000 UTC m=+55.107650238" lastFinishedPulling="2025-07-09 13:03:20.638293922 +0000 UTC m=+83.902059431" observedRunningTime="2025-07-09 13:03:21.767826771 +0000 UTC m=+85.031592289" watchObservedRunningTime="2025-07-09 13:03:21.771954454 +0000 UTC m=+85.035719961" Jul 9 13:03:22.971989 containerd[1624]: time="2025-07-09T13:03:22.971947832Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e77ceae0c93cfcb1c385d466afd051672a8ee5655ed0c189ca239303d947007f\" id:\"21fc0bb8a608736d9afe574d796ddcc037ff88d1708d8957aa8ee732129fdc13\" pid:5485 exited_at:{seconds:1752066202 nanos:955876691}" Jul 9 13:03:23.235877 systemd[1]: Started sshd@8-139.178.70.105:22-139.178.68.195:34826.service - OpenSSH per-connection server daemon (139.178.68.195:34826). Jul 9 13:03:23.444874 sshd[5508]: Accepted publickey for core from 139.178.68.195 port 34826 ssh2: RSA SHA256:pHehh7tc90QOyf1uGohWVF4tJIie1SMOFA2c8G1DmZI Jul 9 13:03:23.446503 sshd-session[5508]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 13:03:23.450967 systemd-logind[1599]: New session 11 of user core. Jul 9 13:03:23.457981 systemd[1]: Started session-11.scope - Session 11 of User core. Jul 9 13:03:24.649413 sshd[5511]: Connection closed by 139.178.68.195 port 34826 Jul 9 13:03:24.648790 sshd-session[5508]: pam_unix(sshd:session): session closed for user core Jul 9 13:03:24.651543 systemd[1]: sshd@8-139.178.70.105:22-139.178.68.195:34826.service: Deactivated successfully. Jul 9 13:03:24.652971 systemd[1]: session-11.scope: Deactivated successfully. Jul 9 13:03:24.653971 systemd-logind[1599]: Session 11 logged out. Waiting for processes to exit. Jul 9 13:03:24.655516 systemd-logind[1599]: Removed session 11. Jul 9 13:03:28.032075 kubelet[2942]: I0709 13:03:28.031920 2942 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 9 13:03:28.968200 kubelet[2942]: I0709 13:03:28.967945 2942 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 9 13:03:29.556229 containerd[1624]: time="2025-07-09T13:03:29.556178874Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ba03167d8f6110ca9e26341c3bc304c76c5c0fd37890dc30bb7992084a7bc569\" id:\"82a0b3964b3bbd9fdfbe7a469c35defcbd743b7be834d3d110fb169267d5c6c9\" pid:5541 exited_at:{seconds:1752066209 nanos:555824697}" Jul 9 13:03:29.658846 systemd[1]: Started sshd@9-139.178.70.105:22-139.178.68.195:39190.service - OpenSSH per-connection server daemon (139.178.68.195:39190). Jul 9 13:03:29.742920 sshd[5551]: Accepted publickey for core from 139.178.68.195 port 39190 ssh2: RSA SHA256:pHehh7tc90QOyf1uGohWVF4tJIie1SMOFA2c8G1DmZI Jul 9 13:03:29.744407 sshd-session[5551]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 13:03:29.747486 systemd-logind[1599]: New session 12 of user core. Jul 9 13:03:29.754761 systemd[1]: Started session-12.scope - Session 12 of User core. 
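The exited_at fields in the TaskExit events are plain Unix epoch seconds plus nanoseconds, and they line up with the journal timestamps. Converting the two values from the entries above:

from datetime import datetime, timezone

# (seconds, nanos) pairs copied from the two TaskExit entries above.
for secs, nanos in [(1752066202, 955876691), (1752066209, 555824697)]:
    ts = datetime.fromtimestamp(secs, tz=timezone.utc).replace(microsecond=nanos // 1000)
    print(ts.isoformat())
# 2025-07-09T13:03:22.955876+00:00 -> the 13:03:22 TaskExit for e77ceae0...
# 2025-07-09T13:03:29.555824+00:00 -> the 13:03:29 TaskExit for ba03167d...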
Jul 9 13:03:30.078384 sshd[5554]: Connection closed by 139.178.68.195 port 39190 Jul 9 13:03:30.080084 sshd-session[5551]: pam_unix(sshd:session): session closed for user core Jul 9 13:03:30.082898 systemd-logind[1599]: Session 12 logged out. Waiting for processes to exit. Jul 9 13:03:30.082986 systemd[1]: sshd@9-139.178.70.105:22-139.178.68.195:39190.service: Deactivated successfully. Jul 9 13:03:30.084231 systemd[1]: session-12.scope: Deactivated successfully. Jul 9 13:03:30.085458 systemd-logind[1599]: Removed session 12. Jul 9 13:03:33.719427 containerd[1624]: time="2025-07-09T13:03:33.718325682Z" level=info msg="TaskExit event in podsandbox handler container_id:\"91c43d9828a72b315bc23df3650bc48caa348d059d9504578414d166525eec7f\" id:\"01ee09aeaccfd5e1d95a447a554875e0a2e6b997e77e7a344a7c87a5ea5f733a\" pid:5584 exited_at:{seconds:1752066213 nanos:717797687}" Jul 9 13:03:35.116263 systemd[1]: Started sshd@10-139.178.70.105:22-139.178.68.195:39204.service - OpenSSH per-connection server daemon (139.178.68.195:39204). Jul 9 13:03:35.279058 sshd[5600]: Accepted publickey for core from 139.178.68.195 port 39204 ssh2: RSA SHA256:pHehh7tc90QOyf1uGohWVF4tJIie1SMOFA2c8G1DmZI Jul 9 13:03:35.285198 sshd-session[5600]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 13:03:35.289149 systemd-logind[1599]: New session 13 of user core. Jul 9 13:03:35.292768 systemd[1]: Started session-13.scope - Session 13 of User core. Jul 9 13:03:35.812688 sshd[5605]: Connection closed by 139.178.68.195 port 39204 Jul 9 13:03:35.813044 sshd-session[5600]: pam_unix(sshd:session): session closed for user core Jul 9 13:03:35.821940 systemd[1]: sshd@10-139.178.70.105:22-139.178.68.195:39204.service: Deactivated successfully. Jul 9 13:03:35.822968 systemd[1]: session-13.scope: Deactivated successfully. Jul 9 13:03:35.823819 systemd-logind[1599]: Session 13 logged out. Waiting for processes to exit. Jul 9 13:03:35.825061 systemd[1]: Started sshd@11-139.178.70.105:22-139.178.68.195:39218.service - OpenSSH per-connection server daemon (139.178.68.195:39218). Jul 9 13:03:35.826593 systemd-logind[1599]: Removed session 13. Jul 9 13:03:35.888294 sshd[5619]: Accepted publickey for core from 139.178.68.195 port 39218 ssh2: RSA SHA256:pHehh7tc90QOyf1uGohWVF4tJIie1SMOFA2c8G1DmZI Jul 9 13:03:35.889176 sshd-session[5619]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 13:03:35.892430 systemd-logind[1599]: New session 14 of user core. Jul 9 13:03:35.901791 systemd[1]: Started session-14.scope - Session 14 of User core. Jul 9 13:03:36.035550 sshd[5622]: Connection closed by 139.178.68.195 port 39218 Jul 9 13:03:36.036725 sshd-session[5619]: pam_unix(sshd:session): session closed for user core Jul 9 13:03:36.044559 systemd[1]: sshd@11-139.178.70.105:22-139.178.68.195:39218.service: Deactivated successfully. Jul 9 13:03:36.046324 systemd[1]: session-14.scope: Deactivated successfully. Jul 9 13:03:36.048130 systemd-logind[1599]: Session 14 logged out. Waiting for processes to exit. Jul 9 13:03:36.051415 systemd[1]: Started sshd@12-139.178.70.105:22-139.178.68.195:39234.service - OpenSSH per-connection server daemon (139.178.68.195:39234). Jul 9 13:03:36.053767 systemd-logind[1599]: Removed session 14. 
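The SSH sessions in this stretch of the log open and close within a second or two of each other, which looks like periodic automated access rather than interactive use (an inference from the timing only). Measuring the gap between the pam "session opened" and "session closed" entries for sessions 10 and 11 above:

from datetime import datetime

fmt = "%H:%M:%S.%f"
# pam_unix session opened/closed timestamps copied from the session 10 and 11 entries.
sessions = {
    10: ("13:03:17.277075", "13:03:18.178078"),
    11: ("13:03:23.446503", "13:03:24.648790"),
}
for sid, (opened, closed) in sessions.items():
    secs = (datetime.strptime(closed, fmt) - datetime.strptime(opened, fmt)).total_seconds()
    print(f"session {sid}: {secs:.3f}s")
# session 10: 0.901s
# session 11: 1.202s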
Jul 9 13:03:36.101167 sshd[5632]: Accepted publickey for core from 139.178.68.195 port 39234 ssh2: RSA SHA256:pHehh7tc90QOyf1uGohWVF4tJIie1SMOFA2c8G1DmZI Jul 9 13:03:36.102667 sshd-session[5632]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 13:03:36.108720 systemd-logind[1599]: New session 15 of user core. Jul 9 13:03:36.115827 systemd[1]: Started session-15.scope - Session 15 of User core. Jul 9 13:03:36.214800 sshd[5635]: Connection closed by 139.178.68.195 port 39234 Jul 9 13:03:36.215170 sshd-session[5632]: pam_unix(sshd:session): session closed for user core Jul 9 13:03:36.217457 systemd[1]: sshd@12-139.178.70.105:22-139.178.68.195:39234.service: Deactivated successfully. Jul 9 13:03:36.218948 systemd[1]: session-15.scope: Deactivated successfully. Jul 9 13:03:36.219598 systemd-logind[1599]: Session 15 logged out. Waiting for processes to exit. Jul 9 13:03:36.220627 systemd-logind[1599]: Removed session 15. Jul 9 13:03:41.230547 systemd[1]: Started sshd@13-139.178.70.105:22-139.178.68.195:45690.service - OpenSSH per-connection server daemon (139.178.68.195:45690). Jul 9 13:03:41.386000 sshd[5646]: Accepted publickey for core from 139.178.68.195 port 45690 ssh2: RSA SHA256:pHehh7tc90QOyf1uGohWVF4tJIie1SMOFA2c8G1DmZI Jul 9 13:03:41.386815 sshd-session[5646]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 13:03:41.389331 systemd-logind[1599]: New session 16 of user core. Jul 9 13:03:41.397790 systemd[1]: Started session-16.scope - Session 16 of User core. Jul 9 13:03:41.746080 sshd[5649]: Connection closed by 139.178.68.195 port 45690 Jul 9 13:03:41.746419 sshd-session[5646]: pam_unix(sshd:session): session closed for user core Jul 9 13:03:41.748394 systemd-logind[1599]: Session 16 logged out. Waiting for processes to exit. Jul 9 13:03:41.748548 systemd[1]: sshd@13-139.178.70.105:22-139.178.68.195:45690.service: Deactivated successfully. Jul 9 13:03:41.749768 systemd[1]: session-16.scope: Deactivated successfully. Jul 9 13:03:41.750958 systemd-logind[1599]: Removed session 16. Jul 9 13:03:43.547130 containerd[1624]: time="2025-07-09T13:03:43.547076448Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ba03167d8f6110ca9e26341c3bc304c76c5c0fd37890dc30bb7992084a7bc569\" id:\"1919bd5735e57a6dfb95060d42b5f9f812ad252f7cefeefc7d0ddb9475b69663\" pid:5671 exited_at:{seconds:1752066223 nanos:546810416}" Jul 9 13:03:46.756632 systemd[1]: Started sshd@14-139.178.70.105:22-139.178.68.195:45700.service - OpenSSH per-connection server daemon (139.178.68.195:45700). Jul 9 13:03:47.260217 sshd[5703]: Accepted publickey for core from 139.178.68.195 port 45700 ssh2: RSA SHA256:pHehh7tc90QOyf1uGohWVF4tJIie1SMOFA2c8G1DmZI Jul 9 13:03:47.274358 sshd-session[5703]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 13:03:47.277464 systemd-logind[1599]: New session 17 of user core. Jul 9 13:03:47.284882 systemd[1]: Started session-17.scope - Session 17 of User core. 
Jul 9 13:03:49.215471 containerd[1624]: time="2025-07-09T13:03:49.215436004Z" level=info msg="TaskExit event in podsandbox handler container_id:\"91c43d9828a72b315bc23df3650bc48caa348d059d9504578414d166525eec7f\" id:\"175a398df93218cd7da5724f90e4ce80d2e3762968127eba53f99c3707ae06d9\" pid:5693 exited_at:{seconds:1752066229 nanos:200369731}" Jul 9 13:03:49.894558 sshd[5706]: Connection closed by 139.178.68.195 port 45700 Jul 9 13:03:49.956834 sshd-session[5703]: pam_unix(sshd:session): session closed for user core Jul 9 13:03:49.964398 systemd[1]: sshd@14-139.178.70.105:22-139.178.68.195:45700.service: Deactivated successfully. Jul 9 13:03:49.967037 systemd[1]: session-17.scope: Deactivated successfully. Jul 9 13:03:49.969421 systemd-logind[1599]: Session 17 logged out. Waiting for processes to exit. Jul 9 13:03:49.970647 systemd-logind[1599]: Removed session 17. Jul 9 13:03:52.555549 containerd[1624]: time="2025-07-09T13:03:52.555509660Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e77ceae0c93cfcb1c385d466afd051672a8ee5655ed0c189ca239303d947007f\" id:\"779656da2a50267551ab1c38339b9c6cafeba20241650dfa97492625d87c3cfb\" pid:5741 exit_status:1 exited_at:{seconds:1752066232 nanos:555184798}" Jul 9 13:03:54.929945 systemd[1]: Started sshd@15-139.178.70.105:22-139.178.68.195:51428.service - OpenSSH per-connection server daemon (139.178.68.195:51428). Jul 9 13:03:55.034313 sshd[5753]: Accepted publickey for core from 139.178.68.195 port 51428 ssh2: RSA SHA256:pHehh7tc90QOyf1uGohWVF4tJIie1SMOFA2c8G1DmZI Jul 9 13:03:55.035735 sshd-session[5753]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 13:03:55.039717 systemd-logind[1599]: New session 18 of user core. Jul 9 13:03:55.048804 systemd[1]: Started session-18.scope - Session 18 of User core. Jul 9 13:03:55.468958 sshd[5756]: Connection closed by 139.178.68.195 port 51428 Jul 9 13:03:55.468464 sshd-session[5753]: pam_unix(sshd:session): session closed for user core Jul 9 13:03:55.470454 systemd[1]: sshd@15-139.178.70.105:22-139.178.68.195:51428.service: Deactivated successfully. Jul 9 13:03:55.471605 systemd[1]: session-18.scope: Deactivated successfully. Jul 9 13:03:55.472114 systemd-logind[1599]: Session 18 logged out. Waiting for processes to exit. Jul 9 13:03:55.473416 systemd-logind[1599]: Removed session 18. Jul 9 13:04:00.480590 systemd[1]: Started sshd@16-139.178.70.105:22-139.178.68.195:45286.service - OpenSSH per-connection server daemon (139.178.68.195:45286). Jul 9 13:04:00.542423 sshd[5770]: Accepted publickey for core from 139.178.68.195 port 45286 ssh2: RSA SHA256:pHehh7tc90QOyf1uGohWVF4tJIie1SMOFA2c8G1DmZI Jul 9 13:04:00.543449 sshd-session[5770]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 13:04:00.546711 systemd-logind[1599]: New session 19 of user core. Jul 9 13:04:00.549776 systemd[1]: Started session-19.scope - Session 19 of User core. Jul 9 13:04:00.762391 sshd[5773]: Connection closed by 139.178.68.195 port 45286 Jul 9 13:04:00.769423 systemd[1]: sshd@16-139.178.70.105:22-139.178.68.195:45286.service: Deactivated successfully. Jul 9 13:04:00.763511 sshd-session[5770]: pam_unix(sshd:session): session closed for user core Jul 9 13:04:00.770933 systemd[1]: session-19.scope: Deactivated successfully. Jul 9 13:04:00.771985 systemd-logind[1599]: Session 19 logged out. Waiting for processes to exit. 
Jul 9 13:04:00.774921 systemd[1]: Started sshd@17-139.178.70.105:22-139.178.68.195:45294.service - OpenSSH per-connection server daemon (139.178.68.195:45294). Jul 9 13:04:00.775394 systemd-logind[1599]: Removed session 19. Jul 9 13:04:00.828966 sshd[5785]: Accepted publickey for core from 139.178.68.195 port 45294 ssh2: RSA SHA256:pHehh7tc90QOyf1uGohWVF4tJIie1SMOFA2c8G1DmZI Jul 9 13:04:00.829988 sshd-session[5785]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 13:04:00.833492 systemd-logind[1599]: New session 20 of user core. Jul 9 13:04:00.845843 systemd[1]: Started session-20.scope - Session 20 of User core. Jul 9 13:04:01.712337 sshd[5788]: Connection closed by 139.178.68.195 port 45294 Jul 9 13:04:01.713755 sshd-session[5785]: pam_unix(sshd:session): session closed for user core Jul 9 13:04:01.720774 systemd[1]: sshd@17-139.178.70.105:22-139.178.68.195:45294.service: Deactivated successfully. Jul 9 13:04:01.722538 systemd[1]: session-20.scope: Deactivated successfully. Jul 9 13:04:01.723534 systemd-logind[1599]: Session 20 logged out. Waiting for processes to exit. Jul 9 13:04:01.725360 systemd[1]: Started sshd@18-139.178.70.105:22-139.178.68.195:45308.service - OpenSSH per-connection server daemon (139.178.68.195:45308). Jul 9 13:04:01.726267 systemd-logind[1599]: Removed session 20. Jul 9 13:04:01.806319 sshd[5798]: Accepted publickey for core from 139.178.68.195 port 45308 ssh2: RSA SHA256:pHehh7tc90QOyf1uGohWVF4tJIie1SMOFA2c8G1DmZI Jul 9 13:04:01.807204 sshd-session[5798]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 13:04:01.810658 systemd-logind[1599]: New session 21 of user core. Jul 9 13:04:01.814799 systemd[1]: Started session-21.scope - Session 21 of User core. Jul 9 13:04:03.446757 sshd[5801]: Connection closed by 139.178.68.195 port 45308 Jul 9 13:04:03.455309 systemd[1]: Started sshd@19-139.178.70.105:22-139.178.68.195:45320.service - OpenSSH per-connection server daemon (139.178.68.195:45320). Jul 9 13:04:03.483628 sshd-session[5798]: pam_unix(sshd:session): session closed for user core Jul 9 13:04:03.498711 systemd[1]: sshd@18-139.178.70.105:22-139.178.68.195:45308.service: Deactivated successfully. Jul 9 13:04:03.501549 systemd[1]: session-21.scope: Deactivated successfully. Jul 9 13:04:03.504494 systemd-logind[1599]: Session 21 logged out. Waiting for processes to exit. Jul 9 13:04:03.506828 systemd-logind[1599]: Removed session 21. Jul 9 13:04:03.609779 sshd[5815]: Accepted publickey for core from 139.178.68.195 port 45320 ssh2: RSA SHA256:pHehh7tc90QOyf1uGohWVF4tJIie1SMOFA2c8G1DmZI Jul 9 13:04:03.614406 sshd-session[5815]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 13:04:03.620126 systemd-logind[1599]: New session 22 of user core. Jul 9 13:04:03.627251 systemd[1]: Started session-22.scope - Session 22 of User core. 
Jul 9 13:04:04.219345 containerd[1624]: time="2025-07-09T13:04:04.219300820Z" level=info msg="TaskExit event in podsandbox handler container_id:\"91c43d9828a72b315bc23df3650bc48caa348d059d9504578414d166525eec7f\" id:\"7977bbd4efd82d43833e16bfd66df856dcd1d34e513679c84570b1ed35e8d9d8\" pid:5833 exited_at:{seconds:1752066244 nanos:219071398}" Jul 9 13:04:05.786505 sshd[5844]: Connection closed by 139.178.68.195 port 45320 Jul 9 13:04:05.786656 sshd-session[5815]: pam_unix(sshd:session): session closed for user core Jul 9 13:04:05.794195 systemd[1]: Started sshd@20-139.178.70.105:22-139.178.68.195:45328.service - OpenSSH per-connection server daemon (139.178.68.195:45328). Jul 9 13:04:05.796111 systemd[1]: sshd@19-139.178.70.105:22-139.178.68.195:45320.service: Deactivated successfully. Jul 9 13:04:05.797555 systemd[1]: session-22.scope: Deactivated successfully. Jul 9 13:04:05.798105 systemd-logind[1599]: Session 22 logged out. Waiting for processes to exit. Jul 9 13:04:05.798992 systemd-logind[1599]: Removed session 22. Jul 9 13:04:06.103655 sshd[5860]: Accepted publickey for core from 139.178.68.195 port 45328 ssh2: RSA SHA256:pHehh7tc90QOyf1uGohWVF4tJIie1SMOFA2c8G1DmZI Jul 9 13:04:06.104856 sshd-session[5860]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 13:04:06.109356 systemd-logind[1599]: New session 23 of user core. Jul 9 13:04:06.118866 systemd[1]: Started session-23.scope - Session 23 of User core. Jul 9 13:04:06.340455 sshd[5866]: Connection closed by 139.178.68.195 port 45328 Jul 9 13:04:06.340883 sshd-session[5860]: pam_unix(sshd:session): session closed for user core Jul 9 13:04:06.343820 systemd-logind[1599]: Session 23 logged out. Waiting for processes to exit. Jul 9 13:04:06.343986 systemd[1]: sshd@20-139.178.70.105:22-139.178.68.195:45328.service: Deactivated successfully. Jul 9 13:04:06.345803 systemd[1]: session-23.scope: Deactivated successfully. Jul 9 13:04:06.347863 systemd-logind[1599]: Removed session 23. Jul 9 13:04:11.355000 systemd[1]: Started sshd@21-139.178.70.105:22-139.178.68.195:34278.service - OpenSSH per-connection server daemon (139.178.68.195:34278). Jul 9 13:04:11.604865 sshd[5879]: Accepted publickey for core from 139.178.68.195 port 34278 ssh2: RSA SHA256:pHehh7tc90QOyf1uGohWVF4tJIie1SMOFA2c8G1DmZI Jul 9 13:04:11.605864 sshd-session[5879]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 13:04:11.609499 systemd-logind[1599]: New session 24 of user core. Jul 9 13:04:11.614803 systemd[1]: Started session-24.scope - Session 24 of User core. Jul 9 13:04:11.924077 sshd[5882]: Connection closed by 139.178.68.195 port 34278 Jul 9 13:04:11.930259 sshd-session[5879]: pam_unix(sshd:session): session closed for user core Jul 9 13:04:11.933596 systemd[1]: sshd@21-139.178.70.105:22-139.178.68.195:34278.service: Deactivated successfully. Jul 9 13:04:11.934957 systemd[1]: session-24.scope: Deactivated successfully. Jul 9 13:04:11.935709 systemd-logind[1599]: Session 24 logged out. Waiting for processes to exit. Jul 9 13:04:11.936417 systemd-logind[1599]: Removed session 24. 
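The per-connection unit names such as sshd@21-139.178.70.105:22-139.178.68.195:34278.service appear to encode an instance counter plus the local and remote endpoints of each accepted connection (the ports match the adjacent "Accepted publickey" lines). A small parser written against the names seen in this log (the regex is my own, not part of any sshd or systemd API):

import re

unit = "sshd@21-139.178.70.105:22-139.178.68.195:34278.service"  # copied from the entry above
m = re.fullmatch(r"sshd@(\d+)-(.+):(\d+)-(.+):(\d+)\.service", unit)
instance, local_ip, local_port, remote_ip, remote_port = m.groups()
print(f"#{instance}: {remote_ip}:{remote_port} -> {local_ip}:{local_port}")
# #21: 139.178.68.195:34278 -> 139.178.70.105:22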
Jul 9 13:04:13.559556 containerd[1624]: time="2025-07-09T13:04:13.559520149Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ba03167d8f6110ca9e26341c3bc304c76c5c0fd37890dc30bb7992084a7bc569\" id:\"ee22da31e1474547cf50bd8088fc5d6c8e704e452a43e39bbe25924253c4f2df\" pid:5911 exited_at:{seconds:1752066253 nanos:555580797}" Jul 9 13:04:16.936645 systemd[1]: Started sshd@22-139.178.70.105:22-139.178.68.195:34290.service - OpenSSH per-connection server daemon (139.178.68.195:34290). Jul 9 13:04:17.064536 sshd[5924]: Accepted publickey for core from 139.178.68.195 port 34290 ssh2: RSA SHA256:pHehh7tc90QOyf1uGohWVF4tJIie1SMOFA2c8G1DmZI Jul 9 13:04:17.076641 sshd-session[5924]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 13:04:17.083727 systemd-logind[1599]: New session 25 of user core. Jul 9 13:04:17.090803 systemd[1]: Started session-25.scope - Session 25 of User core. Jul 9 13:04:17.464695 sshd[5927]: Connection closed by 139.178.68.195 port 34290 Jul 9 13:04:17.465261 sshd-session[5924]: pam_unix(sshd:session): session closed for user core Jul 9 13:04:17.468351 systemd-logind[1599]: Session 25 logged out. Waiting for processes to exit. Jul 9 13:04:17.468721 systemd[1]: sshd@22-139.178.70.105:22-139.178.68.195:34290.service: Deactivated successfully. Jul 9 13:04:17.470002 systemd[1]: session-25.scope: Deactivated successfully. Jul 9 13:04:17.471176 systemd-logind[1599]: Removed session 25. Jul 9 13:04:22.474241 systemd[1]: Started sshd@23-139.178.70.105:22-139.178.68.195:48878.service - OpenSSH per-connection server daemon (139.178.68.195:48878). Jul 9 13:04:22.781156 sshd[5962]: Accepted publickey for core from 139.178.68.195 port 48878 ssh2: RSA SHA256:pHehh7tc90QOyf1uGohWVF4tJIie1SMOFA2c8G1DmZI Jul 9 13:04:22.783085 sshd-session[5962]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 13:04:22.787330 systemd-logind[1599]: New session 26 of user core. Jul 9 13:04:22.795738 systemd[1]: Started session-26.scope - Session 26 of User core. Jul 9 13:04:23.161889 containerd[1624]: time="2025-07-09T13:04:23.161509240Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e77ceae0c93cfcb1c385d466afd051672a8ee5655ed0c189ca239303d947007f\" id:\"31d2fd406aa84888df6df6f2e3af72e7de74027150f30d14869df19c40e70ca4\" pid:5951 exited_at:{seconds:1752066263 nanos:161322938}" Jul 9 13:04:23.814683 sshd[5965]: Connection closed by 139.178.68.195 port 48878 Jul 9 13:04:23.815911 sshd-session[5962]: pam_unix(sshd:session): session closed for user core Jul 9 13:04:23.818202 systemd[1]: sshd@23-139.178.70.105:22-139.178.68.195:48878.service: Deactivated successfully. Jul 9 13:04:23.819667 systemd[1]: session-26.scope: Deactivated successfully. Jul 9 13:04:23.831135 systemd-logind[1599]: Session 26 logged out. Waiting for processes to exit. Jul 9 13:04:23.832177 systemd-logind[1599]: Removed session 26.