Jul 15 05:10:45.718813 kernel: Linux version 6.12.36-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue Jul 15 03:28:48 -00 2025
Jul 15 05:10:45.718830 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=926b029026d98240a9e8b6527b65fc026ae523bea87c3b77ffd7237bcc7be4fb
Jul 15 05:10:45.718836 kernel: Disabled fast string operations
Jul 15 05:10:45.718841 kernel: BIOS-provided physical RAM map:
Jul 15 05:10:45.718845 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable
Jul 15 05:10:45.718849 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved
Jul 15 05:10:45.718855 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved
Jul 15 05:10:45.718859 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable
Jul 15 05:10:45.718863 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data
Jul 15 05:10:45.718867 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS
Jul 15 05:10:45.718872 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable
Jul 15 05:10:45.718876 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved
Jul 15 05:10:45.718880 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved
Jul 15 05:10:45.718884 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Jul 15 05:10:45.718891 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved
Jul 15 05:10:45.718895 kernel: NX (Execute Disable) protection: active
Jul 15 05:10:45.718900 kernel: APIC: Static calls initialized
Jul 15 05:10:45.718905 kernel: SMBIOS 2.7 present.
Jul 15 05:10:45.718910 kernel: DMI: VMware, Inc. VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020
Jul 15 05:10:45.718915 kernel: DMI: Memory slots populated: 1/128
Jul 15 05:10:45.718920 kernel: vmware: hypercall mode: 0x00
Jul 15 05:10:45.718925 kernel: Hypervisor detected: VMware
Jul 15 05:10:45.718930 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz
Jul 15 05:10:45.718934 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz
Jul 15 05:10:45.718939 kernel: vmware: using clock offset of 6013250671 ns
Jul 15 05:10:45.718944 kernel: tsc: Detected 3408.000 MHz processor
Jul 15 05:10:45.718949 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jul 15 05:10:45.718954 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jul 15 05:10:45.718959 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000
Jul 15 05:10:45.718964 kernel: total RAM covered: 3072M
Jul 15 05:10:45.718970 kernel: Found optimal setting for mtrr clean up
Jul 15 05:10:45.718976 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G
Jul 15 05:10:45.718981 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs
Jul 15 05:10:45.718985 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Jul 15 05:10:45.718990 kernel: Using GB pages for direct mapping
Jul 15 05:10:45.718995 kernel: ACPI: Early table checksum verification disabled
Jul 15 05:10:45.719000 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD )
Jul 15 05:10:45.719005 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272)
Jul 15 05:10:45.719010 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240)
Jul 15 05:10:45.719016 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001)
Jul 15 05:10:45.719023 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Jul 15 05:10:45.719032 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Jul 15 05:10:45.719038 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001)
Jul 15 05:10:45.719043 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? APIC 06040000 LTP 00000000)
Jul 15 05:10:45.719051 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001)
Jul 15 05:10:45.719060 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001)
Jul 15 05:10:45.719065 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001)
Jul 15 05:10:45.719070 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001)
Jul 15 05:10:45.719075 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66]
Jul 15 05:10:45.719081 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72]
Jul 15 05:10:45.719086 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Jul 15 05:10:45.719093 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Jul 15 05:10:45.719098 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54]
Jul 15 05:10:45.719103 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c]
Jul 15 05:10:45.719110 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea]
Jul 15 05:10:45.719117 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe]
Jul 15 05:10:45.719125 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756]
Jul 15 05:10:45.719130 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e]
Jul 15 05:10:45.719135 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Jul 15 05:10:45.719142 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Jul 15 05:10:45.719147 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug
Jul 15 05:10:45.719153 kernel: NUMA: Node 0 [mem 0x00001000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00001000-0x7fffffff]
Jul 15 05:10:45.719158 kernel: NODE_DATA(0) allocated [mem 0x7fff8dc0-0x7fffffff]
Jul 15 05:10:45.719164 kernel: Zone ranges:
Jul 15 05:10:45.719170 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Jul 15 05:10:45.719175 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff]
Jul 15 05:10:45.719180 kernel: Normal empty
Jul 15 05:10:45.719185 kernel: Device empty
Jul 15 05:10:45.719190 kernel: Movable zone start for each node
Jul 15 05:10:45.719195 kernel: Early memory node ranges
Jul 15 05:10:45.719200 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff]
Jul 15 05:10:45.719205 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff]
Jul 15 05:10:45.719210 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff]
Jul 15 05:10:45.719216 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff]
Jul 15 05:10:45.719221 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jul 15 05:10:45.719227 kernel: On node 0, zone DMA: 98 pages in unavailable ranges
Jul 15 05:10:45.719232 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges
Jul 15 05:10:45.719237 kernel: ACPI: PM-Timer IO Port: 0x1008
Jul 15 05:10:45.719242 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1])
Jul 15 05:10:45.719247 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1])
Jul 15 05:10:45.719252 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1])
Jul 15 05:10:45.719257 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1])
Jul 15 05:10:45.719263 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1])
Jul 15 05:10:45.719268 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1])
Jul 15 05:10:45.719273 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1])
Jul 15 05:10:45.719278 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1])
Jul 15 05:10:45.719283 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1])
Jul 15 05:10:45.719288 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1])
Jul 15 05:10:45.719293 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1])
Jul 15 05:10:45.719298 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1])
Jul 15 05:10:45.719303 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1])
Jul 15 05:10:45.719309 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1])
Jul 15 05:10:45.719314 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1])
Jul 15 05:10:45.719319 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1])
Jul 15 05:10:45.719324 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1])
Jul 15 05:10:45.719329 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1])
Jul 15 05:10:45.719334 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1])
Jul 15 05:10:45.719339 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1])
Jul 15 05:10:45.719344 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1])
Jul 15 05:10:45.719349 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1])
Jul 15 05:10:45.719354 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1])
Jul 15 05:10:45.719360 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1])
Jul 15 05:10:45.719365 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1])
Jul 15 05:10:45.719370 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1])
Jul 15 05:10:45.719375 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1])
Jul 15 05:10:45.719380 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1])
Jul 15 05:10:45.719385 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1])
Jul 15 05:10:45.719390 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1])
Jul 15 05:10:45.719395 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1])
Jul 15 05:10:45.719400 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1])
Jul 15 05:10:45.719405 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1])
Jul 15 05:10:45.719411 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1])
Jul 15 05:10:45.719416 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1])
Jul 15 05:10:45.719421 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1])
Jul 15 05:10:45.719426 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1])
Jul 15 05:10:45.719431 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1])
Jul 15 05:10:45.719436 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1])
Jul 15 05:10:45.719442 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1])
Jul 15 05:10:45.719451 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1])
Jul 15 05:10:45.719456 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1])
Jul 15 05:10:45.719461 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1])
Jul 15 05:10:45.719468 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1])
Jul 15 05:10:45.719473 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1])
Jul 15 05:10:45.719478 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1])
Jul 15 05:10:45.719484 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1])
Jul 15 05:10:45.719489 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1])
Jul 15 05:10:45.719494 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1])
Jul 15 05:10:45.719500 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1])
Jul 15 05:10:45.719505 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1])
Jul 15 05:10:45.719511 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1])
Jul 15 05:10:45.719517 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1])
Jul 15 05:10:45.719522 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1])
Jul 15 05:10:45.719527 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1])
Jul 15 05:10:45.719533 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1])
Jul 15 05:10:45.719538 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1])
Jul 15 05:10:45.719543 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1])
Jul 15 05:10:45.719548 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1])
Jul 15 05:10:45.719553 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1])
Jul 15 05:10:45.719561 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1])
Jul 15 05:10:45.719566 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1])
Jul 15 05:10:45.719572 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1])
Jul 15 05:10:45.719577 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1])
Jul 15 05:10:45.719582 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1])
Jul 15 05:10:45.719587 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1])
Jul 15 05:10:45.719593 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1])
Jul 15 05:10:45.719598 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1])
Jul 15 05:10:45.719603 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1])
Jul 15 05:10:45.719609 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1])
Jul 15 05:10:45.719615 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1])
Jul 15 05:10:45.719620 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1])
Jul 15 05:10:45.719625 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1])
Jul 15 05:10:45.719631 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1])
Jul 15 05:10:45.719636 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1])
Jul 15 05:10:45.719641 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1])
Jul 15 05:10:45.719647 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1])
Jul 15 05:10:45.719652 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1])
Jul 15 05:10:45.719657 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1])
Jul 15 05:10:45.719663 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1])
Jul 15 05:10:45.719670 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1])
Jul 15 05:10:45.719675 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1])
Jul 15 05:10:45.719680 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1])
Jul 15 05:10:45.719686 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1])
Jul 15 05:10:45.720755 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1])
Jul 15 05:10:45.720763 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1])
Jul 15 05:10:45.720769 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1])
Jul 15 05:10:45.720774 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1])
Jul 15 05:10:45.720780 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1])
Jul 15 05:10:45.720785 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1])
Jul 15 05:10:45.720793 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1])
Jul 15 05:10:45.720799 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1])
Jul 15 05:10:45.720805 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1])
Jul 15 05:10:45.720810 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1])
Jul 15 05:10:45.720815 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1])
Jul 15 05:10:45.720821 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1])
Jul 15 05:10:45.720826 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1])
Jul 15 05:10:45.720831 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1])
Jul 15 05:10:45.720837 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1])
Jul 15 05:10:45.720843 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1])
Jul 15 05:10:45.720849 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1])
Jul 15 05:10:45.720854 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1])
Jul 15 05:10:45.720859 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1])
Jul 15 05:10:45.720865 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1])
Jul 15 05:10:45.720870 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1])
Jul 15 05:10:45.720876 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1])
Jul 15 05:10:45.720881 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1])
Jul 15 05:10:45.720886 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1])
Jul 15 05:10:45.720892 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1])
Jul 15 05:10:45.720898 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1])
Jul 15 05:10:45.720904 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1])
Jul 15 05:10:45.720909 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1])
Jul 15 05:10:45.720914 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1])
Jul 15 05:10:45.720920 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1])
Jul 15 05:10:45.720925 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1])
Jul 15 05:10:45.720930 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1])
Jul 15 05:10:45.720936 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1])
Jul 15 05:10:45.720941 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1])
Jul 15 05:10:45.720948 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1])
Jul 15 05:10:45.720953 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1])
Jul 15 05:10:45.720959 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1])
Jul 15 05:10:45.720964 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1])
Jul 15 05:10:45.720969 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1])
Jul 15 05:10:45.720975 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1])
Jul 15 05:10:45.720980 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1])
Jul 15 05:10:45.720986 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1])
Jul 15 05:10:45.720995 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1])
Jul 15 05:10:45.721001 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1])
Jul 15 05:10:45.721008 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23
Jul 15 05:10:45.721014 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge)
Jul 15 05:10:45.721022 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jul 15 05:10:45.721028 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000
Jul 15 05:10:45.721033 kernel: TSC deadline timer available
Jul 15 05:10:45.721039 kernel: CPU topo: Max. logical packages: 128
Jul 15 05:10:45.721045 kernel: CPU topo: Max. logical dies: 128
Jul 15 05:10:45.721050 kernel: CPU topo: Max. dies per package: 1
Jul 15 05:10:45.721055 kernel: CPU topo: Max. threads per core: 1
Jul 15 05:10:45.721061 kernel: CPU topo: Num. cores per package: 1
Jul 15 05:10:45.721067 kernel: CPU topo: Num. threads per package: 1
Jul 15 05:10:45.721073 kernel: CPU topo: Allowing 2 present CPUs plus 126 hotplug CPUs
Jul 15 05:10:45.721078 kernel: [mem 0x80000000-0xefffffff] available for PCI devices
Jul 15 05:10:45.721084 kernel: Booting paravirtualized kernel on VMware hypervisor
Jul 15 05:10:45.721089 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jul 15 05:10:45.721095 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1
Jul 15 05:10:45.721101 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144
Jul 15 05:10:45.721106 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152
Jul 15 05:10:45.721112 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007
Jul 15 05:10:45.721118 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015
Jul 15 05:10:45.721124 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023
Jul 15 05:10:45.721129 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031
Jul 15 05:10:45.721135 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039
Jul 15 05:10:45.721140 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047
Jul 15 05:10:45.721145 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055
Jul 15 05:10:45.721151 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063
Jul 15 05:10:45.721156 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071
Jul 15 05:10:45.721162 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079
Jul 15 05:10:45.721168 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087
Jul 15 05:10:45.721173 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095
Jul 15 05:10:45.721179 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103
Jul 15 05:10:45.721184 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111
Jul 15 05:10:45.721189 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119
Jul 15 05:10:45.721195 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127
Jul 15 05:10:45.721201 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=926b029026d98240a9e8b6527b65fc026ae523bea87c3b77ffd7237bcc7be4fb
Jul 15 05:10:45.721207 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Jul 15 05:10:45.721214 kernel: random: crng init done
Jul 15 05:10:45.721219 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes
Jul 15 05:10:45.721225 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes
Jul 15 05:10:45.721230 kernel: printk: log_buf_len min size: 262144 bytes
Jul 15 05:10:45.721236 kernel: printk: log_buf_len: 1048576 bytes
Jul 15 05:10:45.721241 kernel: printk: early log buf free: 245592(93%)
Jul 15 05:10:45.721247 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Jul 15 05:10:45.721252 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jul 15 05:10:45.721258 kernel: Fallback order for Node 0: 0
Jul 15 05:10:45.721264 kernel: Built 1 zonelists, mobility grouping on. Total pages: 524157
Jul 15 05:10:45.721270 kernel: Policy zone: DMA32
Jul 15 05:10:45.721275 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jul 15 05:10:45.721281 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1
Jul 15 05:10:45.721290 kernel: ftrace: allocating 40097 entries in 157 pages
Jul 15 05:10:45.721298 kernel: ftrace: allocated 157 pages with 5 groups
Jul 15 05:10:45.721304 kernel: Dynamic Preempt: voluntary
Jul 15 05:10:45.721310 kernel: rcu: Preemptible hierarchical RCU implementation.
Jul 15 05:10:45.721317 kernel: rcu: RCU event tracing is enabled.
Jul 15 05:10:45.721324 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128.
Jul 15 05:10:45.721331 kernel: Trampoline variant of Tasks RCU enabled.
Jul 15 05:10:45.721337 kernel: Rude variant of Tasks RCU enabled.
Jul 15 05:10:45.721343 kernel: Tracing variant of Tasks RCU enabled.
Jul 15 05:10:45.721348 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jul 15 05:10:45.721353 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128
Jul 15 05:10:45.721359 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Jul 15 05:10:45.721364 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Jul 15 05:10:45.721370 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Jul 15 05:10:45.721377 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16
Jul 15 05:10:45.721382 kernel: rcu: srcu_init: Setting srcu_struct sizes to big.
Jul 15 05:10:45.721388 kernel: Console: colour VGA+ 80x25
Jul 15 05:10:45.721393 kernel: printk: legacy console [tty0] enabled
Jul 15 05:10:45.721398 kernel: printk: legacy console [ttyS0] enabled
Jul 15 05:10:45.721404 kernel: ACPI: Core revision 20240827
Jul 15 05:10:45.721409 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns
Jul 15 05:10:45.721415 kernel: APIC: Switch to symmetric I/O mode setup
Jul 15 05:10:45.721420 kernel: x2apic enabled
Jul 15 05:10:45.721427 kernel: APIC: Switched APIC routing to: physical x2apic
Jul 15 05:10:45.721432 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Jul 15 05:10:45.721438 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns
Jul 15 05:10:45.721444 kernel: Calibrating delay loop (skipped) preset value.. 6816.00 BogoMIPS (lpj=3408000)
Jul 15 05:10:45.721449 kernel: Disabled fast string operations
Jul 15 05:10:45.721454 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Jul 15 05:10:45.721460 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4
Jul 15 05:10:45.721465 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jul 15 05:10:45.721471 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit
Jul 15 05:10:45.721477 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Jul 15 05:10:45.721483 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Jul 15 05:10:45.721488 kernel: RETBleed: Mitigation: Enhanced IBRS
Jul 15 05:10:45.721494 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jul 15 05:10:45.721500 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jul 15 05:10:45.721505 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Jul 15 05:10:45.721511 kernel: SRBDS: Unknown: Dependent on hypervisor status
Jul 15 05:10:45.721516 kernel: GDS: Unknown: Dependent on hypervisor status
Jul 15 05:10:45.721522 kernel: ITS: Mitigation: Aligned branch/return thunks
Jul 15 05:10:45.721528 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jul 15 05:10:45.721534 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jul 15 05:10:45.721539 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jul 15 05:10:45.721545 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Jul 15 05:10:45.721550 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jul 15 05:10:45.721556 kernel: Freeing SMP alternatives memory: 32K
Jul 15 05:10:45.721561 kernel: pid_max: default: 131072 minimum: 1024
Jul 15 05:10:45.721567 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Jul 15 05:10:45.721572 kernel: landlock: Up and running.
Jul 15 05:10:45.721579 kernel: SELinux: Initializing.
Jul 15 05:10:45.721584 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Jul 15 05:10:45.721590 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Jul 15 05:10:45.721595 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd)
Jul 15 05:10:45.721601 kernel: Performance Events: Skylake events, core PMU driver.
Jul 15 05:10:45.721606 kernel: core: CPUID marked event: 'cpu cycles' unavailable
Jul 15 05:10:45.721612 kernel: core: CPUID marked event: 'instructions' unavailable
Jul 15 05:10:45.721617 kernel: core: CPUID marked event: 'bus cycles' unavailable
Jul 15 05:10:45.721622 kernel: core: CPUID marked event: 'cache references' unavailable
Jul 15 05:10:45.721629 kernel: core: CPUID marked event: 'cache misses' unavailable
Jul 15 05:10:45.721634 kernel: core: CPUID marked event: 'branch instructions' unavailable
Jul 15 05:10:45.721640 kernel: core: CPUID marked event: 'branch misses' unavailable
Jul 15 05:10:45.721645 kernel: ... version: 1
Jul 15 05:10:45.721651 kernel: ... bit width: 48
Jul 15 05:10:45.721656 kernel: ... generic registers: 4
Jul 15 05:10:45.721661 kernel: ... value mask: 0000ffffffffffff
Jul 15 05:10:45.721667 kernel: ... max period: 000000007fffffff
Jul 15 05:10:45.721674 kernel: ... fixed-purpose events: 0
Jul 15 05:10:45.721679 kernel: ... event mask: 000000000000000f
Jul 15 05:10:45.721687 kernel: signal: max sigframe size: 1776
Jul 15 05:10:45.723077 kernel: rcu: Hierarchical SRCU implementation.
Jul 15 05:10:45.723085 kernel: rcu: Max phase no-delay instances is 400.
Jul 15 05:10:45.723091 kernel: Timer migration: 3 hierarchy levels; 8 children per group; 3 crossnode level
Jul 15 05:10:45.723097 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Jul 15 05:10:45.723102 kernel: smp: Bringing up secondary CPUs ...
Jul 15 05:10:45.723108 kernel: smpboot: x86: Booting SMP configuration:
Jul 15 05:10:45.723114 kernel: .... node #0, CPUs: #1
Jul 15 05:10:45.723122 kernel: Disabled fast string operations
Jul 15 05:10:45.723128 kernel: smp: Brought up 1 node, 2 CPUs
Jul 15 05:10:45.723133 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS)
Jul 15 05:10:45.723139 kernel: Memory: 1924236K/2096628K available (14336K kernel code, 2430K rwdata, 9956K rodata, 54608K init, 2360K bss, 161008K reserved, 0K cma-reserved)
Jul 15 05:10:45.723145 kernel: devtmpfs: initialized
Jul 15 05:10:45.723150 kernel: x86/mm: Memory block size: 128MB
Jul 15 05:10:45.723156 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes)
Jul 15 05:10:45.723162 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jul 15 05:10:45.723168 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear)
Jul 15 05:10:45.723175 kernel: pinctrl core: initialized pinctrl subsystem
Jul 15 05:10:45.723180 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jul 15 05:10:45.723186 kernel: audit: initializing netlink subsys (disabled)
Jul 15 05:10:45.723191 kernel: audit: type=2000 audit(1752556242.322:1): state=initialized audit_enabled=0 res=1
Jul 15 05:10:45.723197 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jul 15 05:10:45.723203 kernel: thermal_sys: Registered thermal governor 'user_space'
Jul 15 05:10:45.723208 kernel: cpuidle: using governor menu
Jul 15 05:10:45.723214 kernel: Simple Boot Flag at 0x36 set to 0x80
Jul 15 05:10:45.723219 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jul 15 05:10:45.723226 kernel: dca service started, version 1.12.1
Jul 15 05:10:45.723231 kernel: PCI: ECAM [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) for domain 0000 [bus 00-7f]
Jul 15 05:10:45.723245 kernel: PCI: Using configuration type 1 for base access
Jul 15 05:10:45.723252 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jul 15 05:10:45.723258 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jul 15 05:10:45.723264 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jul 15 05:10:45.723269 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jul 15 05:10:45.723275 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jul 15 05:10:45.723281 kernel: ACPI: Added _OSI(Module Device)
Jul 15 05:10:45.723288 kernel: ACPI: Added _OSI(Processor Device)
Jul 15 05:10:45.723295 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jul 15 05:10:45.723301 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jul 15 05:10:45.723307 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored
Jul 15 05:10:45.723312 kernel: ACPI: Interpreter enabled
Jul 15 05:10:45.723318 kernel: ACPI: PM: (supports S0 S1 S5)
Jul 15 05:10:45.723324 kernel: ACPI: Using IOAPIC for interrupt routing
Jul 15 05:10:45.723330 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jul 15 05:10:45.723336 kernel: PCI: Using E820 reservations for host bridge windows
Jul 15 05:10:45.723343 kernel: ACPI: Enabled 4 GPEs in block 00 to 0F
Jul 15 05:10:45.723349 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f])
Jul 15 05:10:45.723467 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Jul 15 05:10:45.723546 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR]
Jul 15 05:10:45.723601 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability]
Jul 15 05:10:45.723611 kernel: PCI host bridge to bus 0000:00
Jul 15 05:10:45.723665 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jul 15 05:10:45.724485 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window]
Jul 15 05:10:45.724532 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jul 15 05:10:45.724576 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Jul 15 05:10:45.724620 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window]
Jul 15 05:10:45.724852 kernel: pci_bus 0000:00: root bus resource [bus 00-7f]
Jul 15 05:10:45.724921 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 conventional PCI endpoint
Jul 15 05:10:45.724986 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 conventional PCI bridge
Jul 15 05:10:45.725041 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Jul 15 05:10:45.725096 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 conventional PCI endpoint
Jul 15 05:10:45.725154 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a conventional PCI endpoint
Jul 15 05:10:45.725208 kernel: pci 0000:00:07.1: BAR 4 [io 0x1060-0x106f]
Jul 15 05:10:45.725258 kernel: pci 0000:00:07.1: BAR 0 [io 0x01f0-0x01f7]: legacy IDE quirk
Jul 15 05:10:45.725309 kernel: pci 0000:00:07.1: BAR 1 [io 0x03f6]: legacy IDE quirk
Jul 15 05:10:45.725368 kernel: pci 0000:00:07.1: BAR 2 [io 0x0170-0x0177]: legacy IDE quirk
Jul 15 05:10:45.725419 kernel: pci 0000:00:07.1: BAR 3 [io 0x0376]: legacy IDE quirk
Jul 15 05:10:45.725475 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Jul 15 05:10:45.725528 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed by PIIX4 ACPI
Jul 15 05:10:45.725577 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB
Jul 15 05:10:45.725646 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000 conventional PCI endpoint
Jul 15 05:10:45.726778 kernel: pci 0000:00:07.7: BAR 0 [io 0x1080-0x10bf]
Jul 15 05:10:45.726842 kernel: pci 0000:00:07.7: BAR 1 [mem 0xfebfe000-0xfebfffff 64bit]
Jul 15 05:10:45.726901 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 conventional PCI endpoint
Jul 15 05:10:45.726955 kernel: pci 0000:00:0f.0: BAR 0 [io 0x1070-0x107f]
Jul 15 05:10:45.727009 kernel: pci 0000:00:0f.0: BAR 1 [mem 0xe8000000-0xefffffff pref]
Jul 15 05:10:45.727060 kernel: pci 0000:00:0f.0: BAR 2 [mem 0xfe000000-0xfe7fffff]
Jul 15 05:10:45.727111 kernel: pci 0000:00:0f.0: ROM [mem 0x00000000-0x00007fff pref]
Jul 15 05:10:45.727162 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jul 15 05:10:45.727219 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 conventional PCI bridge
Jul 15 05:10:45.727757 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode)
Jul 15 05:10:45.727822 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff]
Jul 15 05:10:45.727874 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff]
Jul 15 05:10:45.727925 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref]
Jul 15 05:10:45.727980 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Jul 15 05:10:45.728034 kernel: pci 0000:00:15.0: PCI bridge to [bus 03]
Jul 15 05:10:45.728085 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff]
Jul 15 05:10:45.728136 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff]
Jul 15 05:10:45.728186 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold
Jul 15 05:10:45.728244 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Jul 15 05:10:45.728297 kernel: pci 0000:00:15.1: PCI bridge to [bus 04]
Jul 15 05:10:45.728349 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff]
Jul 15 05:10:45.728403 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff]
Jul 15 05:10:45.728467 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref]
Jul 15 05:10:45.728519 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold
Jul 15 05:10:45.728575 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Jul 15 05:10:45.728631 kernel: pci 0000:00:15.2: PCI bridge to [bus 05]
Jul 15 05:10:45.730284 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff]
Jul 15 05:10:45.730349 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Jul 15 05:10:45.730405 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Jul 15 05:10:45.730466 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold Jul 15 05:10:45.730528 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 15 05:10:45.730598 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Jul 15 05:10:45.730650 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Jul 15 05:10:45.730727 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Jul 15 05:10:45.730782 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold Jul 15 05:10:45.730852 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 15 05:10:45.730905 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Jul 15 05:10:45.730963 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Jul 15 05:10:45.731020 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Jul 15 05:10:45.731071 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold Jul 15 05:10:45.731127 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 15 05:10:45.731179 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Jul 15 05:10:45.731229 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Jul 15 05:10:45.731279 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Jul 15 05:10:45.731333 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot D3cold Jul 15 05:10:45.731402 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 15 05:10:45.731455 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Jul 15 05:10:45.731507 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Jul 15 05:10:45.731567 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Jul 15 
05:10:45.731619 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold Jul 15 05:10:45.731673 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 15 05:10:45.733761 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Jul 15 05:10:45.733826 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Jul 15 05:10:45.733880 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Jul 15 05:10:45.733931 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold Jul 15 05:10:45.733992 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 15 05:10:45.734044 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Jul 15 05:10:45.734095 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Jul 15 05:10:45.734146 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Jul 15 05:10:45.734198 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold Jul 15 05:10:45.734252 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 15 05:10:45.734303 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Jul 15 05:10:45.734353 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Jul 15 05:10:45.734409 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Jul 15 05:10:45.734468 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Jul 15 05:10:45.734531 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold Jul 15 05:10:45.734587 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 15 05:10:45.734641 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Jul 15 05:10:45.736705 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Jul 15 05:10:45.736774 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Jul 15 05:10:45.736839 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Jul 15 05:10:45.736895 kernel: pci 0000:00:16.2: PME# supported 
from D0 D3hot D3cold Jul 15 05:10:45.736953 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 15 05:10:45.737018 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Jul 15 05:10:45.737073 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Jul 15 05:10:45.737159 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Jul 15 05:10:45.737217 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold Jul 15 05:10:45.737276 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 15 05:10:45.737338 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Jul 15 05:10:45.737390 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Jul 15 05:10:45.737440 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Jul 15 05:10:45.737493 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold Jul 15 05:10:45.737553 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 15 05:10:45.737614 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Jul 15 05:10:45.737665 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Jul 15 05:10:45.737728 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Jul 15 05:10:45.737779 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold Jul 15 05:10:45.737834 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 15 05:10:45.737895 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Jul 15 05:10:45.737951 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Jul 15 05:10:45.738003 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Jul 15 05:10:45.738053 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold Jul 15 05:10:45.738107 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 15 05:10:45.738158 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Jul 15 
05:10:45.738208 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Jul 15 05:10:45.738268 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Jul 15 05:10:45.738333 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold Jul 15 05:10:45.738392 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 15 05:10:45.738443 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Jul 15 05:10:45.738493 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Jul 15 05:10:45.738543 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Jul 15 05:10:45.738592 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Jul 15 05:10:45.738654 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold Jul 15 05:10:45.741558 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 15 05:10:45.741654 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Jul 15 05:10:45.741724 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Jul 15 05:10:45.741802 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Jul 15 05:10:45.741862 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Jul 15 05:10:45.741926 kernel: pci 0000:00:17.1: PME# supported from D0 D3hot D3cold Jul 15 05:10:45.741983 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 15 05:10:45.742034 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Jul 15 05:10:45.742085 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Jul 15 05:10:45.742134 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Jul 15 05:10:45.742188 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Jul 15 05:10:45.742248 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold Jul 15 05:10:45.742304 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 15 05:10:45.742363 kernel: pci 
0000:00:17.3: PCI bridge to [bus 16] Jul 15 05:10:45.742415 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Jul 15 05:10:45.742465 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Jul 15 05:10:45.742516 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold Jul 15 05:10:45.742575 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 15 05:10:45.742627 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Jul 15 05:10:45.742680 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Jul 15 05:10:45.742744 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Jul 15 05:10:45.742805 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold Jul 15 05:10:45.742867 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 15 05:10:45.742919 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Jul 15 05:10:45.742973 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Jul 15 05:10:45.743022 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Jul 15 05:10:45.743072 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold Jul 15 05:10:45.743133 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 15 05:10:45.743189 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Jul 15 05:10:45.743239 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Jul 15 05:10:45.743299 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Jul 15 05:10:45.743357 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold Jul 15 05:10:45.743416 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 15 05:10:45.744109 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Jul 15 05:10:45.744814 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Jul 15 05:10:45.744879 kernel: pci 0000:00:17.7: bridge window [mem 
0xe5e00000-0xe5efffff 64bit pref] Jul 15 05:10:45.744933 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold Jul 15 05:10:45.744990 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 15 05:10:45.745046 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Jul 15 05:10:45.745097 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Jul 15 05:10:45.745152 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Jul 15 05:10:45.745204 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Jul 15 05:10:45.745254 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold Jul 15 05:10:45.745312 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 15 05:10:45.745363 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Jul 15 05:10:45.745426 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Jul 15 05:10:45.745477 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Jul 15 05:10:45.745527 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Jul 15 05:10:45.745581 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold Jul 15 05:10:45.745642 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 15 05:10:45.745721 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Jul 15 05:10:45.745775 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Jul 15 05:10:45.745829 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Jul 15 05:10:45.745879 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold Jul 15 05:10:45.745940 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 15 05:10:45.745995 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Jul 15 05:10:45.746045 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Jul 15 05:10:45.746105 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Jul 
15 05:10:45.746160 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold Jul 15 05:10:45.746219 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 15 05:10:45.746277 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Jul 15 05:10:45.746328 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Jul 15 05:10:45.746378 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Jul 15 05:10:45.746437 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold Jul 15 05:10:45.746495 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 15 05:10:45.746551 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Jul 15 05:10:45.746603 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Jul 15 05:10:45.746656 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Jul 15 05:10:45.746730 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold Jul 15 05:10:45.746788 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 15 05:10:45.746839 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Jul 15 05:10:45.746890 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Jul 15 05:10:45.746940 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Jul 15 05:10:45.746997 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold Jul 15 05:10:45.747059 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Jul 15 05:10:45.747110 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Jul 15 05:10:45.747160 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Jul 15 05:10:45.747210 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Jul 15 05:10:45.747260 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold Jul 15 05:10:45.747331 kernel: pci_bus 0000:01: extended config space not accessible Jul 15 05:10:45.747397 kernel: pci 
0000:00:01.0: PCI bridge to [bus 01] Jul 15 05:10:45.747456 kernel: pci_bus 0000:02: extended config space not accessible Jul 15 05:10:45.747465 kernel: acpiphp: Slot [32] registered Jul 15 05:10:45.747472 kernel: acpiphp: Slot [33] registered Jul 15 05:10:45.747478 kernel: acpiphp: Slot [34] registered Jul 15 05:10:45.747484 kernel: acpiphp: Slot [35] registered Jul 15 05:10:45.747490 kernel: acpiphp: Slot [36] registered Jul 15 05:10:45.747496 kernel: acpiphp: Slot [37] registered Jul 15 05:10:45.747502 kernel: acpiphp: Slot [38] registered Jul 15 05:10:45.747508 kernel: acpiphp: Slot [39] registered Jul 15 05:10:45.747516 kernel: acpiphp: Slot [40] registered Jul 15 05:10:45.747522 kernel: acpiphp: Slot [41] registered Jul 15 05:10:45.747527 kernel: acpiphp: Slot [42] registered Jul 15 05:10:45.747533 kernel: acpiphp: Slot [43] registered Jul 15 05:10:45.747539 kernel: acpiphp: Slot [44] registered Jul 15 05:10:45.747544 kernel: acpiphp: Slot [45] registered Jul 15 05:10:45.747550 kernel: acpiphp: Slot [46] registered Jul 15 05:10:45.747556 kernel: acpiphp: Slot [47] registered Jul 15 05:10:45.749707 kernel: acpiphp: Slot [48] registered Jul 15 05:10:45.749727 kernel: acpiphp: Slot [49] registered Jul 15 05:10:45.749737 kernel: acpiphp: Slot [50] registered Jul 15 05:10:45.749743 kernel: acpiphp: Slot [51] registered Jul 15 05:10:45.749749 kernel: acpiphp: Slot [52] registered Jul 15 05:10:45.749758 kernel: acpiphp: Slot [53] registered Jul 15 05:10:45.749764 kernel: acpiphp: Slot [54] registered Jul 15 05:10:45.749770 kernel: acpiphp: Slot [55] registered Jul 15 05:10:45.749776 kernel: acpiphp: Slot [56] registered Jul 15 05:10:45.749784 kernel: acpiphp: Slot [57] registered Jul 15 05:10:45.749791 kernel: acpiphp: Slot [58] registered Jul 15 05:10:45.749798 kernel: acpiphp: Slot [59] registered Jul 15 05:10:45.749804 kernel: acpiphp: Slot [60] registered Jul 15 05:10:45.749810 kernel: acpiphp: Slot [61] registered Jul 15 05:10:45.749819 kernel: acpiphp: Slot 
[62] registered Jul 15 05:10:45.749825 kernel: acpiphp: Slot [63] registered Jul 15 05:10:45.749907 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Jul 15 05:10:45.749979 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode) Jul 15 05:10:45.750034 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode) Jul 15 05:10:45.750096 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode) Jul 15 05:10:45.750156 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode) Jul 15 05:10:45.750220 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode) Jul 15 05:10:45.750279 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 PCIe Endpoint Jul 15 05:10:45.750333 kernel: pci 0000:03:00.0: BAR 0 [io 0x4000-0x4007] Jul 15 05:10:45.750385 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfd5f8000-0xfd5fffff 64bit] Jul 15 05:10:45.750437 kernel: pci 0000:03:00.0: ROM [mem 0x00000000-0x0000ffff pref] Jul 15 05:10:45.750490 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Jul 15 05:10:45.750550 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Jul 15 05:10:45.750607 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Jul 15 05:10:45.750676 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Jul 15 05:10:45.750996 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Jul 15 05:10:45.751074 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Jul 15 05:10:45.751131 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Jul 15 05:10:45.751185 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Jul 15 05:10:45.751242 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Jul 15 05:10:45.751295 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Jul 15 05:10:45.751352 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 PCIe Endpoint Jul 15 05:10:45.751406 kernel: pci 0000:0b:00.0: BAR 0 [mem 0xfd4fc000-0xfd4fcfff] Jul 15 05:10:45.751458 kernel: pci 0000:0b:00.0: BAR 1 [mem 0xfd4fd000-0xfd4fdfff] Jul 15 05:10:45.751509 kernel: pci 0000:0b:00.0: BAR 2 [mem 0xfd4fe000-0xfd4fffff] Jul 15 05:10:45.751560 kernel: pci 0000:0b:00.0: BAR 3 [io 0x5000-0x500f] Jul 15 05:10:45.751614 kernel: pci 0000:0b:00.0: ROM [mem 0x00000000-0x0000ffff pref] Jul 15 05:10:45.751671 kernel: pci 0000:0b:00.0: supports D1 D2 Jul 15 05:10:45.751738 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold Jul 15 05:10:45.751791 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Jul 15 05:10:45.751842 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Jul 15 05:10:45.751901 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Jul 15 05:10:45.751968 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Jul 15 05:10:45.752022 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Jul 15 05:10:45.752078 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Jul 15 05:10:45.752131 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Jul 15 05:10:45.752184 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Jul 15 05:10:45.752235 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Jul 15 05:10:45.752287 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Jul 15 05:10:45.752339 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Jul 15 05:10:45.752391 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Jul 15 05:10:45.752442 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Jul 15 05:10:45.752496 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Jul 15 05:10:45.752556 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Jul 15 05:10:45.752616 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Jul 15 05:10:45.752674 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Jul 15 05:10:45.753391 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Jul 15 05:10:45.753450 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Jul 15 05:10:45.753875 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Jul 15 05:10:45.753939 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Jul 15 05:10:45.753994 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Jul 15 05:10:45.754057 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Jul 15 05:10:45.754113 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Jul 15 05:10:45.754166 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Jul 15 05:10:45.754175 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9 Jul 15 05:10:45.754181 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0 Jul 15 05:10:45.754189 kernel: ACPI: PCI: Interrupt link LNKB 
disabled Jul 15 05:10:45.754196 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Jul 15 05:10:45.754202 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10 Jul 15 05:10:45.754207 kernel: iommu: Default domain type: Translated Jul 15 05:10:45.754213 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Jul 15 05:10:45.754219 kernel: PCI: Using ACPI for IRQ routing Jul 15 05:10:45.754225 kernel: PCI: pci_cache_line_size set to 64 bytes Jul 15 05:10:45.754231 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff] Jul 15 05:10:45.754237 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff] Jul 15 05:10:45.754288 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device Jul 15 05:10:45.754339 kernel: pci 0000:00:0f.0: vgaarb: bridge control possible Jul 15 05:10:45.754389 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Jul 15 05:10:45.754401 kernel: vgaarb: loaded Jul 15 05:10:45.754409 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 Jul 15 05:10:45.754416 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter Jul 15 05:10:45.754421 kernel: clocksource: Switched to clocksource tsc-early Jul 15 05:10:45.754430 kernel: VFS: Disk quotas dquot_6.6.0 Jul 15 05:10:45.754436 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jul 15 05:10:45.754444 kernel: pnp: PnP ACPI init Jul 15 05:10:45.754502 kernel: system 00:00: [io 0x1000-0x103f] has been reserved Jul 15 05:10:45.754550 kernel: system 00:00: [io 0x1040-0x104f] has been reserved Jul 15 05:10:45.754595 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved Jul 15 05:10:45.754645 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved Jul 15 05:10:45.754742 kernel: pnp 00:06: [dma 2] Jul 15 05:10:45.754794 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved Jul 15 05:10:45.754844 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved Jul 15 
05:10:45.754891 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved Jul 15 05:10:45.754903 kernel: pnp: PnP ACPI: found 8 devices Jul 15 05:10:45.754909 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Jul 15 05:10:45.754916 kernel: NET: Registered PF_INET protocol family Jul 15 05:10:45.754924 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Jul 15 05:10:45.754932 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Jul 15 05:10:45.754941 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jul 15 05:10:45.754947 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Jul 15 05:10:45.754954 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Jul 15 05:10:45.754960 kernel: TCP: Hash tables configured (established 16384 bind 16384) Jul 15 05:10:45.754965 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Jul 15 05:10:45.754971 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Jul 15 05:10:45.754977 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jul 15 05:10:45.754983 kernel: NET: Registered PF_XDP protocol family Jul 15 05:10:45.755038 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 Jul 15 05:10:45.755094 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Jul 15 05:10:45.755146 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Jul 15 05:10:45.755748 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Jul 15 05:10:45.755808 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Jul 15 05:10:45.755863 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Jul 15 05:10:45.755917 
kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Jul 15 05:10:45.755969 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Jul 15 05:10:45.756021 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Jul 15 05:10:45.756078 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Jul 15 05:10:45.756130 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Jul 15 05:10:45.756182 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Jul 15 05:10:45.756234 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Jul 15 05:10:45.756286 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Jul 15 05:10:45.756339 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Jul 15 05:10:45.756401 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Jul 15 05:10:45.756457 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Jul 15 05:10:45.756510 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Jul 15 05:10:45.756566 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Jul 15 05:10:45.756617 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Jul 15 05:10:45.756668 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Jul 15 05:10:45.756749 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Jul 15 05:10:45.756812 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000 Jul 15 05:10:45.756864 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref]: assigned Jul 15 05:10:45.756923 kernel: pci 
0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref]: assigned Jul 15 05:10:45.756981 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.757032 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.757082 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.757142 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.757194 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.757244 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.757434 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.757500 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.757554 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.757832 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.757889 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.757942 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.758007 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.758422 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.758488 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.758546 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.758600 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.758652 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.758720 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space 
Jul 15 05:10:45.758773 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.758824 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.758875 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.758927 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.758987 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.759046 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.759109 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.759165 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.759216 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.759287 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.759354 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.759414 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.759471 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.759523 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.759579 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.759648 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.759824 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.759889 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.759947 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.760005 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: 
can't assign; no space Jul 15 05:10:45.760057 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.760313 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.760369 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.760422 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.760472 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.760523 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.760591 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.760644 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.760731 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.760797 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.760858 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.760930 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.760987 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.761039 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.761090 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.761141 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.761193 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.761249 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.761305 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.761360 kernel: pci 0000:00:17.5: bridge 
window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.761416 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.761484 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.761555 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.761621 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.761674 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.761741 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.761798 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.761856 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.761907 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.761959 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.762009 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.762060 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.762111 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.762162 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.762212 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.762267 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.762317 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.762374 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.762433 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.762486 kernel: pci 
0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.762537 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.762588 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.762653 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.762749 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space Jul 15 05:10:45.762807 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign Jul 15 05:10:45.762873 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jul 15 05:10:45.762929 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] Jul 15 05:10:45.762981 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Jul 15 05:10:45.763032 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Jul 15 05:10:45.763082 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Jul 15 05:10:45.763138 kernel: pci 0000:03:00.0: ROM [mem 0xfd500000-0xfd50ffff pref]: assigned Jul 15 05:10:45.763194 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Jul 15 05:10:45.763250 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Jul 15 05:10:45.763306 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Jul 15 05:10:45.763357 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref] Jul 15 05:10:45.763409 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Jul 15 05:10:45.763460 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Jul 15 05:10:45.763510 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Jul 15 05:10:45.763564 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Jul 15 05:10:45.763618 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Jul 15 05:10:45.763672 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Jul 15 05:10:45.763758 kernel: pci 0000:00:15.2: bridge window 
[mem 0xfcd00000-0xfcdfffff] Jul 15 05:10:45.763811 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Jul 15 05:10:45.763861 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Jul 15 05:10:45.763921 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Jul 15 05:10:45.763978 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Jul 15 05:10:45.764039 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Jul 15 05:10:45.764091 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Jul 15 05:10:45.764154 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Jul 15 05:10:45.764222 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Jul 15 05:10:45.764274 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Jul 15 05:10:45.764326 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Jul 15 05:10:45.764377 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Jul 15 05:10:45.764432 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Jul 15 05:10:45.764489 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Jul 15 05:10:45.764544 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Jul 15 05:10:45.764595 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Jul 15 05:10:45.764647 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Jul 15 05:10:45.764718 kernel: pci 0000:0b:00.0: ROM [mem 0xfd400000-0xfd40ffff pref]: assigned Jul 15 05:10:45.764773 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Jul 15 05:10:45.764824 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Jul 15 05:10:45.764884 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Jul 15 05:10:45.764941 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref] Jul 15 05:10:45.765003 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Jul 15 05:10:45.765061 
kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Jul 15 05:10:45.765113 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Jul 15 05:10:45.765163 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Jul 15 05:10:45.765226 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Jul 15 05:10:45.765277 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Jul 15 05:10:45.765337 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Jul 15 05:10:45.765388 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Jul 15 05:10:45.765440 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Jul 15 05:10:45.765491 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Jul 15 05:10:45.765541 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Jul 15 05:10:45.765595 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Jul 15 05:10:45.765656 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Jul 15 05:10:45.765734 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Jul 15 05:10:45.765788 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Jul 15 05:10:45.765839 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Jul 15 05:10:45.765901 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Jul 15 05:10:45.765960 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Jul 15 05:10:45.766020 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Jul 15 05:10:45.766071 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Jul 15 05:10:45.766124 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Jul 15 05:10:45.766174 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Jul 15 05:10:45.766224 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Jul 15 05:10:45.766287 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Jul 
15 05:10:45.766346 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Jul 15 05:10:45.766401 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Jul 15 05:10:45.766452 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Jul 15 05:10:45.766506 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Jul 15 05:10:45.766570 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Jul 15 05:10:45.766630 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Jul 15 05:10:45.766681 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Jul 15 05:10:45.766758 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Jul 15 05:10:45.766810 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Jul 15 05:10:45.766860 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Jul 15 05:10:45.766914 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Jul 15 05:10:45.766966 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Jul 15 05:10:45.767016 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Jul 15 05:10:45.767066 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Jul 15 05:10:45.767119 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Jul 15 05:10:45.767169 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Jul 15 05:10:45.767219 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Jul 15 05:10:45.767271 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Jul 15 05:10:45.767324 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Jul 15 05:10:45.767383 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Jul 15 05:10:45.767438 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Jul 15 05:10:45.767488 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Jul 15 05:10:45.767539 kernel: pci 0000:00:17.6: bridge window [mem 
0xe6200000-0xe62fffff 64bit pref] Jul 15 05:10:45.767593 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Jul 15 05:10:45.767654 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Jul 15 05:10:45.767730 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Jul 15 05:10:45.767786 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Jul 15 05:10:45.767838 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Jul 15 05:10:45.767888 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Jul 15 05:10:45.767939 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Jul 15 05:10:45.767990 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Jul 15 05:10:45.768040 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Jul 15 05:10:45.768100 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Jul 15 05:10:45.768151 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Jul 15 05:10:45.768206 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Jul 15 05:10:45.768256 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Jul 15 05:10:45.768308 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Jul 15 05:10:45.768361 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Jul 15 05:10:45.769715 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Jul 15 05:10:45.769795 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Jul 15 05:10:45.769854 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Jul 15 05:10:45.769909 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Jul 15 05:10:45.769964 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Jul 15 05:10:45.770026 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Jul 15 05:10:45.770086 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Jul 15 05:10:45.770138 kernel: pci 0000:00:18.5: 
bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Jul 15 05:10:45.770190 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Jul 15 05:10:45.770241 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Jul 15 05:10:45.770291 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Jul 15 05:10:45.770346 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Jul 15 05:10:45.770397 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Jul 15 05:10:45.770447 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Jul 15 05:10:45.770497 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] Jul 15 05:10:45.770548 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] Jul 15 05:10:45.770602 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] Jul 15 05:10:45.770647 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] Jul 15 05:10:45.770708 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] Jul 15 05:10:45.770760 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] Jul 15 05:10:45.770812 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] Jul 15 05:10:45.770865 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] Jul 15 05:10:45.770911 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] Jul 15 05:10:45.770968 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] Jul 15 05:10:45.771015 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] Jul 15 05:10:45.771061 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] Jul 15 05:10:45.771110 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] Jul 15 05:10:45.771164 kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff] Jul 15 05:10:45.771210 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] Jul 15 05:10:45.771267 kernel: pci_bus 0000:03: resource 2 [mem 
0xc0000000-0xc01fffff 64bit pref] Jul 15 05:10:45.771321 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] Jul 15 05:10:45.771368 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] Jul 15 05:10:45.771413 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] Jul 15 05:10:45.771466 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] Jul 15 05:10:45.771513 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] Jul 15 05:10:45.771559 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] Jul 15 05:10:45.771608 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] Jul 15 05:10:45.771655 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] Jul 15 05:10:45.771724 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] Jul 15 05:10:45.771774 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] Jul 15 05:10:45.771826 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] Jul 15 05:10:45.771878 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] Jul 15 05:10:45.771936 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] Jul 15 05:10:45.771983 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] Jul 15 05:10:45.772033 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] Jul 15 05:10:45.772082 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] Jul 15 05:10:45.772132 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] Jul 15 05:10:45.772178 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] Jul 15 05:10:45.772223 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] Jul 15 05:10:45.772275 kernel: pci_bus 0000:0c: resource 0 [io 0x9000-0x9fff] Jul 15 05:10:45.772320 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] Jul 15 05:10:45.772368 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 
64bit pref] Jul 15 05:10:45.772418 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] Jul 15 05:10:45.772464 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] Jul 15 05:10:45.772515 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] Jul 15 05:10:45.772577 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] Jul 15 05:10:45.772626 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] Jul 15 05:10:45.772676 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] Jul 15 05:10:45.772785 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] Jul 15 05:10:45.772850 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] Jul 15 05:10:45.772897 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] Jul 15 05:10:45.772948 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] Jul 15 05:10:45.772994 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] Jul 15 05:10:45.773044 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] Jul 15 05:10:45.773094 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] Jul 15 05:10:45.773143 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] Jul 15 05:10:45.773189 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] Jul 15 05:10:45.773234 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] Jul 15 05:10:45.773285 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] Jul 15 05:10:45.773331 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] Jul 15 05:10:45.773379 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] Jul 15 05:10:45.773429 kernel: pci_bus 0000:15: resource 0 [io 0xe000-0xefff] Jul 15 05:10:45.773475 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] Jul 15 05:10:45.773532 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] Jul 15 
05:10:45.773591 kernel: pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] Jul 15 05:10:45.773638 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] Jul 15 05:10:45.773688 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] Jul 15 05:10:45.773750 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] Jul 15 05:10:45.773813 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] Jul 15 05:10:45.773860 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] Jul 15 05:10:45.773913 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] Jul 15 05:10:45.773960 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] Jul 15 05:10:45.774011 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] Jul 15 05:10:45.774060 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] Jul 15 05:10:45.774110 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Jul 15 05:10:45.774158 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] Jul 15 05:10:45.774203 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] Jul 15 05:10:45.774253 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] Jul 15 05:10:45.774332 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] Jul 15 05:10:45.774384 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] Jul 15 05:10:45.774439 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] Jul 15 05:10:45.774485 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] Jul 15 05:10:45.774536 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] Jul 15 05:10:45.774583 kernel: pci_bus 0000:1e: resource 2 [mem 0xe6d00000-0xe6dfffff 64bit pref] Jul 15 05:10:45.774634 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] Jul 15 05:10:45.774680 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] 
Jul 15 05:10:45.774745 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] Jul 15 05:10:45.774792 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] Jul 15 05:10:45.774844 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] Jul 15 05:10:45.774892 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] Jul 15 05:10:45.774942 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] Jul 15 05:10:45.775000 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] Jul 15 05:10:45.775066 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Jul 15 05:10:45.775077 kernel: PCI: CLS 32 bytes, default 64 Jul 15 05:10:45.775084 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Jul 15 05:10:45.775091 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Jul 15 05:10:45.775097 kernel: clocksource: Switched to clocksource tsc Jul 15 05:10:45.775103 kernel: Initialise system trusted keyrings Jul 15 05:10:45.775109 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Jul 15 05:10:45.775114 kernel: Key type asymmetric registered Jul 15 05:10:45.775120 kernel: Asymmetric key parser 'x509' registered Jul 15 05:10:45.775128 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Jul 15 05:10:45.775134 kernel: io scheduler mq-deadline registered Jul 15 05:10:45.775140 kernel: io scheduler kyber registered Jul 15 05:10:45.775146 kernel: io scheduler bfq registered Jul 15 05:10:45.775204 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 Jul 15 05:10:45.775261 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 15 05:10:45.775314 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 Jul 15 05:10:45.775368 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 
AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 15 05:10:45.775420 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 Jul 15 05:10:45.775471 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 15 05:10:45.775524 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 Jul 15 05:10:45.775576 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 15 05:10:45.775628 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 Jul 15 05:10:45.775679 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 15 05:10:45.775745 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 Jul 15 05:10:45.775804 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 15 05:10:45.775856 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 Jul 15 05:10:45.775943 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 15 05:10:45.776015 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 Jul 15 05:10:45.776074 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 15 05:10:45.776140 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 Jul 15 05:10:45.776191 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 15 05:10:45.776246 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 Jul 15 05:10:45.776304 kernel: pcieport 
0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 15 05:10:45.776362 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 Jul 15 05:10:45.776415 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 15 05:10:45.776467 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 Jul 15 05:10:45.776518 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 15 05:10:45.776586 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 Jul 15 05:10:45.776640 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 15 05:10:45.776704 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 Jul 15 05:10:45.776761 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 15 05:10:45.776814 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 Jul 15 05:10:45.776880 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 15 05:10:45.776937 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 Jul 15 05:10:45.776988 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 15 05:10:45.777041 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 Jul 15 05:10:45.777095 kernel: pcieport 0000:00:17.0: pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jul 15 05:10:45.777147 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 Jul 15 
05:10:45.777198 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jul 15 05:10:45.777250 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42
Jul 15 05:10:45.777300 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jul 15 05:10:45.777352 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43
Jul 15 05:10:45.777403 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jul 15 05:10:45.777472 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44
Jul 15 05:10:45.777527 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jul 15 05:10:45.777580 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45
Jul 15 05:10:45.777632 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jul 15 05:10:45.777683 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46
Jul 15 05:10:45.778179 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jul 15 05:10:45.778238 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47
Jul 15 05:10:45.778293 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jul 15 05:10:45.778350 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48
Jul 15 05:10:45.778401 kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jul 15 05:10:45.778455 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49
Jul 15 05:10:45.778507 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jul 15 05:10:45.778567 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50
Jul 15 05:10:45.778628 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jul 15 05:10:45.778686 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51
Jul 15 05:10:45.778761 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jul 15 05:10:45.778817 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52
Jul 15 05:10:45.778877 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jul 15 05:10:45.779449 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53
Jul 15 05:10:45.779524 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jul 15 05:10:45.779592 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54
Jul 15 05:10:45.779646 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jul 15 05:10:45.779715 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55
Jul 15 05:10:45.779773 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jul 15 05:10:45.779784 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Jul 15 05:10:45.779792 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jul 15 05:10:45.779799 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jul 15 05:10:45.779805 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12
Jul 15 05:10:45.779812 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jul 15 05:10:45.779818 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jul 15 05:10:45.779872 kernel: rtc_cmos 00:01: registered as rtc0
Jul 15 05:10:45.779920 kernel: rtc_cmos 00:01: setting system clock to 2025-07-15T05:10:45 UTC (1752556245)
Jul 15 05:10:45.779930 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Jul 15 05:10:45.779973 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram
Jul 15 05:10:45.779982 kernel: intel_pstate: CPU model not supported
Jul 15 05:10:45.779989 kernel: NET: Registered PF_INET6 protocol family
Jul 15 05:10:45.779995 kernel: Segment Routing with IPv6
Jul 15 05:10:45.780001 kernel: In-situ OAM (IOAM) with IPv6
Jul 15 05:10:45.780009 kernel: NET: Registered PF_PACKET protocol family
Jul 15 05:10:45.780016 kernel: Key type dns_resolver registered
Jul 15 05:10:45.780022 kernel: IPI shorthand broadcast: enabled
Jul 15 05:10:45.780029 kernel: sched_clock: Marking stable (2789099910, 179545852)->(2985462360, -16816598)
Jul 15 05:10:45.780036 kernel: registered taskstats version 1
Jul 15 05:10:45.780042 kernel: Loading compiled-in X.509 certificates
Jul 15 05:10:45.780049 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.36-flatcar: a24478b628e55368911ce1800a2bd6bc158938c7'
Jul 15 05:10:45.780055 kernel: Demotion targets for Node 0: null
Jul 15 05:10:45.780061 kernel: Key type .fscrypt registered
Jul 15 05:10:45.780068 kernel: Key type fscrypt-provisioning registered
Jul 15 05:10:45.780074 kernel: ima: No TPM chip found, activating TPM-bypass!
Jul 15 05:10:45.780081 kernel: ima: Allocated hash algorithm: sha1
Jul 15 05:10:45.780087 kernel: ima: No architecture policies found
Jul 15 05:10:45.780093 kernel: clk: Disabling unused clocks
Jul 15 05:10:45.780100 kernel: Warning: unable to open an initial console.
Jul 15 05:10:45.780107 kernel: Freeing unused kernel image (initmem) memory: 54608K
Jul 15 05:10:45.780113 kernel: Write protecting the kernel read-only data: 24576k
Jul 15 05:10:45.780120 kernel: Freeing unused kernel image (rodata/data gap) memory: 284K
Jul 15 05:10:45.780127 kernel: Run /init as init process
Jul 15 05:10:45.780136 kernel: with arguments:
Jul 15 05:10:45.780147 kernel: /init
Jul 15 05:10:45.780154 kernel: with environment:
Jul 15 05:10:45.780160 kernel: HOME=/
Jul 15 05:10:45.780166 kernel: TERM=linux
Jul 15 05:10:45.780174 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Jul 15 05:10:45.780182 systemd[1]: Successfully made /usr/ read-only.
Jul 15 05:10:45.780191 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Jul 15 05:10:45.780200 systemd[1]: Detected virtualization vmware.
Jul 15 05:10:45.780206 systemd[1]: Detected architecture x86-64.
Jul 15 05:10:45.780212 systemd[1]: Running in initrd.
Jul 15 05:10:45.780218 systemd[1]: No hostname configured, using default hostname.
Jul 15 05:10:45.780225 systemd[1]: Hostname set to .
Jul 15 05:10:45.780231 systemd[1]: Initializing machine ID from random generator.
Jul 15 05:10:45.780238 systemd[1]: Queued start job for default target initrd.target.
Jul 15 05:10:45.780245 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jul 15 05:10:45.780252 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jul 15 05:10:45.780260 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Jul 15 05:10:45.780266 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Jul 15 05:10:45.780275 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Jul 15 05:10:45.780282 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Jul 15 05:10:45.780289 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Jul 15 05:10:45.780297 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Jul 15 05:10:45.780303 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jul 15 05:10:45.780310 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Jul 15 05:10:45.780317 systemd[1]: Reached target paths.target - Path Units.
Jul 15 05:10:45.780323 systemd[1]: Reached target slices.target - Slice Units.
Jul 15 05:10:45.780330 systemd[1]: Reached target swap.target - Swaps.
Jul 15 05:10:45.780337 systemd[1]: Reached target timers.target - Timer Units.
Jul 15 05:10:45.780343 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Jul 15 05:10:45.780351 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jul 15 05:10:45.780358 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Jul 15 05:10:45.780365 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Jul 15 05:10:45.780372 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Jul 15 05:10:45.780378 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Jul 15 05:10:45.780385 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Jul 15 05:10:45.780392 systemd[1]: Reached target sockets.target - Socket Units.
Jul 15 05:10:45.780402 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Jul 15 05:10:45.780412 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Jul 15 05:10:45.780423 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Jul 15 05:10:45.780430 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Jul 15 05:10:45.780436 systemd[1]: Starting systemd-fsck-usr.service...
Jul 15 05:10:45.780442 systemd[1]: Starting systemd-journald.service - Journal Service...
Jul 15 05:10:45.780449 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Jul 15 05:10:45.780456 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jul 15 05:10:45.780464 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Jul 15 05:10:45.780473 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Jul 15 05:10:45.780480 systemd[1]: Finished systemd-fsck-usr.service.
Jul 15 05:10:45.780486 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Jul 15 05:10:45.780509 systemd-journald[243]: Collecting audit messages is disabled.
Jul 15 05:10:45.780527 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jul 15 05:10:45.780534 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Jul 15 05:10:45.780541 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jul 15 05:10:45.780557 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Jul 15 05:10:45.780568 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jul 15 05:10:45.780580 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jul 15 05:10:45.780587 kernel: Bridge firewalling registered
Jul 15 05:10:45.780593 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jul 15 05:10:45.780600 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Jul 15 05:10:45.780606 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Jul 15 05:10:45.780613 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Jul 15 05:10:45.780620 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Jul 15 05:10:45.780627 systemd-journald[243]: Journal started
Jul 15 05:10:45.780644 systemd-journald[243]: Runtime Journal (/run/log/journal/7536435b1b1b462e8a82d41a426b922f) is 4.8M, max 38.8M, 34M free.
Jul 15 05:10:45.725940 systemd-modules-load[245]: Inserted module 'overlay'
Jul 15 05:10:45.761073 systemd-modules-load[245]: Inserted module 'br_netfilter'
Jul 15 05:10:45.783711 systemd[1]: Started systemd-journald.service - Journal Service.
Jul 15 05:10:45.785417 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Jul 15 05:10:45.785931 dracut-cmdline[268]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=926b029026d98240a9e8b6527b65fc026ae523bea87c3b77ffd7237bcc7be4fb
Jul 15 05:10:45.794610 systemd-tmpfiles[290]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Jul 15 05:10:45.797462 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Jul 15 05:10:45.798750 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Jul 15 05:10:45.830263 systemd-resolved[318]: Positive Trust Anchors:
Jul 15 05:10:45.830275 systemd-resolved[318]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Jul 15 05:10:45.830298 systemd-resolved[318]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Jul 15 05:10:45.832239 systemd-resolved[318]: Defaulting to hostname 'linux'.
Jul 15 05:10:45.833270 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Jul 15 05:10:45.833893 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Jul 15 05:10:45.850715 kernel: SCSI subsystem initialized
Jul 15 05:10:45.868709 kernel: Loading iSCSI transport class v2.0-870.
Jul 15 05:10:45.876708 kernel: iscsi: registered transport (tcp)
Jul 15 05:10:45.899708 kernel: iscsi: registered transport (qla4xxx)
Jul 15 05:10:45.899753 kernel: QLogic iSCSI HBA Driver
Jul 15 05:10:45.909528 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Jul 15 05:10:45.921713 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Jul 15 05:10:45.922523 systemd[1]: Reached target network-pre.target - Preparation for Network.
Jul 15 05:10:45.944885 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Jul 15 05:10:45.945717 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Jul 15 05:10:45.982714 kernel: raid6: avx2x4 gen() 43570 MB/s
Jul 15 05:10:45.999700 kernel: raid6: avx2x2 gen() 52864 MB/s
Jul 15 05:10:46.016868 kernel: raid6: avx2x1 gen() 44537 MB/s
Jul 15 05:10:46.016894 kernel: raid6: using algorithm avx2x2 gen() 52864 MB/s
Jul 15 05:10:46.034890 kernel: raid6: .... xor() 31795 MB/s, rmw enabled
Jul 15 05:10:46.034925 kernel: raid6: using avx2x2 recovery algorithm
Jul 15 05:10:46.048706 kernel: xor: automatically using best checksumming function avx
Jul 15 05:10:46.152715 kernel: Btrfs loaded, zoned=no, fsverity=no
Jul 15 05:10:46.155916 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Jul 15 05:10:46.156850 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jul 15 05:10:46.175855 systemd-udevd[492]: Using default interface naming scheme 'v255'.
Jul 15 05:10:46.179378 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jul 15 05:10:46.180383 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Jul 15 05:10:46.195246 dracut-pre-trigger[497]: rd.md=0: removing MD RAID activation
Jul 15 05:10:46.209527 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Jul 15 05:10:46.210470 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Jul 15 05:10:46.289429 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Jul 15 05:10:46.290849 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Jul 15 05:10:46.363707 kernel: VMware PVSCSI driver - version 1.0.7.0-k
Jul 15 05:10:46.367569 kernel: vmw_pvscsi: using 64bit dma
Jul 15 05:10:46.367604 kernel: vmw_pvscsi: max_id: 16
Jul 15 05:10:46.367612 kernel: vmw_pvscsi: setting ring_pages to 8
Jul 15 05:10:46.375007 kernel: vmw_pvscsi: enabling reqCallThreshold
Jul 15 05:10:46.375042 kernel: vmw_pvscsi: driver-based request coalescing enabled
Jul 15 05:10:46.375050 kernel: vmw_pvscsi: using MSI-X
Jul 15 05:10:46.380705 kernel: VMware vmxnet3 virtual NIC driver - version 1.9.0.0-k-NAPI
Jul 15 05:10:46.384706 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254
Jul 15 05:10:46.387833 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2
Jul 15 05:10:46.391640 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0
Jul 15 05:10:46.391995 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6
Jul 15 05:10:46.392014 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps
Jul 15 05:10:46.397023 (udev-worker)[541]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte.
Jul 15 05:10:46.399711 kernel: cryptd: max_cpu_qlen set to 1000
Jul 15 05:10:46.402719 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0
Jul 15 05:10:46.409774 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jul 15 05:10:46.409871 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jul 15 05:10:46.410434 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Jul 15 05:10:46.411880 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jul 15 05:10:46.420742 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB)
Jul 15 05:10:46.421235 kernel: sd 0:0:0:0: [sda] Write Protect is off
Jul 15 05:10:46.421417 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00
Jul 15 05:10:46.421485 kernel: sd 0:0:0:0: [sda] Cache data unavailable
Jul 15 05:10:46.421609 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through
Jul 15 05:10:46.424572 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input2
Jul 15 05:10:46.425703 kernel: libata version 3.00 loaded.
Jul 15 05:10:46.431719 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Jul 15 05:10:46.433705 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Jul 15 05:10:46.438764 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jul 15 05:10:46.442181 kernel: AES CTR mode by8 optimization enabled
Jul 15 05:10:46.442214 kernel: ata_piix 0000:00:07.1: version 2.13
Jul 15 05:10:46.443709 kernel: scsi host1: ata_piix
Jul 15 05:10:46.446085 kernel: scsi host2: ata_piix
Jul 15 05:10:46.446177 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14 lpm-pol 0
Jul 15 05:10:46.446187 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15 lpm-pol 0
Jul 15 05:10:46.485381 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT.
Jul 15 05:10:46.490579 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM.
Jul 15 05:10:46.495947 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM.
Jul 15 05:10:46.500211 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A.
Jul 15 05:10:46.500334 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A.
Jul 15 05:10:46.501046 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Jul 15 05:10:46.543707 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Jul 15 05:10:46.557706 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Jul 15 05:10:46.614706 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33
Jul 15 05:10:46.620717 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5
Jul 15 05:10:46.650925 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray
Jul 15 05:10:46.651286 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jul 15 05:10:46.659725 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0
Jul 15 05:10:46.965480 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Jul 15 05:10:46.965996 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Jul 15 05:10:46.966159 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Jul 15 05:10:46.966511 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Jul 15 05:10:46.967547 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Jul 15 05:10:46.985679 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Jul 15 05:10:47.609000 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Jul 15 05:10:47.609261 disk-uuid[637]: The operation has completed successfully.
Jul 15 05:10:48.020180 systemd[1]: disk-uuid.service: Deactivated successfully.
Jul 15 05:10:48.020240 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Jul 15 05:10:48.021072 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Jul 15 05:10:48.047668 sh[678]: Success
Jul 15 05:10:48.065707 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jul 15 05:10:48.065756 kernel: device-mapper: uevent: version 1.0.3
Jul 15 05:10:48.068047 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Jul 15 05:10:48.073712 kernel: device-mapper: verity: sha256 using shash "sha256-avx2"
Jul 15 05:10:48.250234 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Jul 15 05:10:48.251475 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Jul 15 05:10:48.266073 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Jul 15 05:10:48.279626 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay'
Jul 15 05:10:48.279671 kernel: BTRFS: device fsid eb96c768-dac4-4ca9-ae1d-82815d4ce00b devid 1 transid 36 /dev/mapper/usr (254:0) scanned by mount (690)
Jul 15 05:10:48.282134 kernel: BTRFS info (device dm-0): first mount of filesystem eb96c768-dac4-4ca9-ae1d-82815d4ce00b
Jul 15 05:10:48.282165 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Jul 15 05:10:48.282173 kernel: BTRFS info (device dm-0): using free-space-tree
Jul 15 05:10:48.292079 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Jul 15 05:10:48.292447 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Jul 15 05:10:48.293062 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments...
Jul 15 05:10:48.294774 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Jul 15 05:10:48.327713 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (713)
Jul 15 05:10:48.329754 kernel: BTRFS info (device sda6): first mount of filesystem 86e7a055-b4ff-48a6-9a0a-c301ff74862f
Jul 15 05:10:48.329782 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Jul 15 05:10:48.331303 kernel: BTRFS info (device sda6): using free-space-tree
Jul 15 05:10:48.348718 kernel: BTRFS info (device sda6): last unmount of filesystem 86e7a055-b4ff-48a6-9a0a-c301ff74862f
Jul 15 05:10:48.351765 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Jul 15 05:10:48.353571 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Jul 15 05:10:48.381203 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments.
Jul 15 05:10:48.382596 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Jul 15 05:10:48.485505 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jul 15 05:10:48.487090 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Jul 15 05:10:48.490723 ignition[732]: Ignition 2.21.0
Jul 15 05:10:48.490730 ignition[732]: Stage: fetch-offline
Jul 15 05:10:48.490748 ignition[732]: no configs at "/usr/lib/ignition/base.d"
Jul 15 05:10:48.490754 ignition[732]: no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Jul 15 05:10:48.490831 ignition[732]: parsed url from cmdline: ""
Jul 15 05:10:48.490833 ignition[732]: no config URL provided
Jul 15 05:10:48.490836 ignition[732]: reading system config file "/usr/lib/ignition/user.ign"
Jul 15 05:10:48.490840 ignition[732]: no config at "/usr/lib/ignition/user.ign"
Jul 15 05:10:48.491207 ignition[732]: config successfully fetched
Jul 15 05:10:48.491227 ignition[732]: parsing config with SHA512: 841e7961b867a58b72cb6ef209ff225ef5c8ecf97cb4be2298e7e6b1f05a37cfb669bf9f3147ed103c482ea252b80a691fa840a0be86bff7d901a3505362df35
Jul 15 05:10:48.494678 unknown[732]: fetched base config from "system"
Jul 15 05:10:48.494686 unknown[732]: fetched user config from "vmware"
Jul 15 05:10:48.494969 ignition[732]: fetch-offline: fetch-offline passed
Jul 15 05:10:48.495006 ignition[732]: Ignition finished successfully
Jul 15 05:10:48.496158 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Jul 15 05:10:48.511743 systemd-networkd[884]: lo: Link UP
Jul 15 05:10:48.511749 systemd-networkd[884]: lo: Gained carrier
Jul 15 05:10:48.512706 systemd-networkd[884]: Enumeration completed
Jul 15 05:10:48.512934 systemd[1]: Started systemd-networkd.service - Network Configuration.
Jul 15 05:10:48.513100 systemd[1]: Reached target network.target - Network.
Jul 15 05:10:48.513208 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json).
Jul 15 05:10:48.513677 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Jul 15 05:10:48.516925 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated
Jul 15 05:10:48.517030 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps
Jul 15 05:10:48.513700 systemd-networkd[884]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network.
Jul 15 05:10:48.516648 systemd-networkd[884]: ens192: Link UP
Jul 15 05:10:48.516650 systemd-networkd[884]: ens192: Gained carrier
Jul 15 05:10:48.533860 ignition[888]: Ignition 2.21.0
Jul 15 05:10:48.534139 ignition[888]: Stage: kargs
Jul 15 05:10:48.534307 ignition[888]: no configs at "/usr/lib/ignition/base.d"
Jul 15 05:10:48.534422 ignition[888]: no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Jul 15 05:10:48.535241 ignition[888]: kargs: kargs passed
Jul 15 05:10:48.535290 ignition[888]: Ignition finished successfully
Jul 15 05:10:48.536826 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Jul 15 05:10:48.537583 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Jul 15 05:10:48.553054 ignition[896]: Ignition 2.21.0
Jul 15 05:10:48.553064 ignition[896]: Stage: disks
Jul 15 05:10:48.553234 ignition[896]: no configs at "/usr/lib/ignition/base.d"
Jul 15 05:10:48.553245 ignition[896]: no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Jul 15 05:10:48.553995 ignition[896]: disks: disks passed
Jul 15 05:10:48.554027 ignition[896]: Ignition finished successfully
Jul 15 05:10:48.554707 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Jul 15 05:10:48.555143 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Jul 15 05:10:48.555279 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Jul 15 05:10:48.555463 systemd[1]: Reached target local-fs.target - Local File Systems.
Jul 15 05:10:48.555644 systemd[1]: Reached target sysinit.target - System Initialization.
Jul 15 05:10:48.555835 systemd[1]: Reached target basic.target - Basic System.
Jul 15 05:10:48.556514 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Jul 15 05:10:48.610393 systemd-fsck[904]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
Jul 15 05:10:48.612027 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Jul 15 05:10:48.612770 systemd[1]: Mounting sysroot.mount - /sysroot...
Jul 15 05:10:48.726706 kernel: EXT4-fs (sda9): mounted filesystem 277c3938-5262-4ab1-8fa3-62fde82f8257 r/w with ordered data mode. Quota mode: none.
Jul 15 05:10:48.727557 systemd[1]: Mounted sysroot.mount - /sysroot.
Jul 15 05:10:48.728150 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Jul 15 05:10:48.729743 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Jul 15 05:10:48.731742 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Jul 15 05:10:48.732315 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Jul 15 05:10:48.732528 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Jul 15 05:10:48.732546 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Jul 15 05:10:48.741558 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Jul 15 05:10:48.742613 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Jul 15 05:10:48.751715 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (912)
Jul 15 05:10:48.755837 kernel: BTRFS info (device sda6): first mount of filesystem 86e7a055-b4ff-48a6-9a0a-c301ff74862f
Jul 15 05:10:48.755872 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Jul 15 05:10:48.755881 kernel: BTRFS info (device sda6): using free-space-tree
Jul 15 05:10:48.764155 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jul 15 05:10:48.785051 systemd-resolved[318]: Detected conflict on linux IN A 139.178.70.102
Jul 15 05:10:48.785061 systemd-resolved[318]: Hostname conflict, changing published hostname from 'linux' to 'linux7'.
Jul 15 05:10:48.788202 initrd-setup-root[936]: cut: /sysroot/etc/passwd: No such file or directory
Jul 15 05:10:48.791480 initrd-setup-root[943]: cut: /sysroot/etc/group: No such file or directory
Jul 15 05:10:48.794085 initrd-setup-root[950]: cut: /sysroot/etc/shadow: No such file or directory
Jul 15 05:10:48.797041 initrd-setup-root[957]: cut: /sysroot/etc/gshadow: No such file or directory
Jul 15 05:10:48.884670 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Jul 15 05:10:48.885544 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Jul 15 05:10:48.886771 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Jul 15 05:10:48.895708 kernel: BTRFS info (device sda6): last unmount of filesystem 86e7a055-b4ff-48a6-9a0a-c301ff74862f
Jul 15 05:10:48.910653 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Jul 15 05:10:48.913464 ignition[1025]: INFO : Ignition 2.21.0
Jul 15 05:10:48.913464 ignition[1025]: INFO : Stage: mount
Jul 15 05:10:48.914137 ignition[1025]: INFO : no configs at "/usr/lib/ignition/base.d"
Jul 15 05:10:48.914137 ignition[1025]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Jul 15 05:10:48.915794 ignition[1025]: INFO : mount: mount passed
Jul 15 05:10:48.915911 ignition[1025]: INFO : Ignition finished successfully
Jul 15 05:10:48.916794 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Jul 15 05:10:48.917438 systemd[1]: Starting ignition-files.service - Ignition (files)...
Jul 15 05:10:49.277987 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Jul 15 05:10:49.278962 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Jul 15 05:10:49.370442 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1037)
Jul 15 05:10:49.370478 kernel: BTRFS info (device sda6): first mount of filesystem 86e7a055-b4ff-48a6-9a0a-c301ff74862f
Jul 15 05:10:49.372148 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Jul 15 05:10:49.372199 kernel: BTRFS info (device sda6): using free-space-tree
Jul 15 05:10:49.378129 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jul 15 05:10:49.397923 ignition[1054]: INFO : Ignition 2.21.0
Jul 15 05:10:49.397923 ignition[1054]: INFO : Stage: files
Jul 15 05:10:49.397923 ignition[1054]: INFO : no configs at "/usr/lib/ignition/base.d"
Jul 15 05:10:49.397923 ignition[1054]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Jul 15 05:10:49.397923 ignition[1054]: DEBUG : files: compiled without relabeling support, skipping
Jul 15 05:10:49.399369 ignition[1054]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Jul 15 05:10:49.399369 ignition[1054]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Jul 15 05:10:49.401930 ignition[1054]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Jul 15 05:10:49.402252 ignition[1054]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Jul 15 05:10:49.402499 unknown[1054]: wrote ssh authorized keys file for user: core
Jul 15 05:10:49.402826 ignition[1054]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Jul 15 05:10:49.404498 ignition[1054]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Jul 15 05:10:49.404498 ignition[1054]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
Jul 15 05:10:49.439599 ignition[1054]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Jul 15 05:10:49.570168 ignition[1054]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Jul 15 05:10:49.570168 ignition[1054]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Jul 15 05:10:49.570168 ignition[1054]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Jul 15 05:10:49.570168 ignition[1054]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Jul 15 05:10:49.570168 ignition[1054]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Jul 15 05:10:49.570168 ignition[1054]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Jul 15 05:10:49.569982 systemd-networkd[884]: ens192: Gained IPv6LL
Jul 15 05:10:49.571459 ignition[1054]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Jul 15 05:10:49.571459 ignition[1054]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jul 15 05:10:49.571459 ignition[1054]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jul 15 05:10:49.571459 ignition[1054]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Jul 15 05:10:49.572151 ignition[1054]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Jul 15 05:10:49.572151 ignition[1054]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Jul 15 05:10:49.573861 ignition[1054]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Jul 15 05:10:49.574091 ignition[1054]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Jul 15 05:10:49.574091 ignition[1054]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-x86-64.raw: attempt #1
Jul 15 05:10:50.088005 ignition[1054]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Jul 15 05:10:50.570801 ignition[1054]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Jul 15 05:10:50.570801 ignition[1054]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Jul 15 05:10:50.582289 ignition[1054]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Jul 15 05:10:50.582526 ignition[1054]: INFO : files: op(c): [started] processing unit "prepare-helm.service"
Jul 15 05:10:50.590238 ignition[1054]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jul 15 05:10:50.592878 ignition[1054]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jul 15 05:10:50.592878 ignition[1054]: INFO : files: op(c): [finished] processing unit "prepare-helm.service"
Jul 15 05:10:50.593285 ignition[1054]: INFO : files: op(e): [started] processing unit "coreos-metadata.service"
Jul 15 05:10:50.593285 ignition[1054]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Jul 15 05:10:50.593285 ignition[1054]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Jul 15 05:10:50.593285 ignition[1054]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service"
Jul 15 05:10:50.593285 ignition[1054]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service"
Jul 15 05:10:51.091986 ignition[1054]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service"
Jul 15 05:10:51.094629 ignition[1054]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Jul 15 05:10:51.095046 ignition[1054]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service"
Jul 15 05:10:51.095330 ignition[1054]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service"
Jul 15 05:10:51.095330 ignition[1054]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service"
Jul 15 05:10:51.095995 ignition[1054]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json"
Jul 15 05:10:51.095995 ignition[1054]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json"
Jul 15 05:10:51.095995 ignition[1054]: INFO : files: files passed
Jul 15 05:10:51.097118 ignition[1054]: INFO : Ignition finished successfully
Jul 15 05:10:51.097546 systemd[1]: Finished ignition-files.service - Ignition (files).
Jul 15 05:10:51.098377 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Jul 15 05:10:51.099825 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Jul 15 05:10:51.116706 initrd-setup-root-after-ignition[1084]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jul 15 05:10:51.116706 initrd-setup-root-after-ignition[1084]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Jul 15 05:10:51.117890 initrd-setup-root-after-ignition[1088]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jul 15 05:10:51.118514 systemd[1]: ignition-quench.service: Deactivated successfully.
Jul 15 05:10:51.118587 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Jul 15 05:10:51.118934 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Jul 15 05:10:51.119427 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Jul 15 05:10:51.119998 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Jul 15 05:10:51.139995 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jul 15 05:10:51.140062 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Jul 15 05:10:51.140368 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Jul 15 05:10:51.140613 systemd[1]: Reached target initrd.target - Initrd Default Target.
Jul 15 05:10:51.140841 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Jul 15 05:10:51.141327 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Jul 15 05:10:51.156879 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jul 15 05:10:51.157684 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Jul 15 05:10:51.168658 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Jul 15 05:10:51.168866 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Jul 15 05:10:51.169098 systemd[1]: Stopped target timers.target - Timer Units.
Jul 15 05:10:51.169306 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jul 15 05:10:51.169378 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jul 15 05:10:51.169799 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Jul 15 05:10:51.169950 systemd[1]: Stopped target basic.target - Basic System.
Jul 15 05:10:51.170135 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Jul 15 05:10:51.170334 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Jul 15 05:10:51.170548 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Jul 15 05:10:51.170776 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Jul 15 05:10:51.170999 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Jul 15 05:10:51.171203 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Jul 15 05:10:51.171428 systemd[1]: Stopped target sysinit.target - System Initialization.
Jul 15 05:10:51.171637 systemd[1]: Stopped target local-fs.target - Local File Systems.
Jul 15 05:10:51.171852 systemd[1]: Stopped target swap.target - Swaps.
Jul 15 05:10:51.172013 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jul 15 05:10:51.172082 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Jul 15 05:10:51.172343 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Jul 15 05:10:51.172580 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jul 15 05:10:51.172791 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Jul 15 05:10:51.172841 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jul 15 05:10:51.173020 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jul 15 05:10:51.173081 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Jul 15 05:10:51.173351 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Jul 15 05:10:51.173413 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Jul 15 05:10:51.173663 systemd[1]: Stopped target paths.target - Path Units.
Jul 15 05:10:51.173803 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jul 15 05:10:51.177713 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jul 15 05:10:51.177890 systemd[1]: Stopped target slices.target - Slice Units.
Jul 15 05:10:51.178097 systemd[1]: Stopped target sockets.target - Socket Units.
Jul 15 05:10:51.178304 systemd[1]: iscsid.socket: Deactivated successfully.
Jul 15 05:10:51.178356 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Jul 15 05:10:51.178613 systemd[1]: iscsiuio.socket: Deactivated successfully.
Jul 15 05:10:51.178679 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jul 15 05:10:51.178921 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Jul 15 05:10:51.178989 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Jul 15 05:10:51.179295 systemd[1]: ignition-files.service: Deactivated successfully.
Jul 15 05:10:51.179380 systemd[1]: Stopped ignition-files.service - Ignition (files).
Jul 15 05:10:51.180132 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Jul 15 05:10:51.180233 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jul 15 05:10:51.180321 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Jul 15 05:10:51.181755 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Jul 15 05:10:51.181921 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jul 15 05:10:51.181997 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Jul 15 05:10:51.182209 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jul 15 05:10:51.182291 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Jul 15 05:10:51.186467 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jul 15 05:10:51.186702 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Jul 15 05:10:51.195760 ignition[1109]: INFO : Ignition 2.21.0
Jul 15 05:10:51.196445 ignition[1109]: INFO : Stage: umount
Jul 15 05:10:51.196596 ignition[1109]: INFO : no configs at "/usr/lib/ignition/base.d"
Jul 15 05:10:51.196596 ignition[1109]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Jul 15 05:10:51.197572 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Jul 15 05:10:51.199481 ignition[1109]: INFO : umount: umount passed
Jul 15 05:10:51.199874 ignition[1109]: INFO : Ignition finished successfully
Jul 15 05:10:51.201153 systemd[1]: ignition-mount.service: Deactivated successfully.
Jul 15 05:10:51.201235 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Jul 15 05:10:51.201514 systemd[1]: Stopped target network.target - Network.
Jul 15 05:10:51.201624 systemd[1]: ignition-disks.service: Deactivated successfully.
Jul 15 05:10:51.201652 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Jul 15 05:10:51.201803 systemd[1]: ignition-kargs.service: Deactivated successfully.
Jul 15 05:10:51.201825 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Jul 15 05:10:51.201973 systemd[1]: ignition-setup.service: Deactivated successfully.
Jul 15 05:10:51.201993 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Jul 15 05:10:51.202147 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Jul 15 05:10:51.202168 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Jul 15 05:10:51.202373 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Jul 15 05:10:51.202673 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Jul 15 05:10:51.207779 systemd[1]: systemd-resolved.service: Deactivated successfully.
Jul 15 05:10:51.208024 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Jul 15 05:10:51.209584 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Jul 15 05:10:51.209891 systemd[1]: systemd-networkd.service: Deactivated successfully.
Jul 15 05:10:51.210091 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Jul 15 05:10:51.210966 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Jul 15 05:10:51.211420 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Jul 15 05:10:51.211632 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Jul 15 05:10:51.211653 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Jul 15 05:10:51.212276 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Jul 15 05:10:51.212524 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Jul 15 05:10:51.212548 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jul 15 05:10:51.212682 systemd[1]: afterburn-network-kargs.service: Deactivated successfully.
Jul 15 05:10:51.212753 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments.
Jul 15 05:10:51.212877 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jul 15 05:10:51.212899 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Jul 15 05:10:51.213166 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jul 15 05:10:51.213187 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Jul 15 05:10:51.213576 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jul 15 05:10:51.213597 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Jul 15 05:10:51.214045 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jul 15 05:10:51.215620 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jul 15 05:10:51.215657 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Jul 15 05:10:51.224594 systemd[1]: network-cleanup.service: Deactivated successfully.
Jul 15 05:10:51.224674 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Jul 15 05:10:51.225103 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jul 15 05:10:51.225189 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jul 15 05:10:51.225567 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jul 15 05:10:51.225599 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Jul 15 05:10:51.225734 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jul 15 05:10:51.225757 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Jul 15 05:10:51.225921 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jul 15 05:10:51.225946 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Jul 15 05:10:51.226224 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jul 15 05:10:51.226255 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Jul 15 05:10:51.226530 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Jul 15 05:10:51.226558 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jul 15 05:10:51.227340 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Jul 15 05:10:51.227456 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Jul 15 05:10:51.227484 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Jul 15 05:10:51.227675 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jul 15 05:10:51.227767 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jul 15 05:10:51.228054 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jul 15 05:10:51.228077 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jul 15 05:10:51.229276 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Jul 15 05:10:51.229305 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Jul 15 05:10:51.229326 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Jul 15 05:10:51.238292 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jul 15 05:10:51.238352 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Jul 15 05:10:51.338391 systemd[1]: sysroot-boot.service: Deactivated successfully.
Jul 15 05:10:51.338475 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Jul 15 05:10:51.338863 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Jul 15 05:10:51.339084 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Jul 15 05:10:51.339116 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Jul 15 05:10:51.339739 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Jul 15 05:10:51.350093 systemd[1]: Switching root.
Jul 15 05:10:51.383912 systemd-journald[243]: Journal stopped
Jul 15 05:10:52.971150 systemd-journald[243]: Received SIGTERM from PID 1 (systemd).
Jul 15 05:10:52.971179 kernel: SELinux: policy capability network_peer_controls=1
Jul 15 05:10:52.971187 kernel: SELinux: policy capability open_perms=1
Jul 15 05:10:52.971193 kernel: SELinux: policy capability extended_socket_class=1
Jul 15 05:10:52.971199 kernel: SELinux: policy capability always_check_network=0
Jul 15 05:10:52.971206 kernel: SELinux: policy capability cgroup_seclabel=1
Jul 15 05:10:52.971212 kernel: SELinux: policy capability nnp_nosuid_transition=1
Jul 15 05:10:52.971218 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Jul 15 05:10:52.971224 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Jul 15 05:10:52.971230 kernel: SELinux: policy capability userspace_initial_context=0
Jul 15 05:10:52.971235 kernel: audit: type=1403 audit(1752556252.345:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jul 15 05:10:52.971242 systemd[1]: Successfully loaded SELinux policy in 58.226ms.
Jul 15 05:10:52.971250 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 3.767ms.
Jul 15 05:10:52.971258 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Jul 15 05:10:52.971265 systemd[1]: Detected virtualization vmware.
Jul 15 05:10:52.971271 systemd[1]: Detected architecture x86-64.
Jul 15 05:10:52.971279 systemd[1]: Detected first boot.
Jul 15 05:10:52.971286 systemd[1]: Initializing machine ID from random generator.
Jul 15 05:10:52.971293 zram_generator::config[1152]: No configuration found.
Jul 15 05:10:52.971660 kernel: vmw_vmci 0000:00:07.7: Using capabilities 0xc
Jul 15 05:10:52.971674 kernel: Guest personality initialized and is active
Jul 15 05:10:52.971681 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
Jul 15 05:10:52.971687 kernel: Initialized host personality
Jul 15 05:10:52.971707 kernel: NET: Registered PF_VSOCK protocol family
Jul 15 05:10:52.971716 systemd[1]: Populated /etc with preset unit settings.
Jul 15 05:10:52.971725 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Jul 15 05:10:52.971732 systemd[1]: COREOS_CUSTOM_PUBLIC_IPV4=$(ip addr show ens192 | grep -v "inet 10." | grep -Po "inet \K[\d.]+")" > ${OUTPUT}"
Jul 15 05:10:52.971739 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Jul 15 05:10:52.971746 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Jul 15 05:10:52.971752 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Jul 15 05:10:52.971761 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jul 15 05:10:52.971768 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Jul 15 05:10:52.971775 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Jul 15 05:10:52.971782 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Jul 15 05:10:52.971789 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Jul 15 05:10:52.971796 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Jul 15 05:10:52.971802 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Jul 15 05:10:52.971811 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Jul 15 05:10:52.971818 systemd[1]: Created slice user.slice - User and Session Slice.
Jul 15 05:10:52.971825 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jul 15 05:10:52.971833 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jul 15 05:10:52.971841 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Jul 15 05:10:52.971847 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Jul 15 05:10:52.971854 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Jul 15 05:10:52.971862 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Jul 15 05:10:52.971870 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Jul 15 05:10:52.971877 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jul 15 05:10:52.971884 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Jul 15 05:10:52.971891 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Jul 15 05:10:52.971897 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Jul 15 05:10:52.971904 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Jul 15 05:10:52.971911 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Jul 15 05:10:52.971918 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Jul 15 05:10:52.972353 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Jul 15 05:10:52.972363 systemd[1]: Reached target slices.target - Slice Units.
Jul 15 05:10:52.972371 systemd[1]: Reached target swap.target - Swaps.
Jul 15 05:10:52.972378 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Jul 15 05:10:52.972385 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Jul 15 05:10:52.972395 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Jul 15 05:10:52.972402 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Jul 15 05:10:52.972409 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Jul 15 05:10:52.972416 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Jul 15 05:10:52.972423 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Jul 15 05:10:52.972430 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Jul 15 05:10:52.972437 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Jul 15 05:10:52.972444 systemd[1]: Mounting media.mount - External Media Directory...
Jul 15 05:10:52.972453 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 15 05:10:52.972460 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Jul 15 05:10:52.972467 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Jul 15 05:10:52.972474 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Jul 15 05:10:52.972481 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jul 15 05:10:52.972489 systemd[1]: Reached target machines.target - Containers.
Jul 15 05:10:52.972496 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Jul 15 05:10:52.972503 systemd[1]: Starting ignition-delete-config.service - Ignition (delete config)...
Jul 15 05:10:52.972511 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Jul 15 05:10:52.972518 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Jul 15 05:10:52.972525 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jul 15 05:10:52.972532 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Jul 15 05:10:52.972539 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jul 15 05:10:52.972548 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Jul 15 05:10:52.972555 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jul 15 05:10:52.972562 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Jul 15 05:10:52.972570 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Jul 15 05:10:52.972577 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Jul 15 05:10:52.972584 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Jul 15 05:10:52.972591 systemd[1]: Stopped systemd-fsck-usr.service.
Jul 15 05:10:52.972599 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Jul 15 05:10:52.972606 systemd[1]: Starting systemd-journald.service - Journal Service...
Jul 15 05:10:52.972613 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Jul 15 05:10:52.972620 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Jul 15 05:10:52.972628 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Jul 15 05:10:52.972636 kernel: fuse: init (API version 7.41)
Jul 15 05:10:52.972642 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Jul 15 05:10:52.972650 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Jul 15 05:10:52.972657 systemd[1]: verity-setup.service: Deactivated successfully.
Jul 15 05:10:52.972663 systemd[1]: Stopped verity-setup.service.
Jul 15 05:10:52.972671 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 15 05:10:52.972678 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Jul 15 05:10:52.972685 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Jul 15 05:10:52.972705 systemd[1]: Mounted media.mount - External Media Directory.
Jul 15 05:10:52.972715 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Jul 15 05:10:52.972722 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Jul 15 05:10:52.972729 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Jul 15 05:10:52.972736 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Jul 15 05:10:52.972743 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jul 15 05:10:52.972749 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Jul 15 05:10:52.972757 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jul 15 05:10:52.972770 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jul 15 05:10:52.972777 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jul 15 05:10:52.972785 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jul 15 05:10:52.972792 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jul 15 05:10:52.972800 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Jul 15 05:10:52.972807 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Jul 15 05:10:52.972814 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Jul 15 05:10:52.972821 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Jul 15 05:10:52.972828 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Jul 15 05:10:52.972836 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Jul 15 05:10:52.972859 systemd-journald[1242]: Collecting audit messages is disabled.
Jul 15 05:10:52.972881 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Jul 15 05:10:52.972889 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Jul 15 05:10:52.972896 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Jul 15 05:10:52.972903 kernel: ACPI: bus type drm_connector registered
Jul 15 05:10:52.972911 systemd[1]: Reached target network-pre.target - Preparation for Network.
Jul 15 05:10:52.972919 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Jul 15 05:10:52.972926 systemd[1]: Reached target local-fs.target - Local File Systems.
Jul 15 05:10:52.972933 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Jul 15 05:10:52.972942 systemd-journald[1242]: Journal started
Jul 15 05:10:52.972958 systemd-journald[1242]: Runtime Journal (/run/log/journal/d94cb292aa9b4d51b16e02c30365b3b9) is 4.8M, max 38.8M, 34M free.
Jul 15 05:10:52.757571 systemd[1]: Queued start job for default target multi-user.target.
Jul 15 05:10:52.769874 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Jul 15 05:10:52.770116 systemd[1]: systemd-journald.service: Deactivated successfully.
Jul 15 05:10:52.973464 jq[1222]: true
Jul 15 05:10:52.974453 jq[1258]: true
Jul 15 05:10:52.976462 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Jul 15 05:10:52.976498 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jul 15 05:10:52.983702 kernel: loop: module loaded
Jul 15 05:10:52.989324 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Jul 15 05:10:52.989366 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jul 15 05:10:52.991937 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Jul 15 05:10:52.995958 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Jul 15 05:10:52.995998 systemd[1]: Started systemd-journald.service - Journal Service.
Jul 15 05:10:52.997505 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jul 15 05:10:52.998745 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Jul 15 05:10:52.999013 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jul 15 05:10:53.001476 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jul 15 05:10:53.002485 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Jul 15 05:10:53.012322 ignition[1272]: Ignition 2.21.0
Jul 15 05:10:53.014677 ignition[1272]: deleting config from guestinfo properties
Jul 15 05:10:53.022912 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Jul 15 05:10:53.023118 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Jul 15 05:10:53.024762 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Jul 15 05:10:53.025723 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Jul 15 05:10:53.026044 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Jul 15 05:10:53.026382 ignition[1272]: Successfully deleted config
Jul 15 05:10:53.029375 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Jul 15 05:10:53.034523 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Jul 15 05:10:53.035714 kernel: loop0: detected capacity change from 0 to 221472
Jul 15 05:10:53.036805 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Jul 15 05:10:53.037470 systemd[1]: Finished ignition-delete-config.service - Ignition (delete config).
Jul 15 05:10:53.043568 systemd-journald[1242]: Time spent on flushing to /var/log/journal/d94cb292aa9b4d51b16e02c30365b3b9 is 30.375ms for 1769 entries.
Jul 15 05:10:53.043568 systemd-journald[1242]: System Journal (/var/log/journal/d94cb292aa9b4d51b16e02c30365b3b9) is 8M, max 584.8M, 576.8M free.
Jul 15 05:10:53.090309 systemd-journald[1242]: Received client request to flush runtime journal.
Jul 15 05:10:53.091420 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Jul 15 05:10:53.094084 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Jul 15 05:10:53.106855 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Jul 15 05:10:53.111806 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Jul 15 05:10:53.120205 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Jul 15 05:10:53.122795 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Jul 15 05:10:53.131803 kernel: loop1: detected capacity change from 0 to 146488
Jul 15 05:10:53.163272 systemd-tmpfiles[1320]: ACLs are not supported, ignoring.
Jul 15 05:10:53.163285 systemd-tmpfiles[1320]: ACLs are not supported, ignoring.
Jul 15 05:10:53.168639 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jul 15 05:10:53.193857 kernel: loop2: detected capacity change from 0 to 2960
Jul 15 05:10:53.222714 kernel: loop3: detected capacity change from 0 to 114000
Jul 15 05:10:53.283737 kernel: loop4: detected capacity change from 0 to 221472
Jul 15 05:10:53.321816 kernel: loop5: detected capacity change from 0 to 146488
Jul 15 05:10:53.622714 kernel: loop6: detected capacity change from 0 to 2960
Jul 15 05:10:53.752710 kernel: loop7: detected capacity change from 0 to 114000
Jul 15 05:10:53.771319 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Jul 15 05:10:53.787504 (sd-merge)[1326]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-vmware'.
Jul 15 05:10:53.788417 (sd-merge)[1326]: Merged extensions into '/usr'.
Jul 15 05:10:53.793379 systemd[1]: Reload requested from client PID 1281 ('systemd-sysext') (unit systemd-sysext.service)...
Jul 15 05:10:53.793499 systemd[1]: Reloading...
Jul 15 05:10:53.840714 zram_generator::config[1349]: No configuration found.
Jul 15 05:10:53.919409 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jul 15 05:10:53.929484 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Jul 15 05:10:53.977901 systemd[1]: Reloading finished in 183 ms.
Jul 15 05:10:53.990291 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Jul 15 05:10:53.990784 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Jul 15 05:10:53.998053 systemd[1]: Starting ensure-sysext.service... Jul 15 05:10:54.000320 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jul 15 05:10:54.005816 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jul 15 05:10:54.015970 systemd[1]: Reload requested from client PID 1408 ('systemctl') (unit ensure-sysext.service)... Jul 15 05:10:54.015983 systemd[1]: Reloading... Jul 15 05:10:54.017053 systemd-tmpfiles[1409]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Jul 15 05:10:54.017077 systemd-tmpfiles[1409]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Jul 15 05:10:54.017288 systemd-tmpfiles[1409]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jul 15 05:10:54.017465 systemd-tmpfiles[1409]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Jul 15 05:10:54.018032 systemd-tmpfiles[1409]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Jul 15 05:10:54.018218 systemd-tmpfiles[1409]: ACLs are not supported, ignoring. Jul 15 05:10:54.018257 systemd-tmpfiles[1409]: ACLs are not supported, ignoring. Jul 15 05:10:54.022286 systemd-tmpfiles[1409]: Detected autofs mount point /boot during canonicalization of boot. Jul 15 05:10:54.022294 systemd-tmpfiles[1409]: Skipping /boot Jul 15 05:10:54.039385 systemd-tmpfiles[1409]: Detected autofs mount point /boot during canonicalization of boot. Jul 15 05:10:54.039396 systemd-tmpfiles[1409]: Skipping /boot Jul 15 05:10:54.067560 systemd-udevd[1410]: Using default interface naming scheme 'v255'. Jul 15 05:10:54.082730 zram_generator::config[1437]: No configuration found. Jul 15 05:10:54.188846 ldconfig[1273]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. 
Jul 15 05:10:54.220420 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jul 15 05:10:54.232626 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Jul 15 05:10:54.272332 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Jul 15 05:10:54.283704 kernel: ACPI: button: Power Button [PWRF] Jul 15 05:10:54.283763 kernel: mousedev: PS/2 mouse device common for all mice Jul 15 05:10:54.302091 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Jul 15 05:10:54.302364 systemd[1]: Reloading finished in 285 ms. Jul 15 05:10:54.308838 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jul 15 05:10:54.309602 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jul 15 05:10:54.310010 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jul 15 05:10:54.323067 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jul 15 05:10:54.325801 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jul 15 05:10:54.327975 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jul 15 05:10:54.330277 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jul 15 05:10:54.339378 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jul 15 05:10:54.340935 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jul 15 05:10:54.348794 systemd[1]: Starting systemd-userdbd.service - User Database Manager... 
Jul 15 05:10:54.349893 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jul 15 05:10:54.352186 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jul 15 05:10:54.353530 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jul 15 05:10:54.358357 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jul 15 05:10:54.358524 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jul 15 05:10:54.358589 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jul 15 05:10:54.358649 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jul 15 05:10:54.365505 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jul 15 05:10:54.366823 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jul 15 05:10:54.366884 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jul 15 05:10:54.366942 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jul 15 05:10:54.369144 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). 
Jul 15 05:10:54.370862 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jul 15 05:10:54.371065 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jul 15 05:10:54.371135 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jul 15 05:10:54.371241 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jul 15 05:10:54.372823 systemd[1]: Finished ensure-sysext.service. Jul 15 05:10:54.374824 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Jul 15 05:10:54.392016 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jul 15 05:10:54.394908 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jul 15 05:10:54.395303 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jul 15 05:10:54.413559 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Jul 15 05:10:54.414713 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jul 15 05:10:54.417207 systemd[1]: modprobe@loop.service: Deactivated successfully. Jul 15 05:10:54.421262 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jul 15 05:10:54.421892 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jul 15 05:10:54.422139 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jul 15 05:10:54.422935 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). 
Jul 15 05:10:54.423129 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jul 15 05:10:54.426742 systemd[1]: modprobe@drm.service: Deactivated successfully. Jul 15 05:10:54.426916 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jul 15 05:10:54.430318 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jul 15 05:10:54.432180 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jul 15 05:10:54.442248 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jul 15 05:10:54.449218 augenrules[1574]: No rules Jul 15 05:10:54.450908 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jul 15 05:10:54.452578 systemd[1]: audit-rules.service: Deactivated successfully. Jul 15 05:10:54.453253 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jul 15 05:10:54.455141 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jul 15 05:10:54.468125 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jul 15 05:10:54.468360 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jul 15 05:10:54.512716 kernel: piix4_smbus 0000:00:07.3: SMBus Host Controller not enabled! Jul 15 05:10:54.526915 systemd-networkd[1532]: lo: Link UP Jul 15 05:10:54.526920 systemd-networkd[1532]: lo: Gained carrier Jul 15 05:10:54.528443 systemd-networkd[1532]: Enumeration completed Jul 15 05:10:54.529405 systemd[1]: Started systemd-networkd.service - Network Configuration. Jul 15 05:10:54.529652 systemd-networkd[1532]: ens192: Configuring with /etc/systemd/network/00-vmware.network. 
Jul 15 05:10:54.533390 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Jul 15 05:10:54.533548 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Jul 15 05:10:54.531775 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Jul 15 05:10:54.537847 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jul 15 05:10:54.540177 systemd-networkd[1532]: ens192: Link UP Jul 15 05:10:54.540317 systemd-networkd[1532]: ens192: Gained carrier Jul 15 05:10:54.540599 systemd-resolved[1533]: Positive Trust Anchors: Jul 15 05:10:54.540607 systemd-resolved[1533]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jul 15 05:10:54.540632 systemd-resolved[1533]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jul 15 05:10:54.542733 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Jul 15 05:10:54.542902 systemd[1]: Reached target time-set.target - System Time Set. Jul 15 05:10:54.543988 systemd-resolved[1533]: Defaulting to hostname 'linux'. Jul 15 05:10:54.545933 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jul 15 05:10:54.546073 systemd[1]: Reached target network.target - Network. Jul 15 05:10:54.546161 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jul 15 05:10:54.546272 systemd[1]: Reached target sysinit.target - System Initialization. 
Jul 15 05:10:54.546413 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jul 15 05:10:54.546531 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jul 15 05:10:54.546638 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Jul 15 05:10:54.546792 systemd-timesyncd[1543]: Network configuration changed, trying to establish connection. Jul 15 05:10:54.546830 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jul 15 05:10:54.546974 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jul 15 05:10:54.547086 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jul 15 05:10:54.547192 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jul 15 05:10:54.547210 systemd[1]: Reached target paths.target - Path Units. Jul 15 05:10:54.547297 systemd[1]: Reached target timers.target - Timer Units. Jul 15 05:10:54.548334 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jul 15 05:10:54.550201 systemd[1]: Starting docker.socket - Docker Socket for the API... Jul 15 05:10:54.551591 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Jul 15 05:10:54.552590 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Jul 15 05:10:54.552850 systemd[1]: Reached target ssh-access.target - SSH Access Available. Jul 15 05:10:54.555826 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jul 15 05:10:54.556128 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Jul 15 05:10:54.556793 systemd[1]: Listening on docker.socket - Docker Socket for the API. 
Jul 15 05:10:54.557521 systemd[1]: Reached target sockets.target - Socket Units. Jul 15 05:10:54.557722 systemd[1]: Reached target basic.target - Basic System. Jul 15 05:10:54.557936 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jul 15 05:10:54.557959 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jul 15 05:10:54.559061 systemd[1]: Starting containerd.service - containerd container runtime... Jul 15 05:10:54.560092 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jul 15 05:10:54.564584 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jul 15 05:10:54.566756 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jul 15 05:10:54.568802 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jul 15 05:10:54.568925 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jul 15 05:10:54.569646 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Jul 15 05:10:54.570786 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jul 15 05:10:54.574361 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jul 15 05:10:54.578648 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jul 15 05:10:54.582725 jq[1599]: false Jul 15 05:10:54.582915 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jul 15 05:10:54.586825 systemd[1]: Starting systemd-logind.service - User Login Management... Jul 15 05:10:54.587469 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). 
Jul 15 05:10:54.588036 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jul 15 05:10:54.590257 systemd[1]: Starting update-engine.service - Update Engine... Jul 15 05:10:54.593055 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jul 15 05:10:54.593538 google_oslogin_nss_cache[1601]: oslogin_cache_refresh[1601]: Refreshing passwd entry cache Jul 15 05:10:54.594731 oslogin_cache_refresh[1601]: Refreshing passwd entry cache Jul 15 05:10:54.599802 google_oslogin_nss_cache[1601]: oslogin_cache_refresh[1601]: Failure getting users, quitting Jul 15 05:10:54.599844 systemd[1]: Starting vgauthd.service - VGAuth Service for open-vm-tools... Jul 15 05:10:54.600203 oslogin_cache_refresh[1601]: Failure getting users, quitting Jul 15 05:10:54.600606 google_oslogin_nss_cache[1601]: oslogin_cache_refresh[1601]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jul 15 05:10:54.600606 google_oslogin_nss_cache[1601]: oslogin_cache_refresh[1601]: Refreshing group entry cache Jul 15 05:10:54.600220 oslogin_cache_refresh[1601]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jul 15 05:10:54.600250 oslogin_cache_refresh[1601]: Refreshing group entry cache Jul 15 05:10:54.601377 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Jul 15 05:10:54.602287 extend-filesystems[1600]: Found /dev/sda6 Jul 15 05:10:54.605289 google_oslogin_nss_cache[1601]: oslogin_cache_refresh[1601]: Failure getting groups, quitting Jul 15 05:10:54.605289 google_oslogin_nss_cache[1601]: oslogin_cache_refresh[1601]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. 
Jul 15 05:10:54.604610 oslogin_cache_refresh[1601]: Failure getting groups, quitting Jul 15 05:10:54.604618 oslogin_cache_refresh[1601]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jul 15 05:10:54.609906 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jul 15 05:10:54.611231 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jul 15 05:10:54.611379 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jul 15 05:10:54.611531 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Jul 15 05:10:54.611649 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Jul 15 05:10:54.613088 extend-filesystems[1600]: Found /dev/sda9 Jul 15 05:10:54.615811 extend-filesystems[1600]: Checking size of /dev/sda9 Jul 15 05:12:21.297324 systemd-timesyncd[1543]: Contacted time server 23.186.168.132:123 (0.flatcar.pool.ntp.org). Jul 15 05:12:21.297360 systemd-timesyncd[1543]: Initial clock synchronization to Tue 2025-07-15 05:12:21.297253 UTC. Jul 15 05:12:21.297839 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jul 15 05:12:21.298032 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jul 15 05:12:21.298926 systemd-resolved[1533]: Clock change detected. Flushing caches. Jul 15 05:12:21.299516 extend-filesystems[1600]: Old size kept for /dev/sda9 Jul 15 05:12:21.300654 systemd[1]: extend-filesystems.service: Deactivated successfully. Jul 15 05:12:21.301286 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jul 15 05:12:21.307730 jq[1609]: true Jul 15 05:12:21.318473 (ntainerd)[1622]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Jul 15 05:12:21.322364 systemd[1]: motdgen.service: Deactivated successfully. 
Jul 15 05:12:21.322772 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jul 15 05:12:21.326631 update_engine[1607]: I20250715 05:12:21.326572 1607 main.cc:92] Flatcar Update Engine starting Jul 15 05:12:21.334758 systemd[1]: Started vgauthd.service - VGAuth Service for open-vm-tools. Jul 15 05:12:21.337065 systemd[1]: Starting vmtoolsd.service - Service for virtual machines hosted on VMware... Jul 15 05:12:21.349601 jq[1637]: true Jul 15 05:12:21.372037 tar[1618]: linux-amd64/helm Jul 15 05:12:21.408036 systemd[1]: Started vmtoolsd.service - Service for virtual machines hosted on VMware. Jul 15 05:12:21.417124 dbus-daemon[1597]: [system] SELinux support is enabled Jul 15 05:12:21.417220 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jul 15 05:12:21.420096 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jul 15 05:12:21.420115 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jul 15 05:12:21.420260 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jul 15 05:12:21.420272 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jul 15 05:12:21.435328 systemd[1]: Started update-engine.service - Update Engine. Jul 15 05:12:21.437914 update_engine[1607]: I20250715 05:12:21.435979 1607 update_check_scheduler.cc:74] Next update check in 6m19s Jul 15 05:12:21.453851 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jul 15 05:12:21.456884 (udev-worker)[1450]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte. Jul 15 05:12:21.470189 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Jul 15 05:12:21.476634 systemd-logind[1606]: Watching system buttons on /dev/input/event2 (Power Button) Jul 15 05:12:21.476648 systemd-logind[1606]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Jul 15 05:12:21.476770 systemd-logind[1606]: New seat seat0. Jul 15 05:12:21.477123 systemd[1]: Started systemd-logind.service - User Login Management. Jul 15 05:12:21.496838 unknown[1645]: Pref_Init: Using '/etc/vmware-tools/vgauth.conf' as preferences filepath Jul 15 05:12:21.513885 unknown[1645]: Core dump limit set to -1 Jul 15 05:12:21.568558 bash[1680]: Updated "/home/core/.ssh/authorized_keys" Jul 15 05:12:21.570136 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jul 15 05:12:21.572409 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Jul 15 05:12:21.741268 locksmithd[1679]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jul 15 05:12:21.778350 sshd_keygen[1656]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jul 15 05:12:21.808444 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jul 15 05:12:21.819194 systemd[1]: Starting issuegen.service - Generate /run/issue... 
Jul 15 05:12:21.822988 containerd[1622]: time="2025-07-15T05:12:21Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jul 15 05:12:21.822988 containerd[1622]: time="2025-07-15T05:12:21.821127991Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Jul 15 05:12:21.834442 containerd[1622]: time="2025-07-15T05:12:21.834417768Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="6.277µs" Jul 15 05:12:21.835255 containerd[1622]: time="2025-07-15T05:12:21.835240319Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jul 15 05:12:21.835306 containerd[1622]: time="2025-07-15T05:12:21.835298119Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jul 15 05:12:21.835429 containerd[1622]: time="2025-07-15T05:12:21.835419444Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jul 15 05:12:21.835874 containerd[1622]: time="2025-07-15T05:12:21.835864526Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jul 15 05:12:21.835947 containerd[1622]: time="2025-07-15T05:12:21.835938899Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jul 15 05:12:21.836018 containerd[1622]: time="2025-07-15T05:12:21.836008210Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jul 15 05:12:21.836248 containerd[1622]: time="2025-07-15T05:12:21.836239549Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jul 15 
05:12:21.836416 containerd[1622]: time="2025-07-15T05:12:21.836403649Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jul 15 05:12:21.837366 containerd[1622]: time="2025-07-15T05:12:21.837354268Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jul 15 05:12:21.837416 containerd[1622]: time="2025-07-15T05:12:21.837406768Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jul 15 05:12:21.837657 containerd[1622]: time="2025-07-15T05:12:21.837648464Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jul 15 05:12:21.837734 containerd[1622]: time="2025-07-15T05:12:21.837725187Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jul 15 05:12:21.838036 containerd[1622]: time="2025-07-15T05:12:21.838025462Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jul 15 05:12:21.838655 containerd[1622]: time="2025-07-15T05:12:21.838642076Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jul 15 05:12:21.838693 containerd[1622]: time="2025-07-15T05:12:21.838685785Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jul 15 05:12:21.838790 containerd[1622]: time="2025-07-15T05:12:21.838780801Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jul 15 05:12:21.839043 
containerd[1622]: time="2025-07-15T05:12:21.839033065Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jul 15 05:12:21.839623 containerd[1622]: time="2025-07-15T05:12:21.839613085Z" level=info msg="metadata content store policy set" policy=shared Jul 15 05:12:21.843950 containerd[1622]: time="2025-07-15T05:12:21.843936481Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jul 15 05:12:21.844014 containerd[1622]: time="2025-07-15T05:12:21.844005322Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jul 15 05:12:21.845624 containerd[1622]: time="2025-07-15T05:12:21.845354871Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jul 15 05:12:21.845624 containerd[1622]: time="2025-07-15T05:12:21.845370366Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jul 15 05:12:21.845624 containerd[1622]: time="2025-07-15T05:12:21.845379755Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jul 15 05:12:21.845624 containerd[1622]: time="2025-07-15T05:12:21.845386419Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jul 15 05:12:21.845624 containerd[1622]: time="2025-07-15T05:12:21.845393418Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jul 15 05:12:21.845624 containerd[1622]: time="2025-07-15T05:12:21.845400004Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jul 15 05:12:21.845624 containerd[1622]: time="2025-07-15T05:12:21.845406675Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jul 15 05:12:21.845624 containerd[1622]: 
time="2025-07-15T05:12:21.845412268Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jul 15 05:12:21.845624 containerd[1622]: time="2025-07-15T05:12:21.845417778Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jul 15 05:12:21.845624 containerd[1622]: time="2025-07-15T05:12:21.845427851Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jul 15 05:12:21.845624 containerd[1622]: time="2025-07-15T05:12:21.845487427Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jul 15 05:12:21.845624 containerd[1622]: time="2025-07-15T05:12:21.845499370Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jul 15 05:12:21.845624 containerd[1622]: time="2025-07-15T05:12:21.845507832Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jul 15 05:12:21.845624 containerd[1622]: time="2025-07-15T05:12:21.845513868Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jul 15 05:12:21.845815 containerd[1622]: time="2025-07-15T05:12:21.845524952Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jul 15 05:12:21.845815 containerd[1622]: time="2025-07-15T05:12:21.845531130Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jul 15 05:12:21.845815 containerd[1622]: time="2025-07-15T05:12:21.845537886Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jul 15 05:12:21.845815 containerd[1622]: time="2025-07-15T05:12:21.845545355Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jul 15 05:12:21.845815 containerd[1622]: time="2025-07-15T05:12:21.845552615Z" level=info msg="loading plugin" 
id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jul 15 05:12:21.845815 containerd[1622]: time="2025-07-15T05:12:21.845558633Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jul 15 05:12:21.845815 containerd[1622]: time="2025-07-15T05:12:21.845564407Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jul 15 05:12:21.845815 containerd[1622]: time="2025-07-15T05:12:21.845601310Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jul 15 05:12:21.845815 containerd[1622]: time="2025-07-15T05:12:21.845610723Z" level=info msg="Start snapshots syncer" Jul 15 05:12:21.845997 containerd[1622]: time="2025-07-15T05:12:21.845987707Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jul 15 05:12:21.851723 systemd[1]: issuegen.service: Deactivated successfully. Jul 15 05:12:21.851881 systemd[1]: Finished issuegen.service - Generate /run/issue. 
Jul 15 05:12:21.852144 containerd[1622]: time="2025-07-15T05:12:21.852019661Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jul 15 05:12:21.853216 containerd[1622]: time="2025-07-15T05:12:21.852087404Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox 
type=io.containerd.podsandbox.controller.v1 Jul 15 05:12:21.853216 containerd[1622]: time="2025-07-15T05:12:21.852347434Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jul 15 05:12:21.853216 containerd[1622]: time="2025-07-15T05:12:21.852415202Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jul 15 05:12:21.853216 containerd[1622]: time="2025-07-15T05:12:21.852428813Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jul 15 05:12:21.853216 containerd[1622]: time="2025-07-15T05:12:21.852435672Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jul 15 05:12:21.853216 containerd[1622]: time="2025-07-15T05:12:21.852441629Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jul 15 05:12:21.853216 containerd[1622]: time="2025-07-15T05:12:21.852448486Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jul 15 05:12:21.853216 containerd[1622]: time="2025-07-15T05:12:21.852454817Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jul 15 05:12:21.853216 containerd[1622]: time="2025-07-15T05:12:21.852461086Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jul 15 05:12:21.853216 containerd[1622]: time="2025-07-15T05:12:21.852475674Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jul 15 05:12:21.853216 containerd[1622]: time="2025-07-15T05:12:21.852482623Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jul 15 05:12:21.853216 containerd[1622]: time="2025-07-15T05:12:21.852489143Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart 
type=io.containerd.monitor.container.v1 Jul 15 05:12:21.853216 containerd[1622]: time="2025-07-15T05:12:21.852509644Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jul 15 05:12:21.853216 containerd[1622]: time="2025-07-15T05:12:21.852518755Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jul 15 05:12:21.853416 containerd[1622]: time="2025-07-15T05:12:21.852523580Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jul 15 05:12:21.853416 containerd[1622]: time="2025-07-15T05:12:21.852529357Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jul 15 05:12:21.853416 containerd[1622]: time="2025-07-15T05:12:21.852533686Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jul 15 05:12:21.853416 containerd[1622]: time="2025-07-15T05:12:21.852538627Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jul 15 05:12:21.853416 containerd[1622]: time="2025-07-15T05:12:21.852551148Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jul 15 05:12:21.853416 containerd[1622]: time="2025-07-15T05:12:21.852561360Z" level=info msg="runtime interface created" Jul 15 05:12:21.853416 containerd[1622]: time="2025-07-15T05:12:21.852578972Z" level=info msg="created NRI interface" Jul 15 05:12:21.853416 containerd[1622]: time="2025-07-15T05:12:21.852599579Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jul 15 05:12:21.853416 containerd[1622]: time="2025-07-15T05:12:21.852616471Z" level=info msg="Connect containerd service" Jul 15 05:12:21.853416 containerd[1622]: 
time="2025-07-15T05:12:21.852632488Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jul 15 05:12:21.854490 containerd[1622]: time="2025-07-15T05:12:21.854477536Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jul 15 05:12:21.858656 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jul 15 05:12:21.866626 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jul 15 05:12:21.878580 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jul 15 05:12:21.881243 systemd[1]: Started getty@tty1.service - Getty on tty1. Jul 15 05:12:21.883308 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Jul 15 05:12:21.883757 systemd[1]: Reached target getty.target - Login Prompts. Jul 15 05:12:21.898919 tar[1618]: linux-amd64/LICENSE Jul 15 05:12:21.899129 tar[1618]: linux-amd64/README.md Jul 15 05:12:21.917770 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. 
Jul 15 05:12:22.005016 containerd[1622]: time="2025-07-15T05:12:22.004955163Z" level=info msg="Start subscribing containerd event" Jul 15 05:12:22.005313 containerd[1622]: time="2025-07-15T05:12:22.005097888Z" level=info msg="Start recovering state" Jul 15 05:12:22.005313 containerd[1622]: time="2025-07-15T05:12:22.005157664Z" level=info msg="Start event monitor" Jul 15 05:12:22.005313 containerd[1622]: time="2025-07-15T05:12:22.005166478Z" level=info msg="Start cni network conf syncer for default" Jul 15 05:12:22.005313 containerd[1622]: time="2025-07-15T05:12:22.005171809Z" level=info msg="Start streaming server" Jul 15 05:12:22.005313 containerd[1622]: time="2025-07-15T05:12:22.005176516Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jul 15 05:12:22.005313 containerd[1622]: time="2025-07-15T05:12:22.005180289Z" level=info msg="runtime interface starting up..." Jul 15 05:12:22.005313 containerd[1622]: time="2025-07-15T05:12:22.005183477Z" level=info msg="starting plugins..." Jul 15 05:12:22.005313 containerd[1622]: time="2025-07-15T05:12:22.005190598Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jul 15 05:12:22.005313 containerd[1622]: time="2025-07-15T05:12:22.004984798Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jul 15 05:12:22.005313 containerd[1622]: time="2025-07-15T05:12:22.005274148Z" level=info msg=serving... address=/run/containerd/containerd.sock Jul 15 05:12:22.005390 systemd[1]: Started containerd.service - containerd container runtime. Jul 15 05:12:22.006096 containerd[1622]: time="2025-07-15T05:12:22.006076165Z" level=info msg="containerd successfully booted in 0.186778s" Jul 15 05:12:22.520023 systemd-networkd[1532]: ens192: Gained IPv6LL Jul 15 05:12:22.521379 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jul 15 05:12:22.522545 systemd[1]: Reached target network-online.target - Network is Online. 
Jul 15 05:12:22.523971 systemd[1]: Starting coreos-metadata.service - VMware metadata agent... Jul 15 05:12:22.525236 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 05:12:22.530398 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jul 15 05:12:22.554958 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jul 15 05:12:22.563462 systemd[1]: coreos-metadata.service: Deactivated successfully. Jul 15 05:12:22.563629 systemd[1]: Finished coreos-metadata.service - VMware metadata agent. Jul 15 05:12:22.564091 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jul 15 05:12:23.430667 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 05:12:23.431208 systemd[1]: Reached target multi-user.target - Multi-User System. Jul 15 05:12:23.432038 systemd[1]: Startup finished in 2.837s (kernel) + 6.733s (initrd) + 4.464s (userspace) = 14.035s. Jul 15 05:12:23.441321 (kubelet)[1816]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 15 05:12:23.470957 login[1764]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Jul 15 05:12:23.472342 login[1766]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Jul 15 05:12:23.477073 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jul 15 05:12:23.477680 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jul 15 05:12:23.482754 systemd-logind[1606]: New session 1 of user core. Jul 15 05:12:23.485055 systemd-logind[1606]: New session 2 of user core. Jul 15 05:12:23.503486 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jul 15 05:12:23.505621 systemd[1]: Starting user@500.service - User Manager for UID 500... 
Jul 15 05:12:23.516722 (systemd)[1823]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Jul 15 05:12:23.519373 systemd-logind[1606]: New session c1 of user core. Jul 15 05:12:23.725508 systemd[1823]: Queued start job for default target default.target. Jul 15 05:12:23.730802 systemd[1823]: Created slice app.slice - User Application Slice. Jul 15 05:12:23.730838 systemd[1823]: Reached target paths.target - Paths. Jul 15 05:12:23.730977 systemd[1823]: Reached target timers.target - Timers. Jul 15 05:12:23.732111 systemd[1823]: Starting dbus.socket - D-Bus User Message Bus Socket... Jul 15 05:12:23.753858 systemd[1823]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jul 15 05:12:23.753958 systemd[1823]: Reached target sockets.target - Sockets. Jul 15 05:12:23.754000 systemd[1823]: Reached target basic.target - Basic System. Jul 15 05:12:23.754032 systemd[1823]: Reached target default.target - Main User Target. Jul 15 05:12:23.754050 systemd[1823]: Startup finished in 230ms. Jul 15 05:12:23.754192 systemd[1]: Started user@500.service - User Manager for UID 500. Jul 15 05:12:23.761159 systemd[1]: Started session-1.scope - Session 1 of User core. Jul 15 05:12:23.761933 systemd[1]: Started session-2.scope - Session 2 of User core. Jul 15 05:12:24.148156 kubelet[1816]: E0715 05:12:24.148086 1816 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 15 05:12:24.149654 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 15 05:12:24.149736 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 15 05:12:24.150137 systemd[1]: kubelet.service: Consumed 630ms CPU time, 264.4M memory peak. 
Jul 15 05:12:34.400127 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jul 15 05:12:34.402986 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 05:12:34.827453 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 05:12:34.837125 (kubelet)[1870]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 15 05:12:34.884829 kubelet[1870]: E0715 05:12:34.884793 1870 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 15 05:12:34.887201 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 15 05:12:34.887344 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 15 05:12:34.887678 systemd[1]: kubelet.service: Consumed 97ms CPU time, 108.7M memory peak. Jul 15 05:12:45.118766 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jul 15 05:12:45.120159 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 05:12:45.209949 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jul 15 05:12:45.215123 (kubelet)[1885]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 15 05:12:45.245513 kubelet[1885]: E0715 05:12:45.245479 1885 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 15 05:12:45.247025 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 15 05:12:45.247183 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 15 05:12:45.247621 systemd[1]: kubelet.service: Consumed 98ms CPU time, 110.5M memory peak. Jul 15 05:12:51.747047 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jul 15 05:12:51.748382 systemd[1]: Started sshd@0-139.178.70.102:22-147.75.109.163:51158.service - OpenSSH per-connection server daemon (147.75.109.163:51158). Jul 15 05:12:51.821361 sshd[1893]: Accepted publickey for core from 147.75.109.163 port 51158 ssh2: RSA SHA256:3LzjwJYjOlsZVOOPKAIbtuWjgBjIxwy+KaRX41ju2nA Jul 15 05:12:51.821822 sshd-session[1893]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:12:51.825762 systemd-logind[1606]: New session 3 of user core. Jul 15 05:12:51.839111 systemd[1]: Started session-3.scope - Session 3 of User core. Jul 15 05:12:51.894844 systemd[1]: Started sshd@1-139.178.70.102:22-147.75.109.163:51162.service - OpenSSH per-connection server daemon (147.75.109.163:51162). 
Jul 15 05:12:51.953158 sshd[1899]: Accepted publickey for core from 147.75.109.163 port 51162 ssh2: RSA SHA256:3LzjwJYjOlsZVOOPKAIbtuWjgBjIxwy+KaRX41ju2nA Jul 15 05:12:51.954381 sshd-session[1899]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:12:51.959149 systemd-logind[1606]: New session 4 of user core. Jul 15 05:12:51.968154 systemd[1]: Started session-4.scope - Session 4 of User core. Jul 15 05:12:52.019044 sshd[1902]: Connection closed by 147.75.109.163 port 51162 Jul 15 05:12:52.019591 sshd-session[1899]: pam_unix(sshd:session): session closed for user core Jul 15 05:12:52.030369 systemd[1]: sshd@1-139.178.70.102:22-147.75.109.163:51162.service: Deactivated successfully. Jul 15 05:12:52.031628 systemd[1]: session-4.scope: Deactivated successfully. Jul 15 05:12:52.032663 systemd-logind[1606]: Session 4 logged out. Waiting for processes to exit. Jul 15 05:12:52.034277 systemd[1]: Started sshd@2-139.178.70.102:22-147.75.109.163:51168.service - OpenSSH per-connection server daemon (147.75.109.163:51168). Jul 15 05:12:52.034972 systemd-logind[1606]: Removed session 4. Jul 15 05:12:52.080533 sshd[1908]: Accepted publickey for core from 147.75.109.163 port 51168 ssh2: RSA SHA256:3LzjwJYjOlsZVOOPKAIbtuWjgBjIxwy+KaRX41ju2nA Jul 15 05:12:52.081540 sshd-session[1908]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:12:52.084570 systemd-logind[1606]: New session 5 of user core. Jul 15 05:12:52.091077 systemd[1]: Started session-5.scope - Session 5 of User core. Jul 15 05:12:52.137543 sshd[1911]: Connection closed by 147.75.109.163 port 51168 Jul 15 05:12:52.137916 sshd-session[1908]: pam_unix(sshd:session): session closed for user core Jul 15 05:12:52.145184 systemd[1]: sshd@2-139.178.70.102:22-147.75.109.163:51168.service: Deactivated successfully. Jul 15 05:12:52.146487 systemd[1]: session-5.scope: Deactivated successfully. Jul 15 05:12:52.147450 systemd-logind[1606]: Session 5 logged out. 
Waiting for processes to exit. Jul 15 05:12:52.148832 systemd[1]: Started sshd@3-139.178.70.102:22-147.75.109.163:51170.service - OpenSSH per-connection server daemon (147.75.109.163:51170). Jul 15 05:12:52.150519 systemd-logind[1606]: Removed session 5. Jul 15 05:12:52.194197 sshd[1917]: Accepted publickey for core from 147.75.109.163 port 51170 ssh2: RSA SHA256:3LzjwJYjOlsZVOOPKAIbtuWjgBjIxwy+KaRX41ju2nA Jul 15 05:12:52.195023 sshd-session[1917]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:12:52.197878 systemd-logind[1606]: New session 6 of user core. Jul 15 05:12:52.212110 systemd[1]: Started session-6.scope - Session 6 of User core. Jul 15 05:12:52.261154 sshd[1920]: Connection closed by 147.75.109.163 port 51170 Jul 15 05:12:52.261457 sshd-session[1917]: pam_unix(sshd:session): session closed for user core Jul 15 05:12:52.272867 systemd[1]: sshd@3-139.178.70.102:22-147.75.109.163:51170.service: Deactivated successfully. Jul 15 05:12:52.274498 systemd[1]: session-6.scope: Deactivated successfully. Jul 15 05:12:52.275247 systemd-logind[1606]: Session 6 logged out. Waiting for processes to exit. Jul 15 05:12:52.276787 systemd[1]: Started sshd@4-139.178.70.102:22-147.75.109.163:51184.service - OpenSSH per-connection server daemon (147.75.109.163:51184). Jul 15 05:12:52.278649 systemd-logind[1606]: Removed session 6. Jul 15 05:12:52.322290 sshd[1926]: Accepted publickey for core from 147.75.109.163 port 51184 ssh2: RSA SHA256:3LzjwJYjOlsZVOOPKAIbtuWjgBjIxwy+KaRX41ju2nA Jul 15 05:12:52.323367 sshd-session[1926]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:12:52.327181 systemd-logind[1606]: New session 7 of user core. Jul 15 05:12:52.337115 systemd[1]: Started session-7.scope - Session 7 of User core. 
Jul 15 05:12:52.394686 sudo[1930]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jul 15 05:12:52.394859 sudo[1930]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 15 05:12:52.404268 sudo[1930]: pam_unix(sudo:session): session closed for user root Jul 15 05:12:52.405200 sshd[1929]: Connection closed by 147.75.109.163 port 51184 Jul 15 05:12:52.405578 sshd-session[1926]: pam_unix(sshd:session): session closed for user core Jul 15 05:12:52.413256 systemd[1]: sshd@4-139.178.70.102:22-147.75.109.163:51184.service: Deactivated successfully. Jul 15 05:12:52.414375 systemd[1]: session-7.scope: Deactivated successfully. Jul 15 05:12:52.415465 systemd-logind[1606]: Session 7 logged out. Waiting for processes to exit. Jul 15 05:12:52.417300 systemd[1]: Started sshd@5-139.178.70.102:22-147.75.109.163:51194.service - OpenSSH per-connection server daemon (147.75.109.163:51194). Jul 15 05:12:52.418236 systemd-logind[1606]: Removed session 7. Jul 15 05:12:52.461487 sshd[1936]: Accepted publickey for core from 147.75.109.163 port 51194 ssh2: RSA SHA256:3LzjwJYjOlsZVOOPKAIbtuWjgBjIxwy+KaRX41ju2nA Jul 15 05:12:52.462368 sshd-session[1936]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:12:52.465342 systemd-logind[1606]: New session 8 of user core. Jul 15 05:12:52.474049 systemd[1]: Started session-8.scope - Session 8 of User core. 
Jul 15 05:12:52.523378 sudo[1941]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jul 15 05:12:52.523935 sudo[1941]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 15 05:12:52.526844 sudo[1941]: pam_unix(sudo:session): session closed for user root Jul 15 05:12:52.530384 sudo[1940]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jul 15 05:12:52.530542 sudo[1940]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 15 05:12:52.537153 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jul 15 05:12:52.565130 augenrules[1963]: No rules Jul 15 05:12:52.566204 systemd[1]: audit-rules.service: Deactivated successfully. Jul 15 05:12:52.566487 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jul 15 05:12:52.567431 sudo[1940]: pam_unix(sudo:session): session closed for user root Jul 15 05:12:52.568278 sshd[1939]: Connection closed by 147.75.109.163 port 51194 Jul 15 05:12:52.568643 sshd-session[1936]: pam_unix(sshd:session): session closed for user core Jul 15 05:12:52.585287 systemd[1]: sshd@5-139.178.70.102:22-147.75.109.163:51194.service: Deactivated successfully. Jul 15 05:12:52.586341 systemd[1]: session-8.scope: Deactivated successfully. Jul 15 05:12:52.586941 systemd-logind[1606]: Session 8 logged out. Waiting for processes to exit. Jul 15 05:12:52.588288 systemd[1]: Started sshd@6-139.178.70.102:22-147.75.109.163:51198.service - OpenSSH per-connection server daemon (147.75.109.163:51198). Jul 15 05:12:52.589935 systemd-logind[1606]: Removed session 8. 
Jul 15 05:12:52.640499 sshd[1972]: Accepted publickey for core from 147.75.109.163 port 51198 ssh2: RSA SHA256:3LzjwJYjOlsZVOOPKAIbtuWjgBjIxwy+KaRX41ju2nA Jul 15 05:12:52.641716 sshd-session[1972]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:12:52.645607 systemd-logind[1606]: New session 9 of user core. Jul 15 05:12:52.656130 systemd[1]: Started session-9.scope - Session 9 of User core. Jul 15 05:12:52.706092 sudo[1976]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jul 15 05:12:52.706480 sudo[1976]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 15 05:12:53.380732 systemd[1]: Starting docker.service - Docker Application Container Engine... Jul 15 05:12:53.393197 (dockerd)[1993]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jul 15 05:12:53.767293 dockerd[1993]: time="2025-07-15T05:12:53.767048649Z" level=info msg="Starting up" Jul 15 05:12:53.769444 dockerd[1993]: time="2025-07-15T05:12:53.769411786Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jul 15 05:12:53.778605 dockerd[1993]: time="2025-07-15T05:12:53.778506193Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Jul 15 05:12:53.807438 dockerd[1993]: time="2025-07-15T05:12:53.807400159Z" level=info msg="Loading containers: start." Jul 15 05:12:53.818922 kernel: Initializing XFRM netlink socket Jul 15 05:12:53.993553 systemd-networkd[1532]: docker0: Link UP Jul 15 05:12:53.995799 dockerd[1993]: time="2025-07-15T05:12:53.995767354Z" level=info msg="Loading containers: done." 
Jul 15 05:12:54.007223 dockerd[1993]: time="2025-07-15T05:12:54.006978058Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jul 15 05:12:54.007223 dockerd[1993]: time="2025-07-15T05:12:54.007045428Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Jul 15 05:12:54.007223 dockerd[1993]: time="2025-07-15T05:12:54.007102282Z" level=info msg="Initializing buildkit" Jul 15 05:12:54.020953 dockerd[1993]: time="2025-07-15T05:12:54.020830411Z" level=info msg="Completed buildkit initialization" Jul 15 05:12:54.027554 dockerd[1993]: time="2025-07-15T05:12:54.027500692Z" level=info msg="Daemon has completed initialization" Jul 15 05:12:54.027656 dockerd[1993]: time="2025-07-15T05:12:54.027626525Z" level=info msg="API listen on /run/docker.sock" Jul 15 05:12:54.027772 systemd[1]: Started docker.service - Docker Application Container Engine. Jul 15 05:12:54.770000 containerd[1622]: time="2025-07-15T05:12:54.769935556Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.10\"" Jul 15 05:12:55.305962 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jul 15 05:12:55.308948 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 05:12:55.316656 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2640073767.mount: Deactivated successfully. Jul 15 05:12:55.571791 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jul 15 05:12:55.574730 (kubelet)[2227]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 15 05:12:55.605402 kubelet[2227]: E0715 05:12:55.605360 2227 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 15 05:12:55.607029 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 15 05:12:55.607432 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 15 05:12:55.607914 systemd[1]: kubelet.service: Consumed 98ms CPU time, 108.4M memory peak. Jul 15 05:12:56.668497 containerd[1622]: time="2025-07-15T05:12:56.668257601Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:12:56.668780 containerd[1622]: time="2025-07-15T05:12:56.668762896Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.10: active requests=0, bytes read=28077744" Jul 15 05:12:56.669403 containerd[1622]: time="2025-07-15T05:12:56.668954707Z" level=info msg="ImageCreate event name:\"sha256:74c5154ea84d9a53c406e6c00e53cf66145cce821fd80e3c74e2e1bf312f3977\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:12:56.670331 containerd[1622]: time="2025-07-15T05:12:56.670309766Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:083d7d64af31cd090f870eb49fb815e6bb42c175fc602ee9dae2f28f082bd4dc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:12:56.670911 containerd[1622]: time="2025-07-15T05:12:56.670833405Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.10\" with image id 
\"sha256:74c5154ea84d9a53c406e6c00e53cf66145cce821fd80e3c74e2e1bf312f3977\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.10\", repo digest \"registry.k8s.io/kube-apiserver@sha256:083d7d64af31cd090f870eb49fb815e6bb42c175fc602ee9dae2f28f082bd4dc\", size \"28074544\" in 1.90087579s" Jul 15 05:12:56.670911 containerd[1622]: time="2025-07-15T05:12:56.670850120Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.10\" returns image reference \"sha256:74c5154ea84d9a53c406e6c00e53cf66145cce821fd80e3c74e2e1bf312f3977\"" Jul 15 05:12:56.671309 containerd[1622]: time="2025-07-15T05:12:56.671292049Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.10\"" Jul 15 05:12:58.013011 containerd[1622]: time="2025-07-15T05:12:58.012917990Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:12:58.019204 containerd[1622]: time="2025-07-15T05:12:58.019164428Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.10: active requests=0, bytes read=24713294" Jul 15 05:12:58.025195 containerd[1622]: time="2025-07-15T05:12:58.025165356Z" level=info msg="ImageCreate event name:\"sha256:c285c4e62c91c434e9928bee7063b361509f43f43faa31641b626d6eff97616d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:12:58.037496 containerd[1622]: time="2025-07-15T05:12:58.037460329Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:3c67387d023c6114879f1e817669fd641797d30f117230682faf3930ecaaf0fe\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:12:58.038667 containerd[1622]: time="2025-07-15T05:12:58.038595307Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.10\" with image id \"sha256:c285c4e62c91c434e9928bee7063b361509f43f43faa31641b626d6eff97616d\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.10\", repo 
digest \"registry.k8s.io/kube-controller-manager@sha256:3c67387d023c6114879f1e817669fd641797d30f117230682faf3930ecaaf0fe\", size \"26315128\" in 1.367283799s" Jul 15 05:12:58.038667 containerd[1622]: time="2025-07-15T05:12:58.038612008Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.10\" returns image reference \"sha256:c285c4e62c91c434e9928bee7063b361509f43f43faa31641b626d6eff97616d\"" Jul 15 05:12:58.038888 containerd[1622]: time="2025-07-15T05:12:58.038877308Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.10\"" Jul 15 05:12:59.085123 containerd[1622]: time="2025-07-15T05:12:59.085091140Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:12:59.085910 containerd[1622]: time="2025-07-15T05:12:59.085887396Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.10: active requests=0, bytes read=18783671" Jul 15 05:12:59.086498 containerd[1622]: time="2025-07-15T05:12:59.086479449Z" level=info msg="ImageCreate event name:\"sha256:61daeb7d112d9547792027cb16242b1d131f357f511545477381457fff5a69e2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:12:59.087782 containerd[1622]: time="2025-07-15T05:12:59.087756140Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:284dc2a5cf6afc9b76e39ad4b79c680c23d289488517643b28784a06d0141272\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:12:59.088500 containerd[1622]: time="2025-07-15T05:12:59.088415297Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.10\" with image id \"sha256:61daeb7d112d9547792027cb16242b1d131f357f511545477381457fff5a69e2\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.10\", repo digest \"registry.k8s.io/kube-scheduler@sha256:284dc2a5cf6afc9b76e39ad4b79c680c23d289488517643b28784a06d0141272\", size \"20385523\" in 1.049522706s" Jul 15 05:12:59.088500 
containerd[1622]: time="2025-07-15T05:12:59.088433723Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.10\" returns image reference \"sha256:61daeb7d112d9547792027cb16242b1d131f357f511545477381457fff5a69e2\"" Jul 15 05:12:59.088837 containerd[1622]: time="2025-07-15T05:12:59.088805674Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.10\"" Jul 15 05:13:00.064533 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2035387062.mount: Deactivated successfully. Jul 15 05:13:00.572410 containerd[1622]: time="2025-07-15T05:13:00.571970724Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:13:00.576853 containerd[1622]: time="2025-07-15T05:13:00.576826689Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.10: active requests=0, bytes read=30383943" Jul 15 05:13:00.582044 containerd[1622]: time="2025-07-15T05:13:00.581998671Z" level=info msg="ImageCreate event name:\"sha256:3ed600862d3e69931e0f9f4dbf5c2b46343af40aa079772434f13de771bdc30c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:13:00.587034 containerd[1622]: time="2025-07-15T05:13:00.587004421Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:bcbb293812bdf587b28ea98369a8c347ca84884160046296761acdf12b27029d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:13:00.587609 containerd[1622]: time="2025-07-15T05:13:00.587499545Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.10\" with image id \"sha256:3ed600862d3e69931e0f9f4dbf5c2b46343af40aa079772434f13de771bdc30c\", repo tag \"registry.k8s.io/kube-proxy:v1.31.10\", repo digest \"registry.k8s.io/kube-proxy@sha256:bcbb293812bdf587b28ea98369a8c347ca84884160046296761acdf12b27029d\", size \"30382962\" in 1.498620221s" Jul 15 05:13:00.587609 containerd[1622]: time="2025-07-15T05:13:00.587523716Z" level=info msg="PullImage 
\"registry.k8s.io/kube-proxy:v1.31.10\" returns image reference \"sha256:3ed600862d3e69931e0f9f4dbf5c2b46343af40aa079772434f13de771bdc30c\"" Jul 15 05:13:00.587862 containerd[1622]: time="2025-07-15T05:13:00.587845494Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Jul 15 05:13:01.300987 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3525057608.mount: Deactivated successfully. Jul 15 05:13:02.630127 containerd[1622]: time="2025-07-15T05:13:02.630085132Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:13:02.631037 containerd[1622]: time="2025-07-15T05:13:02.630977589Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241" Jul 15 05:13:02.635729 containerd[1622]: time="2025-07-15T05:13:02.635652496Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:13:02.641032 containerd[1622]: time="2025-07-15T05:13:02.640982283Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:13:02.641446 containerd[1622]: time="2025-07-15T05:13:02.641333047Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 2.053463635s" Jul 15 05:13:02.641446 containerd[1622]: time="2025-07-15T05:13:02.641356552Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image 
reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Jul 15 05:13:02.641785 containerd[1622]: time="2025-07-15T05:13:02.641770188Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jul 15 05:13:03.529158 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount803114319.mount: Deactivated successfully. Jul 15 05:13:03.595504 containerd[1622]: time="2025-07-15T05:13:03.595388636Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jul 15 05:13:03.603800 containerd[1622]: time="2025-07-15T05:13:03.603746675Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Jul 15 05:13:03.613007 containerd[1622]: time="2025-07-15T05:13:03.612959122Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jul 15 05:13:03.622738 containerd[1622]: time="2025-07-15T05:13:03.622680754Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jul 15 05:13:03.623411 containerd[1622]: time="2025-07-15T05:13:03.623274711Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 981.074276ms" Jul 15 05:13:03.623411 containerd[1622]: time="2025-07-15T05:13:03.623302023Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" 
returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Jul 15 05:13:03.623728 containerd[1622]: time="2025-07-15T05:13:03.623650903Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" Jul 15 05:13:04.444634 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount993287112.mount: Deactivated successfully. Jul 15 05:13:05.618771 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Jul 15 05:13:05.619871 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 05:13:06.369426 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 05:13:06.374131 (kubelet)[2371]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 15 05:13:06.400411 update_engine[1607]: I20250715 05:13:06.400357 1607 update_attempter.cc:509] Updating boot flags... Jul 15 05:13:06.483148 kubelet[2371]: E0715 05:13:06.483110 2371 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 15 05:13:06.485024 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 15 05:13:06.485181 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 15 05:13:06.485554 systemd[1]: kubelet.service: Consumed 110ms CPU time, 111.5M memory peak. 
Jul 15 05:13:08.866914 containerd[1622]: time="2025-07-15T05:13:08.866864883Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:13:08.869351 containerd[1622]: time="2025-07-15T05:13:08.869326622Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56780013" Jul 15 05:13:08.872369 containerd[1622]: time="2025-07-15T05:13:08.872319723Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:13:08.880062 containerd[1622]: time="2025-07-15T05:13:08.880013831Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:13:08.880971 containerd[1622]: time="2025-07-15T05:13:08.880845882Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 5.257176987s" Jul 15 05:13:08.880971 containerd[1622]: time="2025-07-15T05:13:08.880869954Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\"" Jul 15 05:13:11.015091 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 05:13:11.015256 systemd[1]: kubelet.service: Consumed 110ms CPU time, 111.5M memory peak. Jul 15 05:13:11.017265 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 05:13:11.038374 systemd[1]: Reload requested from client PID 2459 ('systemctl') (unit session-9.scope)... 
Jul 15 05:13:11.038393 systemd[1]: Reloading... Jul 15 05:13:11.116926 zram_generator::config[2502]: No configuration found. Jul 15 05:13:11.187354 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jul 15 05:13:11.195549 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Jul 15 05:13:11.262757 systemd[1]: Reloading finished in 224 ms. Jul 15 05:13:11.376600 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jul 15 05:13:11.376680 systemd[1]: kubelet.service: Failed with result 'signal'. Jul 15 05:13:11.377182 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 05:13:11.379163 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 05:13:12.650698 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 05:13:12.663291 (kubelet)[2570]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jul 15 05:13:12.703031 kubelet[2570]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jul 15 05:13:12.703935 kubelet[2570]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jul 15 05:13:12.703935 kubelet[2570]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jul 15 05:13:12.703935 kubelet[2570]: I0715 05:13:12.703347 2570 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jul 15 05:13:12.998156 kubelet[2570]: I0715 05:13:12.998090 2570 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Jul 15 05:13:12.998354 kubelet[2570]: I0715 05:13:12.998347 2570 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jul 15 05:13:12.998683 kubelet[2570]: I0715 05:13:12.998673 2570 server.go:934] "Client rotation is on, will bootstrap in background" Jul 15 05:13:13.553919 kubelet[2570]: I0715 05:13:13.553351 2570 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jul 15 05:13:13.596063 kubelet[2570]: E0715 05:13:13.596027 2570 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://139.178.70.102:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError" Jul 15 05:13:13.640673 kubelet[2570]: I0715 05:13:13.640570 2570 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jul 15 05:13:13.665348 kubelet[2570]: I0715 05:13:13.665197 2570 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jul 15 05:13:13.692409 kubelet[2570]: I0715 05:13:13.691662 2570 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jul 15 05:13:13.692409 kubelet[2570]: I0715 05:13:13.691807 2570 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jul 15 05:13:13.692409 kubelet[2570]: I0715 05:13:13.691833 2570 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyO
ptions":null,"CgroupVersion":2} Jul 15 05:13:13.692409 kubelet[2570]: I0715 05:13:13.691970 2570 topology_manager.go:138] "Creating topology manager with none policy" Jul 15 05:13:13.692605 kubelet[2570]: I0715 05:13:13.691977 2570 container_manager_linux.go:300] "Creating device plugin manager" Jul 15 05:13:13.706937 kubelet[2570]: I0715 05:13:13.706857 2570 state_mem.go:36] "Initialized new in-memory state store" Jul 15 05:13:13.776590 kubelet[2570]: I0715 05:13:13.776554 2570 kubelet.go:408] "Attempting to sync node with API server" Jul 15 05:13:13.776590 kubelet[2570]: I0715 05:13:13.776589 2570 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Jul 15 05:13:13.784115 kubelet[2570]: I0715 05:13:13.784086 2570 kubelet.go:314] "Adding apiserver pod source" Jul 15 05:13:13.784115 kubelet[2570]: I0715 05:13:13.784118 2570 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jul 15 05:13:13.821536 kubelet[2570]: W0715 05:13:13.821428 2570 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.102:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.102:6443: connect: connection refused Jul 15 05:13:13.821536 kubelet[2570]: E0715 05:13:13.821492 2570 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.102:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError" Jul 15 05:13:13.821660 kubelet[2570]: W0715 05:13:13.821636 2570 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.102:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.102:6443: connect: connection refused Jul 15 
05:13:13.821660 kubelet[2570]: E0715 05:13:13.821657 2570 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.102:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError" Jul 15 05:13:13.837677 kubelet[2570]: I0715 05:13:13.837646 2570 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Jul 15 05:13:13.861469 kubelet[2570]: I0715 05:13:13.861401 2570 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jul 15 05:13:13.867782 kubelet[2570]: W0715 05:13:13.867760 2570 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jul 15 05:13:13.881716 kubelet[2570]: I0715 05:13:13.881593 2570 server.go:1274] "Started kubelet" Jul 15 05:13:13.930145 kubelet[2570]: I0715 05:13:13.930124 2570 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jul 15 05:13:13.941005 kubelet[2570]: I0715 05:13:13.940960 2570 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jul 15 05:13:13.951762 kubelet[2570]: I0715 05:13:13.951698 2570 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jul 15 05:13:13.954992 kubelet[2570]: I0715 05:13:13.954937 2570 server.go:449] "Adding debug handlers to kubelet server" Jul 15 05:13:13.964923 kubelet[2570]: I0715 05:13:13.964892 2570 volume_manager.go:289] "Starting Kubelet Volume Manager" Jul 15 05:13:13.965234 kubelet[2570]: E0715 05:13:13.965223 2570 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Jul 15 05:13:13.965666 kubelet[2570]: I0715 05:13:13.965656 2570 
desired_state_of_world_populator.go:147] "Desired state populator starts to run" Jul 15 05:13:13.965753 kubelet[2570]: I0715 05:13:13.965746 2570 reconciler.go:26] "Reconciler: start to sync state" Jul 15 05:13:13.965996 kubelet[2570]: I0715 05:13:13.965978 2570 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jul 15 05:13:13.966490 kubelet[2570]: I0715 05:13:13.966311 2570 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jul 15 05:13:13.993052 kubelet[2570]: W0715 05:13:13.993020 2570 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.102:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.102:6443: connect: connection refused Jul 15 05:13:13.993467 kubelet[2570]: E0715 05:13:13.993115 2570 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.102:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError" Jul 15 05:13:13.993467 kubelet[2570]: E0715 05:13:13.993157 2570 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.102:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.102:6443: connect: connection refused" interval="200ms" Jul 15 05:13:13.993467 kubelet[2570]: E0715 05:13:13.966406 2570 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.102:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.102:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.185254c13a269c2e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-07-15 05:13:13.881574446 +0000 UTC m=+1.215942085,LastTimestamp:2025-07-15 05:13:13.881574446 +0000 UTC m=+1.215942085,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Jul 15 05:13:14.023866 kubelet[2570]: I0715 05:13:14.023836 2570 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jul 15 05:13:14.025020 kubelet[2570]: I0715 05:13:14.024720 2570 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jul 15 05:13:14.025020 kubelet[2570]: I0715 05:13:14.024738 2570 status_manager.go:217] "Starting to sync pod status with apiserver" Jul 15 05:13:14.025020 kubelet[2570]: I0715 05:13:14.024760 2570 kubelet.go:2321] "Starting kubelet main sync loop" Jul 15 05:13:14.025020 kubelet[2570]: E0715 05:13:14.024790 2570 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jul 15 05:13:14.026812 kubelet[2570]: W0715 05:13:14.026795 2570 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.102:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.102:6443: connect: connection refused Jul 15 05:13:14.027143 kubelet[2570]: E0715 05:13:14.027130 2570 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.102:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError" Jul 15 
05:13:14.047606 kubelet[2570]: I0715 05:13:14.047583 2570 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jul 15 05:13:14.051022 kubelet[2570]: I0715 05:13:14.051005 2570 factory.go:221] Registration of the containerd container factory successfully Jul 15 05:13:14.051022 kubelet[2570]: I0715 05:13:14.051017 2570 factory.go:221] Registration of the systemd container factory successfully Jul 15 05:13:14.073229 kubelet[2570]: E0715 05:13:14.073132 2570 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Jul 15 05:13:14.098748 kubelet[2570]: I0715 05:13:14.098726 2570 cpu_manager.go:214] "Starting CPU manager" policy="none" Jul 15 05:13:14.098748 kubelet[2570]: I0715 05:13:14.098737 2570 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Jul 15 05:13:14.098748 kubelet[2570]: I0715 05:13:14.098746 2570 state_mem.go:36] "Initialized new in-memory state store" Jul 15 05:13:14.125930 kubelet[2570]: E0715 05:13:14.125883 2570 kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Jul 15 05:13:14.173514 kubelet[2570]: E0715 05:13:14.173480 2570 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Jul 15 05:13:14.194013 kubelet[2570]: E0715 05:13:14.193971 2570 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.102:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.102:6443: connect: connection refused" interval="400ms" Jul 15 05:13:14.274325 kubelet[2570]: E0715 05:13:14.274298 2570 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Jul 15 05:13:14.326649 kubelet[2570]: E0715 05:13:14.326564 2570 
kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Jul 15 05:13:14.336449 kubelet[2570]: I0715 05:13:14.336387 2570 policy_none.go:49] "None policy: Start" Jul 15 05:13:14.337010 kubelet[2570]: I0715 05:13:14.336995 2570 memory_manager.go:170] "Starting memorymanager" policy="None" Jul 15 05:13:14.337086 kubelet[2570]: I0715 05:13:14.337067 2570 state_mem.go:35] "Initializing new in-memory state store" Jul 15 05:13:14.374984 kubelet[2570]: E0715 05:13:14.374943 2570 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Jul 15 05:13:14.408002 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jul 15 05:13:14.416794 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jul 15 05:13:14.419492 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Jul 15 05:13:14.428726 kubelet[2570]: I0715 05:13:14.428638 2570 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jul 15 05:13:14.428812 kubelet[2570]: I0715 05:13:14.428761 2570 eviction_manager.go:189] "Eviction manager: starting control loop" Jul 15 05:13:14.428812 kubelet[2570]: I0715 05:13:14.428767 2570 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jul 15 05:13:14.429203 kubelet[2570]: I0715 05:13:14.429150 2570 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jul 15 05:13:14.430678 kubelet[2570]: E0715 05:13:14.430659 2570 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Jul 15 05:13:14.530350 kubelet[2570]: I0715 05:13:14.530327 2570 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Jul 15 05:13:14.530547 kubelet[2570]: E0715 
05:13:14.530533 2570 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.102:6443/api/v1/nodes\": dial tcp 139.178.70.102:6443: connect: connection refused" node="localhost" Jul 15 05:13:14.594347 kubelet[2570]: E0715 05:13:14.594271 2570 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.102:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.102:6443: connect: connection refused" interval="800ms" Jul 15 05:13:14.732279 kubelet[2570]: I0715 05:13:14.732190 2570 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Jul 15 05:13:14.732552 kubelet[2570]: E0715 05:13:14.732376 2570 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.102:6443/api/v1/nodes\": dial tcp 139.178.70.102:6443: connect: connection refused" node="localhost" Jul 15 05:13:14.761718 systemd[1]: Created slice kubepods-burstable-pod3f04709fe51ae4ab5abd58e8da771b74.slice - libcontainer container kubepods-burstable-pod3f04709fe51ae4ab5abd58e8da771b74.slice. 
Jul 15 05:13:14.777712 kubelet[2570]: I0715 05:13:14.777468 2570 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3f04709fe51ae4ab5abd58e8da771b74-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"3f04709fe51ae4ab5abd58e8da771b74\") " pod="kube-system/kube-controller-manager-localhost"
Jul 15 05:13:14.777712 kubelet[2570]: I0715 05:13:14.777508 2570 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3f04709fe51ae4ab5abd58e8da771b74-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"3f04709fe51ae4ab5abd58e8da771b74\") " pod="kube-system/kube-controller-manager-localhost"
Jul 15 05:13:14.777712 kubelet[2570]: I0715 05:13:14.777527 2570 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/45483edb4e9ce298a7c7273c060ac3b7-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"45483edb4e9ce298a7c7273c060ac3b7\") " pod="kube-system/kube-apiserver-localhost"
Jul 15 05:13:14.777712 kubelet[2570]: I0715 05:13:14.777542 2570 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3f04709fe51ae4ab5abd58e8da771b74-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"3f04709fe51ae4ab5abd58e8da771b74\") " pod="kube-system/kube-controller-manager-localhost"
Jul 15 05:13:14.777712 kubelet[2570]: I0715 05:13:14.777558 2570 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/3f04709fe51ae4ab5abd58e8da771b74-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"3f04709fe51ae4ab5abd58e8da771b74\") " pod="kube-system/kube-controller-manager-localhost"
Jul 15 05:13:14.777961 kubelet[2570]: I0715 05:13:14.777572 2570 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/45483edb4e9ce298a7c7273c060ac3b7-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"45483edb4e9ce298a7c7273c060ac3b7\") " pod="kube-system/kube-apiserver-localhost"
Jul 15 05:13:14.777961 kubelet[2570]: I0715 05:13:14.777588 2570 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/45483edb4e9ce298a7c7273c060ac3b7-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"45483edb4e9ce298a7c7273c060ac3b7\") " pod="kube-system/kube-apiserver-localhost"
Jul 15 05:13:14.777961 kubelet[2570]: I0715 05:13:14.777606 2570 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3f04709fe51ae4ab5abd58e8da771b74-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"3f04709fe51ae4ab5abd58e8da771b74\") " pod="kube-system/kube-controller-manager-localhost"
Jul 15 05:13:14.777961 kubelet[2570]: I0715 05:13:14.777623 2570 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b35b56493416c25588cb530e37ffc065-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"b35b56493416c25588cb530e37ffc065\") " pod="kube-system/kube-scheduler-localhost"
Jul 15 05:13:14.783129 systemd[1]: Created slice kubepods-burstable-podb35b56493416c25588cb530e37ffc065.slice - libcontainer container kubepods-burstable-podb35b56493416c25588cb530e37ffc065.slice.
Jul 15 05:13:14.785769 systemd[1]: Created slice kubepods-burstable-pod45483edb4e9ce298a7c7273c060ac3b7.slice - libcontainer container kubepods-burstable-pod45483edb4e9ce298a7c7273c060ac3b7.slice.
Jul 15 05:13:14.961573 kubelet[2570]: W0715 05:13:14.961496 2570 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.102:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.102:6443: connect: connection refused
Jul 15 05:13:14.961573 kubelet[2570]: E0715 05:13:14.961551 2570 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.102:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError"
Jul 15 05:13:15.081163 containerd[1622]: time="2025-07-15T05:13:15.081136031Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:3f04709fe51ae4ab5abd58e8da771b74,Namespace:kube-system,Attempt:0,}"
Jul 15 05:13:15.085403 containerd[1622]: time="2025-07-15T05:13:15.085337063Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:b35b56493416c25588cb530e37ffc065,Namespace:kube-system,Attempt:0,}"
Jul 15 05:13:15.087970 containerd[1622]: time="2025-07-15T05:13:15.087918779Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:45483edb4e9ce298a7c7273c060ac3b7,Namespace:kube-system,Attempt:0,}"
Jul 15 05:13:15.111565 kubelet[2570]: W0715 05:13:15.111501 2570 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.102:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.102:6443: connect: connection refused
Jul 15 05:13:15.111565 kubelet[2570]: E0715 05:13:15.111549 2570 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.102:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError"
Jul 15 05:13:15.133765 kubelet[2570]: I0715 05:13:15.133732 2570 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Jul 15 05:13:15.134193 kubelet[2570]: E0715 05:13:15.134173 2570 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.102:6443/api/v1/nodes\": dial tcp 139.178.70.102:6443: connect: connection refused" node="localhost"
Jul 15 05:13:15.327513 kubelet[2570]: W0715 05:13:15.327414 2570 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.102:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.102:6443: connect: connection refused
Jul 15 05:13:15.327513 kubelet[2570]: E0715 05:13:15.327458 2570 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.102:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError"
Jul 15 05:13:15.395429 kubelet[2570]: E0715 05:13:15.395385 2570 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.102:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.102:6443: connect: connection refused" interval="1.6s"
Jul 15 05:13:15.434161 kubelet[2570]: W0715 05:13:15.434114 2570 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.102:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.102:6443: connect: connection refused
Jul 15 05:13:15.434245 kubelet[2570]: E0715 05:13:15.434168 2570 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.102:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError"
Jul 15 05:13:15.696460 kubelet[2570]: E0715 05:13:15.696432 2570 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://139.178.70.102:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError"
Jul 15 05:13:15.935935 kubelet[2570]: I0715 05:13:15.935910 2570 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Jul 15 05:13:15.936392 kubelet[2570]: E0715 05:13:15.936098 2570 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.102:6443/api/v1/nodes\": dial tcp 139.178.70.102:6443: connect: connection refused" node="localhost"
Jul 15 05:13:15.938255 containerd[1622]: time="2025-07-15T05:13:15.938212488Z" level=info msg="connecting to shim 3db0642fd790b7634ecb137dd2ae8a967815225c9073449f5cadbc8d4dea0bd1" address="unix:///run/containerd/s/df8afe566741907d22f5ed4a01c08920d8fc8d3358b2d15297a66f5444222c12" namespace=k8s.io protocol=ttrpc version=3
Jul 15 05:13:15.939436 containerd[1622]: time="2025-07-15T05:13:15.938961059Z" level=info msg="connecting to shim 5581453df783de054457d3339558b577da473a7e4177342a2abe9b42e1211993" address="unix:///run/containerd/s/a4682f853c34170c37cf467ecbdabd67ec4d7a934b76e605988bfdb8348a09e6" namespace=k8s.io protocol=ttrpc version=3
Jul 15 05:13:15.950154 containerd[1622]: time="2025-07-15T05:13:15.949351184Z" level=info msg="connecting to shim c0ca3bcb311c21e4587db65451dcd83a2364402a07dc5ecab180eadb8e7eacd9" address="unix:///run/containerd/s/453135f51fbae41a2a922268d133d16b923e66d001610941c60ac61075072304" namespace=k8s.io protocol=ttrpc version=3
Jul 15 05:13:16.225068 systemd[1]: Started cri-containerd-3db0642fd790b7634ecb137dd2ae8a967815225c9073449f5cadbc8d4dea0bd1.scope - libcontainer container 3db0642fd790b7634ecb137dd2ae8a967815225c9073449f5cadbc8d4dea0bd1.
Jul 15 05:13:16.226261 systemd[1]: Started cri-containerd-5581453df783de054457d3339558b577da473a7e4177342a2abe9b42e1211993.scope - libcontainer container 5581453df783de054457d3339558b577da473a7e4177342a2abe9b42e1211993.
Jul 15 05:13:16.228165 systemd[1]: Started cri-containerd-c0ca3bcb311c21e4587db65451dcd83a2364402a07dc5ecab180eadb8e7eacd9.scope - libcontainer container c0ca3bcb311c21e4587db65451dcd83a2364402a07dc5ecab180eadb8e7eacd9.
Jul 15 05:13:16.337682 containerd[1622]: time="2025-07-15T05:13:16.337608809Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:45483edb4e9ce298a7c7273c060ac3b7,Namespace:kube-system,Attempt:0,} returns sandbox id \"c0ca3bcb311c21e4587db65451dcd83a2364402a07dc5ecab180eadb8e7eacd9\""
Jul 15 05:13:16.339689 containerd[1622]: time="2025-07-15T05:13:16.339499745Z" level=info msg="CreateContainer within sandbox \"c0ca3bcb311c21e4587db65451dcd83a2364402a07dc5ecab180eadb8e7eacd9\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Jul 15 05:13:16.357402 containerd[1622]: time="2025-07-15T05:13:16.357374066Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:3f04709fe51ae4ab5abd58e8da771b74,Namespace:kube-system,Attempt:0,} returns sandbox id \"5581453df783de054457d3339558b577da473a7e4177342a2abe9b42e1211993\""
Jul 15 05:13:16.358708 containerd[1622]: time="2025-07-15T05:13:16.358684015Z" level=info msg="CreateContainer within sandbox \"5581453df783de054457d3339558b577da473a7e4177342a2abe9b42e1211993\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Jul 15 05:13:16.375504 containerd[1622]: time="2025-07-15T05:13:16.375438519Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:b35b56493416c25588cb530e37ffc065,Namespace:kube-system,Attempt:0,} returns sandbox id \"3db0642fd790b7634ecb137dd2ae8a967815225c9073449f5cadbc8d4dea0bd1\""
Jul 15 05:13:16.376920 containerd[1622]: time="2025-07-15T05:13:16.376736118Z" level=info msg="CreateContainer within sandbox \"3db0642fd790b7634ecb137dd2ae8a967815225c9073449f5cadbc8d4dea0bd1\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Jul 15 05:13:16.420943 containerd[1622]: time="2025-07-15T05:13:16.420656852Z" level=info msg="Container 8897d995110b2b5742a4ca59575ce971abfa528a1ee4b6e36e98422e014411e7: CDI devices from CRI Config.CDIDevices: []"
Jul 15 05:13:16.452034 containerd[1622]: time="2025-07-15T05:13:16.452004186Z" level=info msg="Container 4a1499334f25d19cd66b4bc1d8a16d29e24217efd4c05720facb3feeccd078de: CDI devices from CRI Config.CDIDevices: []"
Jul 15 05:13:16.549392 containerd[1622]: time="2025-07-15T05:13:16.549312319Z" level=info msg="CreateContainer within sandbox \"c0ca3bcb311c21e4587db65451dcd83a2364402a07dc5ecab180eadb8e7eacd9\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"8897d995110b2b5742a4ca59575ce971abfa528a1ee4b6e36e98422e014411e7\""
Jul 15 05:13:16.550208 containerd[1622]: time="2025-07-15T05:13:16.550185431Z" level=info msg="StartContainer for \"8897d995110b2b5742a4ca59575ce971abfa528a1ee4b6e36e98422e014411e7\""
Jul 15 05:13:16.561717 containerd[1622]: time="2025-07-15T05:13:16.561682038Z" level=info msg="connecting to shim 8897d995110b2b5742a4ca59575ce971abfa528a1ee4b6e36e98422e014411e7" address="unix:///run/containerd/s/453135f51fbae41a2a922268d133d16b923e66d001610941c60ac61075072304" protocol=ttrpc version=3
Jul 15 05:13:16.578007 systemd[1]: Started cri-containerd-8897d995110b2b5742a4ca59575ce971abfa528a1ee4b6e36e98422e014411e7.scope - libcontainer container 8897d995110b2b5742a4ca59575ce971abfa528a1ee4b6e36e98422e014411e7.
Jul 15 05:13:16.644873 containerd[1622]: time="2025-07-15T05:13:16.644303960Z" level=info msg="Container d1ecd0529b2b02a7ef579743e5bd46be4af8fe701f52bad89797f5333a447e0d: CDI devices from CRI Config.CDIDevices: []"
Jul 15 05:13:16.655661 containerd[1622]: time="2025-07-15T05:13:16.655208258Z" level=info msg="CreateContainer within sandbox \"5581453df783de054457d3339558b577da473a7e4177342a2abe9b42e1211993\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"4a1499334f25d19cd66b4bc1d8a16d29e24217efd4c05720facb3feeccd078de\""
Jul 15 05:13:16.655925 containerd[1622]: time="2025-07-15T05:13:16.655909944Z" level=info msg="StartContainer for \"8897d995110b2b5742a4ca59575ce971abfa528a1ee4b6e36e98422e014411e7\" returns successfully"
Jul 15 05:13:16.656304 containerd[1622]: time="2025-07-15T05:13:16.656282441Z" level=info msg="StartContainer for \"4a1499334f25d19cd66b4bc1d8a16d29e24217efd4c05720facb3feeccd078de\""
Jul 15 05:13:16.657341 containerd[1622]: time="2025-07-15T05:13:16.657316491Z" level=info msg="connecting to shim 4a1499334f25d19cd66b4bc1d8a16d29e24217efd4c05720facb3feeccd078de" address="unix:///run/containerd/s/a4682f853c34170c37cf467ecbdabd67ec4d7a934b76e605988bfdb8348a09e6" protocol=ttrpc version=3
Jul 15 05:13:16.679007 systemd[1]: Started cri-containerd-4a1499334f25d19cd66b4bc1d8a16d29e24217efd4c05720facb3feeccd078de.scope - libcontainer container 4a1499334f25d19cd66b4bc1d8a16d29e24217efd4c05720facb3feeccd078de.
Jul 15 05:13:16.720743 containerd[1622]: time="2025-07-15T05:13:16.720702081Z" level=info msg="StartContainer for \"4a1499334f25d19cd66b4bc1d8a16d29e24217efd4c05720facb3feeccd078de\" returns successfully"
Jul 15 05:13:16.747177 containerd[1622]: time="2025-07-15T05:13:16.747151061Z" level=info msg="CreateContainer within sandbox \"3db0642fd790b7634ecb137dd2ae8a967815225c9073449f5cadbc8d4dea0bd1\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"d1ecd0529b2b02a7ef579743e5bd46be4af8fe701f52bad89797f5333a447e0d\""
Jul 15 05:13:16.747909 containerd[1622]: time="2025-07-15T05:13:16.747495002Z" level=info msg="StartContainer for \"d1ecd0529b2b02a7ef579743e5bd46be4af8fe701f52bad89797f5333a447e0d\""
Jul 15 05:13:16.748969 containerd[1622]: time="2025-07-15T05:13:16.748957354Z" level=info msg="connecting to shim d1ecd0529b2b02a7ef579743e5bd46be4af8fe701f52bad89797f5333a447e0d" address="unix:///run/containerd/s/df8afe566741907d22f5ed4a01c08920d8fc8d3358b2d15297a66f5444222c12" protocol=ttrpc version=3
Jul 15 05:13:16.767107 systemd[1]: Started cri-containerd-d1ecd0529b2b02a7ef579743e5bd46be4af8fe701f52bad89797f5333a447e0d.scope - libcontainer container d1ecd0529b2b02a7ef579743e5bd46be4af8fe701f52bad89797f5333a447e0d.
Jul 15 05:13:16.995827 kubelet[2570]: E0715 05:13:16.995796 2570 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.102:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.102:6443: connect: connection refused" interval="3.2s"
Jul 15 05:13:17.005917 containerd[1622]: time="2025-07-15T05:13:17.005882344Z" level=info msg="StartContainer for \"d1ecd0529b2b02a7ef579743e5bd46be4af8fe701f52bad89797f5333a447e0d\" returns successfully"
Jul 15 05:13:17.276833 kubelet[2570]: W0715 05:13:17.276747 2570 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.102:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.102:6443: connect: connection refused
Jul 15 05:13:17.276833 kubelet[2570]: E0715 05:13:17.276777 2570 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.102:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError"
Jul 15 05:13:17.461065 kubelet[2570]: W0715 05:13:17.461016 2570 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.102:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.102:6443: connect: connection refused
Jul 15 05:13:17.461065 kubelet[2570]: E0715 05:13:17.461048 2570 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.102:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError"
Jul 15 05:13:17.537445 kubelet[2570]: I0715 05:13:17.537366 2570 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Jul 15 05:13:17.537903 kubelet[2570]: E0715 05:13:17.537873 2570 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.102:6443/api/v1/nodes\": dial tcp 139.178.70.102:6443: connect: connection refused" node="localhost"
Jul 15 05:13:17.713068 kubelet[2570]: E0715 05:13:17.713002 2570 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.102:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.102:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.185254c13a269c2e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-07-15 05:13:13.881574446 +0000 UTC m=+1.215942085,LastTimestamp:2025-07-15 05:13:13.881574446 +0000 UTC m=+1.215942085,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
Jul 15 05:13:18.013819 kubelet[2570]: W0715 05:13:18.013761 2570 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.102:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.102:6443: connect: connection refused
Jul 15 05:13:18.013819 kubelet[2570]: E0715 05:13:18.013815 2570 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.102:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError"
Jul 15 05:13:19.822636 kubelet[2570]: I0715 05:13:19.822595 2570 apiserver.go:52] "Watching apiserver"
Jul 15 05:13:19.866841 kubelet[2570]: I0715 05:13:19.866813 2570 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Jul 15 05:13:20.044312 kubelet[2570]: E0715 05:13:20.044286 2570 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "localhost" not found
Jul 15 05:13:20.198274 kubelet[2570]: E0715 05:13:20.198248 2570 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost"
Jul 15 05:13:20.390124 kubelet[2570]: E0715 05:13:20.390100 2570 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "localhost" not found
Jul 15 05:13:20.739336 kubelet[2570]: I0715 05:13:20.739299 2570 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Jul 15 05:13:20.749446 kubelet[2570]: I0715 05:13:20.749340 2570 kubelet_node_status.go:75] "Successfully registered node" node="localhost"
Jul 15 05:13:20.749446 kubelet[2570]: E0715 05:13:20.749364 2570 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found"
Jul 15 05:13:21.415445 systemd[1]: Reload requested from client PID 2840 ('systemctl') (unit session-9.scope)...
Jul 15 05:13:21.415455 systemd[1]: Reloading...
Jul 15 05:13:21.466938 zram_generator::config[2887]: No configuration found.
Jul 15 05:13:21.537917 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jul 15 05:13:21.545976 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Jul 15 05:13:21.624163 systemd[1]: Reloading finished in 208 ms.
Jul 15 05:13:21.643875 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Jul 15 05:13:21.658182 systemd[1]: kubelet.service: Deactivated successfully.
Jul 15 05:13:21.658366 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Jul 15 05:13:21.658403 systemd[1]: kubelet.service: Consumed 529ms CPU time, 128.4M memory peak.
Jul 15 05:13:21.660675 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jul 15 05:13:22.189562 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jul 15 05:13:22.195235 (kubelet)[2951]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Jul 15 05:13:22.256170 kubelet[2951]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jul 15 05:13:22.256485 kubelet[2951]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Jul 15 05:13:22.256485 kubelet[2951]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jul 15 05:13:22.256604 kubelet[2951]: I0715 05:13:22.256574 2951 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Jul 15 05:13:22.263946 kubelet[2951]: I0715 05:13:22.263304 2951 server.go:491] "Kubelet version" kubeletVersion="v1.31.8"
Jul 15 05:13:22.263946 kubelet[2951]: I0715 05:13:22.263325 2951 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Jul 15 05:13:22.263946 kubelet[2951]: I0715 05:13:22.263484 2951 server.go:934] "Client rotation is on, will bootstrap in background"
Jul 15 05:13:22.264460 kubelet[2951]: I0715 05:13:22.264450 2951 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Jul 15 05:13:22.272887 kubelet[2951]: I0715 05:13:22.272863 2951 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Jul 15 05:13:22.275454 kubelet[2951]: I0715 05:13:22.275422 2951 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Jul 15 05:13:22.279638 kubelet[2951]: I0715 05:13:22.279609 2951 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Jul 15 05:13:22.279737 kubelet[2951]: I0715 05:13:22.279712 2951 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Jul 15 05:13:22.279828 kubelet[2951]: I0715 05:13:22.279798 2951 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Jul 15 05:13:22.280103 kubelet[2951]: I0715 05:13:22.279829 2951 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Jul 15 05:13:22.280164 kubelet[2951]: I0715 05:13:22.280110 2951 topology_manager.go:138] "Creating topology manager with none policy"
Jul 15 05:13:22.280164 kubelet[2951]: I0715 05:13:22.280119 2951 container_manager_linux.go:300] "Creating device plugin manager"
Jul 15 05:13:22.280164 kubelet[2951]: I0715 05:13:22.280148 2951 state_mem.go:36] "Initialized new in-memory state store"
Jul 15 05:13:22.280236 kubelet[2951]: I0715 05:13:22.280223 2951 kubelet.go:408] "Attempting to sync node with API server"
Jul 15 05:13:22.280236 kubelet[2951]: I0715 05:13:22.280233 2951 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Jul 15 05:13:22.280270 kubelet[2951]: I0715 05:13:22.280263 2951 kubelet.go:314] "Adding apiserver pod source"
Jul 15 05:13:22.280286 kubelet[2951]: I0715 05:13:22.280272 2951 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Jul 15 05:13:22.285233 kubelet[2951]: I0715 05:13:22.285199 2951 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Jul 15 05:13:22.287941 kubelet[2951]: I0715 05:13:22.287921 2951 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Jul 15 05:13:22.288420 kubelet[2951]: I0715 05:13:22.288407 2951 server.go:1274] "Started kubelet"
Jul 15 05:13:22.290067 kubelet[2951]: I0715 05:13:22.290013 2951 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Jul 15 05:13:22.291865 kubelet[2951]: I0715 05:13:22.291812 2951 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Jul 15 05:13:22.293414 kubelet[2951]: I0715 05:13:22.293382 2951 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Jul 15 05:13:22.293819 kubelet[2951]: I0715 05:13:22.293611 2951 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Jul 15 05:13:22.294952 kubelet[2951]: I0715 05:13:22.294779 2951 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Jul 15 05:13:22.296760 kubelet[2951]: I0715 05:13:22.296748 2951 server.go:449] "Adding debug handlers to kubelet server"
Jul 15 05:13:22.298826 kubelet[2951]: I0715 05:13:22.298811 2951 volume_manager.go:289] "Starting Kubelet Volume Manager"
Jul 15 05:13:22.299445 kubelet[2951]: I0715 05:13:22.299431 2951 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Jul 15 05:13:22.299624 kubelet[2951]: I0715 05:13:22.299533 2951 reconciler.go:26] "Reconciler: start to sync state"
Jul 15 05:13:22.300805 kubelet[2951]: I0715 05:13:22.300507 2951 factory.go:221] Registration of the systemd container factory successfully
Jul 15 05:13:22.300805 kubelet[2951]: I0715 05:13:22.300608 2951 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Jul 15 05:13:22.302174 kubelet[2951]: E0715 05:13:22.302163 2951 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Jul 15 05:13:22.304331 kubelet[2951]: I0715 05:13:22.304312 2951 factory.go:221] Registration of the containerd container factory successfully
Jul 15 05:13:22.315586 kubelet[2951]: I0715 05:13:22.315559 2951 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Jul 15 05:13:22.316964 kubelet[2951]: I0715 05:13:22.316939 2951 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Jul 15 05:13:22.317038 kubelet[2951]: I0715 05:13:22.316953 2951 status_manager.go:217] "Starting to sync pod status with apiserver"
Jul 15 05:13:22.317099 kubelet[2951]: I0715 05:13:22.317030 2951 kubelet.go:2321] "Starting kubelet main sync loop"
Jul 15 05:13:22.317209 kubelet[2951]: E0715 05:13:22.317142 2951 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Jul 15 05:13:22.339533 kubelet[2951]: I0715 05:13:22.339515 2951 cpu_manager.go:214] "Starting CPU manager" policy="none"
Jul 15 05:13:22.339635 kubelet[2951]: I0715 05:13:22.339526 2951 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Jul 15 05:13:22.339635 kubelet[2951]: I0715 05:13:22.339585 2951 state_mem.go:36] "Initialized new in-memory state store"
Jul 15 05:13:22.339696 kubelet[2951]: I0715 05:13:22.339685 2951 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Jul 15 05:13:22.339716 kubelet[2951]: I0715 05:13:22.339694 2951 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Jul 15 05:13:22.339716 kubelet[2951]: I0715 05:13:22.339705 2951 policy_none.go:49] "None policy: Start"
Jul 15 05:13:22.340023 kubelet[2951]: I0715 05:13:22.340012 2951 memory_manager.go:170] "Starting memorymanager" policy="None"
Jul 15 05:13:22.340056 kubelet[2951]: I0715 05:13:22.340025 2951 state_mem.go:35] "Initializing new in-memory state store"
Jul 15 05:13:22.340164 kubelet[2951]: I0715 05:13:22.340155 2951 state_mem.go:75] "Updated machine memory state"
Jul 15 05:13:22.342673 kubelet[2951]: I0715 05:13:22.342661 2951 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Jul 15 05:13:22.342759 kubelet[2951]: I0715 05:13:22.342750 2951 eviction_manager.go:189] "Eviction manager: starting control loop"
Jul 15 05:13:22.342786 kubelet[2951]: I0715 05:13:22.342758 2951 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Jul 15 05:13:22.343068 kubelet[2951]: I0715 05:13:22.343033 2951 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Jul 15 05:13:22.446674 kubelet[2951]: I0715 05:13:22.446602 2951 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Jul 15 05:13:22.452033 kubelet[2951]: I0715 05:13:22.451851 2951 kubelet_node_status.go:111] "Node was previously registered" node="localhost"
Jul 15 05:13:22.452033 kubelet[2951]: I0715 05:13:22.451925 2951 kubelet_node_status.go:75] "Successfully registered node" node="localhost"
Jul 15 05:13:22.501411 kubelet[2951]: I0715 05:13:22.501274 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3f04709fe51ae4ab5abd58e8da771b74-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"3f04709fe51ae4ab5abd58e8da771b74\") " pod="kube-system/kube-controller-manager-localhost"
Jul 15 05:13:22.501411 kubelet[2951]: I0715 05:13:22.501301 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3f04709fe51ae4ab5abd58e8da771b74-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"3f04709fe51ae4ab5abd58e8da771b74\") " pod="kube-system/kube-controller-manager-localhost"
Jul 15 05:13:22.501411 kubelet[2951]: I0715 05:13:22.501314 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3f04709fe51ae4ab5abd58e8da771b74-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"3f04709fe51ae4ab5abd58e8da771b74\") " pod="kube-system/kube-controller-manager-localhost"
Jul 15 05:13:22.501411 kubelet[2951]: I0715 05:13:22.501328 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b35b56493416c25588cb530e37ffc065-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"b35b56493416c25588cb530e37ffc065\") " pod="kube-system/kube-scheduler-localhost"
Jul 15 05:13:22.501411 kubelet[2951]: I0715 05:13:22.501337 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/45483edb4e9ce298a7c7273c060ac3b7-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"45483edb4e9ce298a7c7273c060ac3b7\") " pod="kube-system/kube-apiserver-localhost"
Jul 15 05:13:22.501640 kubelet[2951]: I0715 05:13:22.501346 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/3f04709fe51ae4ab5abd58e8da771b74-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"3f04709fe51ae4ab5abd58e8da771b74\") " pod="kube-system/kube-controller-manager-localhost"
Jul 15 05:13:22.501640 kubelet[2951]: I0715 05:13:22.501355 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3f04709fe51ae4ab5abd58e8da771b74-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"3f04709fe51ae4ab5abd58e8da771b74\") " pod="kube-system/kube-controller-manager-localhost"
Jul 15 05:13:22.501640 kubelet[2951]: I0715 05:13:22.501363 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/45483edb4e9ce298a7c7273c060ac3b7-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"45483edb4e9ce298a7c7273c060ac3b7\") " pod="kube-system/kube-apiserver-localhost"
Jul 15 05:13:22.501640 kubelet[2951]: I0715 05:13:22.501372 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/45483edb4e9ce298a7c7273c060ac3b7-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"45483edb4e9ce298a7c7273c060ac3b7\") " pod="kube-system/kube-apiserver-localhost"
Jul 15 05:13:23.281484 kubelet[2951]: I0715 05:13:23.281440 2951 apiserver.go:52] "Watching apiserver"
Jul 15 05:13:23.300466 kubelet[2951]: I0715 05:13:23.300439 2951 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Jul 15 05:13:23.333111 kubelet[2951]: E0715 05:13:23.333017 2951 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost"
Jul 15 05:13:23.345439 kubelet[2951]: I0715 05:13:23.345348 2951 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.345337111 podStartE2EDuration="1.345337111s" podCreationTimestamp="2025-07-15 05:13:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 05:13:23.345278864 +0000 UTC m=+1.124589730" watchObservedRunningTime="2025-07-15 05:13:23.345337111 +0000 UTC m=+1.124647970"
Jul 15 05:13:23.351573 kubelet[2951]: I0715 05:13:23.351205 2951 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.3511616659999999 podStartE2EDuration="1.351161666s" podCreationTimestamp="2025-07-15 05:13:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 05:13:23.351158669 +0000 UTC m=+1.130469535" watchObservedRunningTime="2025-07-15 05:13:23.351161666 +0000 UTC m=+1.130472526"
Jul 15 05:13:23.357254 kubelet[2951]: I0715 05:13:23.357198 2951 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.357184507 podStartE2EDuration="1.357184507s" podCreationTimestamp="2025-07-15 05:13:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 05:13:23.356894239 +0000 UTC m=+1.136205106" watchObservedRunningTime="2025-07-15 05:13:23.357184507 +0000 UTC m=+1.136495374" Jul 15 05:13:28.972717 kubelet[2951]: I0715 05:13:28.972682 2951 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jul 15 05:13:28.973175 containerd[1622]: time="2025-07-15T05:13:28.972889624Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jul 15 05:13:28.973356 kubelet[2951]: I0715 05:13:28.973201 2951 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jul 15 05:13:29.705108 systemd[1]: Created slice kubepods-besteffort-pod512ace0e_c3f9_441f_90ac_3da1484796c2.slice - libcontainer container kubepods-besteffort-pod512ace0e_c3f9_441f_90ac_3da1484796c2.slice. 
Jul 15 05:13:29.749058 kubelet[2951]: I0715 05:13:29.748961 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/512ace0e-c3f9-441f-90ac-3da1484796c2-xtables-lock\") pod \"kube-proxy-md89l\" (UID: \"512ace0e-c3f9-441f-90ac-3da1484796c2\") " pod="kube-system/kube-proxy-md89l" Jul 15 05:13:29.749058 kubelet[2951]: I0715 05:13:29.748984 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/512ace0e-c3f9-441f-90ac-3da1484796c2-lib-modules\") pod \"kube-proxy-md89l\" (UID: \"512ace0e-c3f9-441f-90ac-3da1484796c2\") " pod="kube-system/kube-proxy-md89l" Jul 15 05:13:29.749058 kubelet[2951]: I0715 05:13:29.748997 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/512ace0e-c3f9-441f-90ac-3da1484796c2-kube-proxy\") pod \"kube-proxy-md89l\" (UID: \"512ace0e-c3f9-441f-90ac-3da1484796c2\") " pod="kube-system/kube-proxy-md89l" Jul 15 05:13:29.749058 kubelet[2951]: I0715 05:13:29.749021 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtc6s\" (UniqueName: \"kubernetes.io/projected/512ace0e-c3f9-441f-90ac-3da1484796c2-kube-api-access-qtc6s\") pod \"kube-proxy-md89l\" (UID: \"512ace0e-c3f9-441f-90ac-3da1484796c2\") " pod="kube-system/kube-proxy-md89l" Jul 15 05:13:30.014826 containerd[1622]: time="2025-07-15T05:13:30.014749328Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-md89l,Uid:512ace0e-c3f9-441f-90ac-3da1484796c2,Namespace:kube-system,Attempt:0,}" Jul 15 05:13:30.037600 systemd[1]: Created slice kubepods-besteffort-pod3d5ac6dc_10bb_43d3_8daf_cb4deb7a3809.slice - libcontainer container kubepods-besteffort-pod3d5ac6dc_10bb_43d3_8daf_cb4deb7a3809.slice. 
Jul 15 05:13:30.042173 containerd[1622]: time="2025-07-15T05:13:30.042142757Z" level=info msg="connecting to shim 58e88b262210182ed38d598a1c9b2d4b8d18b62efcc1ecd5054fb3e3592b5a5f" address="unix:///run/containerd/s/1c7a17140ea62ca9afb81cce339747911260d1dd385a33e4646d411b256c84f4" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:13:30.050945 kubelet[2951]: I0715 05:13:30.050890 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/3d5ac6dc-10bb-43d3-8daf-cb4deb7a3809-var-lib-calico\") pod \"tigera-operator-5bf8dfcb4-q7skw\" (UID: \"3d5ac6dc-10bb-43d3-8daf-cb4deb7a3809\") " pod="tigera-operator/tigera-operator-5bf8dfcb4-q7skw" Jul 15 05:13:30.050945 kubelet[2951]: I0715 05:13:30.050943 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl28n\" (UniqueName: \"kubernetes.io/projected/3d5ac6dc-10bb-43d3-8daf-cb4deb7a3809-kube-api-access-sl28n\") pod \"tigera-operator-5bf8dfcb4-q7skw\" (UID: \"3d5ac6dc-10bb-43d3-8daf-cb4deb7a3809\") " pod="tigera-operator/tigera-operator-5bf8dfcb4-q7skw" Jul 15 05:13:30.062103 systemd[1]: Started cri-containerd-58e88b262210182ed38d598a1c9b2d4b8d18b62efcc1ecd5054fb3e3592b5a5f.scope - libcontainer container 58e88b262210182ed38d598a1c9b2d4b8d18b62efcc1ecd5054fb3e3592b5a5f. 
Jul 15 05:13:30.080048 containerd[1622]: time="2025-07-15T05:13:30.080021654Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-md89l,Uid:512ace0e-c3f9-441f-90ac-3da1484796c2,Namespace:kube-system,Attempt:0,} returns sandbox id \"58e88b262210182ed38d598a1c9b2d4b8d18b62efcc1ecd5054fb3e3592b5a5f\"" Jul 15 05:13:30.082987 containerd[1622]: time="2025-07-15T05:13:30.082947543Z" level=info msg="CreateContainer within sandbox \"58e88b262210182ed38d598a1c9b2d4b8d18b62efcc1ecd5054fb3e3592b5a5f\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jul 15 05:13:30.090053 containerd[1622]: time="2025-07-15T05:13:30.090028996Z" level=info msg="Container 3435453d3a833a940ea1424eb2fdcb8dd2eef749de1c3647330a922bc73a04ce: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:13:30.099542 containerd[1622]: time="2025-07-15T05:13:30.099487404Z" level=info msg="CreateContainer within sandbox \"58e88b262210182ed38d598a1c9b2d4b8d18b62efcc1ecd5054fb3e3592b5a5f\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"3435453d3a833a940ea1424eb2fdcb8dd2eef749de1c3647330a922bc73a04ce\"" Jul 15 05:13:30.100053 containerd[1622]: time="2025-07-15T05:13:30.099987581Z" level=info msg="StartContainer for \"3435453d3a833a940ea1424eb2fdcb8dd2eef749de1c3647330a922bc73a04ce\"" Jul 15 05:13:30.101046 containerd[1622]: time="2025-07-15T05:13:30.101018708Z" level=info msg="connecting to shim 3435453d3a833a940ea1424eb2fdcb8dd2eef749de1c3647330a922bc73a04ce" address="unix:///run/containerd/s/1c7a17140ea62ca9afb81cce339747911260d1dd385a33e4646d411b256c84f4" protocol=ttrpc version=3 Jul 15 05:13:30.118075 systemd[1]: Started cri-containerd-3435453d3a833a940ea1424eb2fdcb8dd2eef749de1c3647330a922bc73a04ce.scope - libcontainer container 3435453d3a833a940ea1424eb2fdcb8dd2eef749de1c3647330a922bc73a04ce. 
Jul 15 05:13:30.144621 containerd[1622]: time="2025-07-15T05:13:30.144588027Z" level=info msg="StartContainer for \"3435453d3a833a940ea1424eb2fdcb8dd2eef749de1c3647330a922bc73a04ce\" returns successfully" Jul 15 05:13:30.343490 containerd[1622]: time="2025-07-15T05:13:30.342347944Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5bf8dfcb4-q7skw,Uid:3d5ac6dc-10bb-43d3-8daf-cb4deb7a3809,Namespace:tigera-operator,Attempt:0,}" Jul 15 05:13:30.360154 containerd[1622]: time="2025-07-15T05:13:30.360122462Z" level=info msg="connecting to shim 52619db1c8baf441c08cd91f69ebf9136774b80cf9d57dcaae83b4244ad758bf" address="unix:///run/containerd/s/3ff63571ac99baeacb1a6c45d4478c9a8e66a5f6df706868302fb49d6ccb3ae7" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:13:30.383037 systemd[1]: Started cri-containerd-52619db1c8baf441c08cd91f69ebf9136774b80cf9d57dcaae83b4244ad758bf.scope - libcontainer container 52619db1c8baf441c08cd91f69ebf9136774b80cf9d57dcaae83b4244ad758bf. Jul 15 05:13:30.423942 containerd[1622]: time="2025-07-15T05:13:30.423833783Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5bf8dfcb4-q7skw,Uid:3d5ac6dc-10bb-43d3-8daf-cb4deb7a3809,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"52619db1c8baf441c08cd91f69ebf9136774b80cf9d57dcaae83b4244ad758bf\"" Jul 15 05:13:30.425097 containerd[1622]: time="2025-07-15T05:13:30.425045359Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\"" Jul 15 05:13:30.988279 kubelet[2951]: I0715 05:13:30.988099 2951 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-md89l" podStartSLOduration=1.988087027 podStartE2EDuration="1.988087027s" podCreationTimestamp="2025-07-15 05:13:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 05:13:30.348982655 +0000 UTC m=+8.128293521" watchObservedRunningTime="2025-07-15 
05:13:30.988087027 +0000 UTC m=+8.767397884" Jul 15 05:13:31.957029 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount543519170.mount: Deactivated successfully. Jul 15 05:13:34.267682 containerd[1622]: time="2025-07-15T05:13:34.267586434Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:13:34.268116 containerd[1622]: time="2025-07-15T05:13:34.268045868Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.3: active requests=0, bytes read=25056543" Jul 15 05:13:34.269127 containerd[1622]: time="2025-07-15T05:13:34.268669508Z" level=info msg="ImageCreate event name:\"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:13:34.270272 containerd[1622]: time="2025-07-15T05:13:34.270247132Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:13:34.270509 containerd[1622]: time="2025-07-15T05:13:34.270492273Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.3\" with image id \"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\", repo tag \"quay.io/tigera/operator:v1.38.3\", repo digest \"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\", size \"25052538\" in 3.845428782s" Jul 15 05:13:34.270539 containerd[1622]: time="2025-07-15T05:13:34.270509883Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\" returns image reference \"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\"" Jul 15 05:13:34.271853 containerd[1622]: time="2025-07-15T05:13:34.271833150Z" level=info msg="CreateContainer within sandbox \"52619db1c8baf441c08cd91f69ebf9136774b80cf9d57dcaae83b4244ad758bf\" for container 
&ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jul 15 05:13:34.277351 containerd[1622]: time="2025-07-15T05:13:34.277322311Z" level=info msg="Container 896eaedbe145eb9672eedd3766f8f45f4160248a8aa2ec34ec708d08bbcd0d0e: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:13:34.279064 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3231786702.mount: Deactivated successfully. Jul 15 05:13:34.281793 containerd[1622]: time="2025-07-15T05:13:34.281773057Z" level=info msg="CreateContainer within sandbox \"52619db1c8baf441c08cd91f69ebf9136774b80cf9d57dcaae83b4244ad758bf\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"896eaedbe145eb9672eedd3766f8f45f4160248a8aa2ec34ec708d08bbcd0d0e\"" Jul 15 05:13:34.282163 containerd[1622]: time="2025-07-15T05:13:34.282117426Z" level=info msg="StartContainer for \"896eaedbe145eb9672eedd3766f8f45f4160248a8aa2ec34ec708d08bbcd0d0e\"" Jul 15 05:13:34.283369 containerd[1622]: time="2025-07-15T05:13:34.283350740Z" level=info msg="connecting to shim 896eaedbe145eb9672eedd3766f8f45f4160248a8aa2ec34ec708d08bbcd0d0e" address="unix:///run/containerd/s/3ff63571ac99baeacb1a6c45d4478c9a8e66a5f6df706868302fb49d6ccb3ae7" protocol=ttrpc version=3 Jul 15 05:13:34.302157 systemd[1]: Started cri-containerd-896eaedbe145eb9672eedd3766f8f45f4160248a8aa2ec34ec708d08bbcd0d0e.scope - libcontainer container 896eaedbe145eb9672eedd3766f8f45f4160248a8aa2ec34ec708d08bbcd0d0e. 
Jul 15 05:13:34.321979 containerd[1622]: time="2025-07-15T05:13:34.321954585Z" level=info msg="StartContainer for \"896eaedbe145eb9672eedd3766f8f45f4160248a8aa2ec34ec708d08bbcd0d0e\" returns successfully" Jul 15 05:13:34.351322 kubelet[2951]: I0715 05:13:34.351064 2951 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-5bf8dfcb4-q7skw" podStartSLOduration=1.504722799 podStartE2EDuration="5.351052734s" podCreationTimestamp="2025-07-15 05:13:29 +0000 UTC" firstStartedPulling="2025-07-15 05:13:30.424694794 +0000 UTC m=+8.204005651" lastFinishedPulling="2025-07-15 05:13:34.271024726 +0000 UTC m=+12.050335586" observedRunningTime="2025-07-15 05:13:34.350984983 +0000 UTC m=+12.130295845" watchObservedRunningTime="2025-07-15 05:13:34.351052734 +0000 UTC m=+12.130363595" Jul 15 05:13:40.017071 sudo[1976]: pam_unix(sudo:session): session closed for user root Jul 15 05:13:40.017991 sshd[1975]: Connection closed by 147.75.109.163 port 51198 Jul 15 05:13:40.018983 sshd-session[1972]: pam_unix(sshd:session): session closed for user core Jul 15 05:13:40.021748 systemd[1]: sshd@6-139.178.70.102:22-147.75.109.163:51198.service: Deactivated successfully. Jul 15 05:13:40.025519 systemd[1]: session-9.scope: Deactivated successfully. Jul 15 05:13:40.025804 systemd[1]: session-9.scope: Consumed 2.964s CPU time, 161M memory peak. Jul 15 05:13:40.029217 systemd-logind[1606]: Session 9 logged out. Waiting for processes to exit. Jul 15 05:13:40.030733 systemd-logind[1606]: Removed session 9. Jul 15 05:13:42.325247 systemd[1]: Created slice kubepods-besteffort-pode8556848_b152_4133_b1b2_152f1dec6f0a.slice - libcontainer container kubepods-besteffort-pode8556848_b152_4133_b1b2_152f1dec6f0a.slice. 
Jul 15 05:13:42.553801 kubelet[2951]: I0715 05:13:42.439596 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5qf4\" (UniqueName: \"kubernetes.io/projected/e8556848-b152-4133-b1b2-152f1dec6f0a-kube-api-access-g5qf4\") pod \"calico-typha-6b8f5dd9d5-lwvnl\" (UID: \"e8556848-b152-4133-b1b2-152f1dec6f0a\") " pod="calico-system/calico-typha-6b8f5dd9d5-lwvnl" Jul 15 05:13:42.553801 kubelet[2951]: I0715 05:13:42.439623 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/e8556848-b152-4133-b1b2-152f1dec6f0a-typha-certs\") pod \"calico-typha-6b8f5dd9d5-lwvnl\" (UID: \"e8556848-b152-4133-b1b2-152f1dec6f0a\") " pod="calico-system/calico-typha-6b8f5dd9d5-lwvnl" Jul 15 05:13:42.553801 kubelet[2951]: I0715 05:13:42.439637 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8556848-b152-4133-b1b2-152f1dec6f0a-tigera-ca-bundle\") pod \"calico-typha-6b8f5dd9d5-lwvnl\" (UID: \"e8556848-b152-4133-b1b2-152f1dec6f0a\") " pod="calico-system/calico-typha-6b8f5dd9d5-lwvnl" Jul 15 05:13:42.656885 containerd[1622]: time="2025-07-15T05:13:42.656849351Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6b8f5dd9d5-lwvnl,Uid:e8556848-b152-4133-b1b2-152f1dec6f0a,Namespace:calico-system,Attempt:0,}" Jul 15 05:13:42.677920 systemd[1]: Created slice kubepods-besteffort-pod46a67460_a7f8_42b1_baa2_f8110de2b839.slice - libcontainer container kubepods-besteffort-pod46a67460_a7f8_42b1_baa2_f8110de2b839.slice. 
Jul 15 05:13:42.839832 containerd[1622]: time="2025-07-15T05:13:42.839519377Z" level=info msg="connecting to shim b5f08039c7c96ede31d0cf39849f5b01337a6c74381a69e7a9a1691840c18ecf" address="unix:///run/containerd/s/498682c39277bf1dd684837ed2e2f9805857213e76a1b58776b012b2443243b8" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:13:42.842966 kubelet[2951]: I0715 05:13:42.842948 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/46a67460-a7f8-42b1-baa2-f8110de2b839-flexvol-driver-host\") pod \"calico-node-dpcjw\" (UID: \"46a67460-a7f8-42b1-baa2-f8110de2b839\") " pod="calico-system/calico-node-dpcjw" Jul 15 05:13:42.843472 kubelet[2951]: I0715 05:13:42.843460 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/46a67460-a7f8-42b1-baa2-f8110de2b839-lib-modules\") pod \"calico-node-dpcjw\" (UID: \"46a67460-a7f8-42b1-baa2-f8110de2b839\") " pod="calico-system/calico-node-dpcjw" Jul 15 05:13:42.843532 kubelet[2951]: I0715 05:13:42.843525 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/46a67460-a7f8-42b1-baa2-f8110de2b839-xtables-lock\") pod \"calico-node-dpcjw\" (UID: \"46a67460-a7f8-42b1-baa2-f8110de2b839\") " pod="calico-system/calico-node-dpcjw" Jul 15 05:13:42.843579 kubelet[2951]: I0715 05:13:42.843571 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/46a67460-a7f8-42b1-baa2-f8110de2b839-node-certs\") pod \"calico-node-dpcjw\" (UID: \"46a67460-a7f8-42b1-baa2-f8110de2b839\") " pod="calico-system/calico-node-dpcjw" Jul 15 05:13:42.843631 kubelet[2951]: I0715 05:13:42.843624 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/46a67460-a7f8-42b1-baa2-f8110de2b839-cni-bin-dir\") pod \"calico-node-dpcjw\" (UID: \"46a67460-a7f8-42b1-baa2-f8110de2b839\") " pod="calico-system/calico-node-dpcjw" Jul 15 05:13:42.843674 kubelet[2951]: I0715 05:13:42.843668 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/46a67460-a7f8-42b1-baa2-f8110de2b839-cni-net-dir\") pod \"calico-node-dpcjw\" (UID: \"46a67460-a7f8-42b1-baa2-f8110de2b839\") " pod="calico-system/calico-node-dpcjw" Jul 15 05:13:42.843732 kubelet[2951]: I0715 05:13:42.843725 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/46a67460-a7f8-42b1-baa2-f8110de2b839-var-run-calico\") pod \"calico-node-dpcjw\" (UID: \"46a67460-a7f8-42b1-baa2-f8110de2b839\") " pod="calico-system/calico-node-dpcjw" Jul 15 05:13:42.843775 kubelet[2951]: I0715 05:13:42.843769 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46a67460-a7f8-42b1-baa2-f8110de2b839-tigera-ca-bundle\") pod \"calico-node-dpcjw\" (UID: \"46a67460-a7f8-42b1-baa2-f8110de2b839\") " pod="calico-system/calico-node-dpcjw" Jul 15 05:13:42.843836 kubelet[2951]: I0715 05:13:42.843823 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/46a67460-a7f8-42b1-baa2-f8110de2b839-var-lib-calico\") pod \"calico-node-dpcjw\" (UID: \"46a67460-a7f8-42b1-baa2-f8110de2b839\") " pod="calico-system/calico-node-dpcjw" Jul 15 05:13:42.843927 kubelet[2951]: I0715 05:13:42.843870 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxg8g\" (UniqueName: 
\"kubernetes.io/projected/46a67460-a7f8-42b1-baa2-f8110de2b839-kube-api-access-nxg8g\") pod \"calico-node-dpcjw\" (UID: \"46a67460-a7f8-42b1-baa2-f8110de2b839\") " pod="calico-system/calico-node-dpcjw" Jul 15 05:13:42.843991 kubelet[2951]: I0715 05:13:42.843982 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/46a67460-a7f8-42b1-baa2-f8110de2b839-policysync\") pod \"calico-node-dpcjw\" (UID: \"46a67460-a7f8-42b1-baa2-f8110de2b839\") " pod="calico-system/calico-node-dpcjw" Jul 15 05:13:42.844084 kubelet[2951]: I0715 05:13:42.844067 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/46a67460-a7f8-42b1-baa2-f8110de2b839-cni-log-dir\") pod \"calico-node-dpcjw\" (UID: \"46a67460-a7f8-42b1-baa2-f8110de2b839\") " pod="calico-system/calico-node-dpcjw" Jul 15 05:13:42.873356 systemd[1]: Started cri-containerd-b5f08039c7c96ede31d0cf39849f5b01337a6c74381a69e7a9a1691840c18ecf.scope - libcontainer container b5f08039c7c96ede31d0cf39849f5b01337a6c74381a69e7a9a1691840c18ecf. 
Jul 15 05:13:42.927798 kubelet[2951]: E0715 05:13:42.927324 2951 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5cx6x" podUID="42f78827-b3e3-4b25-9464-bcf35efc21f2" Jul 15 05:13:42.942244 containerd[1622]: time="2025-07-15T05:13:42.942213200Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6b8f5dd9d5-lwvnl,Uid:e8556848-b152-4133-b1b2-152f1dec6f0a,Namespace:calico-system,Attempt:0,} returns sandbox id \"b5f08039c7c96ede31d0cf39849f5b01337a6c74381a69e7a9a1691840c18ecf\"" Jul 15 05:13:42.947638 containerd[1622]: time="2025-07-15T05:13:42.946683626Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\"" Jul 15 05:13:42.953197 kubelet[2951]: E0715 05:13:42.953173 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:42.953197 kubelet[2951]: W0715 05:13:42.953191 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:42.953920 kubelet[2951]: E0715 05:13:42.953682 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:42.954003 kubelet[2951]: E0715 05:13:42.953994 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:42.954050 kubelet[2951]: W0715 05:13:42.954042 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:42.955173 kubelet[2951]: E0715 05:13:42.954968 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:42.955269 kubelet[2951]: E0715 05:13:42.955261 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:42.955307 kubelet[2951]: W0715 05:13:42.955300 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:42.955405 kubelet[2951]: E0715 05:13:42.955336 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:42.955493 kubelet[2951]: E0715 05:13:42.955487 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:42.955536 kubelet[2951]: W0715 05:13:42.955529 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:42.955627 kubelet[2951]: E0715 05:13:42.955579 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:42.955696 kubelet[2951]: E0715 05:13:42.955690 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:42.955766 kubelet[2951]: W0715 05:13:42.955724 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:42.955766 kubelet[2951]: E0715 05:13:42.955731 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:42.955903 kubelet[2951]: E0715 05:13:42.955857 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:42.955903 kubelet[2951]: W0715 05:13:42.955864 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:42.955903 kubelet[2951]: E0715 05:13:42.955872 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:42.956140 kubelet[2951]: E0715 05:13:42.956084 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:42.956140 kubelet[2951]: W0715 05:13:42.956092 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:42.956140 kubelet[2951]: E0715 05:13:42.956101 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:42.957659 kubelet[2951]: E0715 05:13:42.957565 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:42.957659 kubelet[2951]: W0715 05:13:42.957579 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:42.957659 kubelet[2951]: E0715 05:13:42.957593 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:42.958764 kubelet[2951]: E0715 05:13:42.957963 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:42.958764 kubelet[2951]: W0715 05:13:42.957972 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:42.958764 kubelet[2951]: E0715 05:13:42.957979 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:42.959029 kubelet[2951]: E0715 05:13:42.958938 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:42.959029 kubelet[2951]: W0715 05:13:42.958947 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:42.959029 kubelet[2951]: E0715 05:13:42.958956 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:42.959183 kubelet[2951]: E0715 05:13:42.959129 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:42.959183 kubelet[2951]: W0715 05:13:42.959137 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:42.959183 kubelet[2951]: E0715 05:13:42.959144 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:42.959334 kubelet[2951]: E0715 05:13:42.959274 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:42.959334 kubelet[2951]: W0715 05:13:42.959283 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:42.959334 kubelet[2951]: E0715 05:13:42.959294 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:42.959452 kubelet[2951]: E0715 05:13:42.959445 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:42.959488 kubelet[2951]: W0715 05:13:42.959480 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:42.959528 kubelet[2951]: E0715 05:13:42.959522 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:42.960366 kubelet[2951]: E0715 05:13:42.960358 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:42.960433 kubelet[2951]: W0715 05:13:42.960426 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:42.960475 kubelet[2951]: E0715 05:13:42.960467 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:42.960660 kubelet[2951]: E0715 05:13:42.960652 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:42.960768 kubelet[2951]: W0715 05:13:42.960760 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:42.960826 kubelet[2951]: E0715 05:13:42.960812 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:42.961014 kubelet[2951]: E0715 05:13:42.961005 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:42.961272 kubelet[2951]: W0715 05:13:42.961230 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:42.961272 kubelet[2951]: E0715 05:13:42.961246 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:42.961437 kubelet[2951]: E0715 05:13:42.961427 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:42.961437 kubelet[2951]: W0715 05:13:42.961435 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:42.961485 kubelet[2951]: E0715 05:13:42.961442 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:42.961640 kubelet[2951]: E0715 05:13:42.961630 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:42.961640 kubelet[2951]: W0715 05:13:42.961637 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:42.961699 kubelet[2951]: E0715 05:13:42.961643 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:42.962653 kubelet[2951]: E0715 05:13:42.962028 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:42.962653 kubelet[2951]: W0715 05:13:42.962036 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:42.962653 kubelet[2951]: E0715 05:13:42.962042 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:42.962653 kubelet[2951]: E0715 05:13:42.962275 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:42.962653 kubelet[2951]: W0715 05:13:42.962283 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:42.962653 kubelet[2951]: E0715 05:13:42.962292 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:42.962653 kubelet[2951]: E0715 05:13:42.962407 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:42.962653 kubelet[2951]: W0715 05:13:42.962412 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:42.962653 kubelet[2951]: E0715 05:13:42.962418 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:42.962653 kubelet[2951]: E0715 05:13:42.962495 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:42.963161 kubelet[2951]: W0715 05:13:42.962500 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:42.963161 kubelet[2951]: E0715 05:13:42.962505 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:42.963161 kubelet[2951]: E0715 05:13:42.962578 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:42.963161 kubelet[2951]: W0715 05:13:42.962583 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:42.963161 kubelet[2951]: E0715 05:13:42.962588 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:42.981773 containerd[1622]: time="2025-07-15T05:13:42.981735266Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-dpcjw,Uid:46a67460-a7f8-42b1-baa2-f8110de2b839,Namespace:calico-system,Attempt:0,}" Jul 15 05:13:43.005376 containerd[1622]: time="2025-07-15T05:13:43.005333286Z" level=info msg="connecting to shim 86defbb09a8864c797e9968e8d473e4ec35eed3a53e51b60dc86ac1e921408e0" address="unix:///run/containerd/s/287f262e2d491e35f5e22f03e573d09adadf2521689a01692659c19a28d97c8b" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:13:43.037129 systemd[1]: Started cri-containerd-86defbb09a8864c797e9968e8d473e4ec35eed3a53e51b60dc86ac1e921408e0.scope - libcontainer container 86defbb09a8864c797e9968e8d473e4ec35eed3a53e51b60dc86ac1e921408e0. Jul 15 05:13:43.045702 kubelet[2951]: E0715 05:13:43.045653 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.045702 kubelet[2951]: W0715 05:13:43.045667 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.045702 kubelet[2951]: E0715 05:13:43.045682 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:43.045882 kubelet[2951]: I0715 05:13:43.045827 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/42f78827-b3e3-4b25-9464-bcf35efc21f2-kubelet-dir\") pod \"csi-node-driver-5cx6x\" (UID: \"42f78827-b3e3-4b25-9464-bcf35efc21f2\") " pod="calico-system/csi-node-driver-5cx6x" Jul 15 05:13:43.045983 kubelet[2951]: E0715 05:13:43.045960 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.045983 kubelet[2951]: W0715 05:13:43.045966 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.046066 kubelet[2951]: E0715 05:13:43.046031 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:43.046066 kubelet[2951]: I0715 05:13:43.046044 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/42f78827-b3e3-4b25-9464-bcf35efc21f2-registration-dir\") pod \"csi-node-driver-5cx6x\" (UID: \"42f78827-b3e3-4b25-9464-bcf35efc21f2\") " pod="calico-system/csi-node-driver-5cx6x" Jul 15 05:13:43.046388 kubelet[2951]: E0715 05:13:43.046327 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.046388 kubelet[2951]: W0715 05:13:43.046335 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.046388 kubelet[2951]: E0715 05:13:43.046348 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:43.046388 kubelet[2951]: I0715 05:13:43.046360 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/42f78827-b3e3-4b25-9464-bcf35efc21f2-socket-dir\") pod \"csi-node-driver-5cx6x\" (UID: \"42f78827-b3e3-4b25-9464-bcf35efc21f2\") " pod="calico-system/csi-node-driver-5cx6x" Jul 15 05:13:43.046596 kubelet[2951]: E0715 05:13:43.046590 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.046678 kubelet[2951]: W0715 05:13:43.046632 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.046678 kubelet[2951]: E0715 05:13:43.046643 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:43.046678 kubelet[2951]: I0715 05:13:43.046653 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p24g\" (UniqueName: \"kubernetes.io/projected/42f78827-b3e3-4b25-9464-bcf35efc21f2-kube-api-access-2p24g\") pod \"csi-node-driver-5cx6x\" (UID: \"42f78827-b3e3-4b25-9464-bcf35efc21f2\") " pod="calico-system/csi-node-driver-5cx6x" Jul 15 05:13:43.047054 kubelet[2951]: E0715 05:13:43.046947 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.047054 kubelet[2951]: W0715 05:13:43.046955 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.047054 kubelet[2951]: E0715 05:13:43.046970 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:43.047054 kubelet[2951]: I0715 05:13:43.046982 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/42f78827-b3e3-4b25-9464-bcf35efc21f2-varrun\") pod \"csi-node-driver-5cx6x\" (UID: \"42f78827-b3e3-4b25-9464-bcf35efc21f2\") " pod="calico-system/csi-node-driver-5cx6x" Jul 15 05:13:43.047222 kubelet[2951]: E0715 05:13:43.047162 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.047222 kubelet[2951]: W0715 05:13:43.047169 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.047307 kubelet[2951]: E0715 05:13:43.047298 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:43.047375 kubelet[2951]: E0715 05:13:43.047361 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.047422 kubelet[2951]: W0715 05:13:43.047408 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.047570 kubelet[2951]: E0715 05:13:43.047523 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:43.048183 kubelet[2951]: E0715 05:13:43.048103 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.048183 kubelet[2951]: W0715 05:13:43.048112 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.048183 kubelet[2951]: E0715 05:13:43.048169 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:43.048358 kubelet[2951]: E0715 05:13:43.048302 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.048358 kubelet[2951]: W0715 05:13:43.048309 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.048459 kubelet[2951]: E0715 05:13:43.048425 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:43.048459 kubelet[2951]: E0715 05:13:43.048448 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.048459 kubelet[2951]: W0715 05:13:43.048452 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.048664 kubelet[2951]: E0715 05:13:43.048653 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:43.048771 kubelet[2951]: E0715 05:13:43.048707 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.048771 kubelet[2951]: W0715 05:13:43.048713 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.048771 kubelet[2951]: E0715 05:13:43.048719 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:43.049066 kubelet[2951]: E0715 05:13:43.048880 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.049066 kubelet[2951]: W0715 05:13:43.048886 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.049066 kubelet[2951]: E0715 05:13:43.048892 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:43.049307 kubelet[2951]: E0715 05:13:43.049280 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.049307 kubelet[2951]: W0715 05:13:43.049289 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.049307 kubelet[2951]: E0715 05:13:43.049297 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:43.049874 kubelet[2951]: E0715 05:13:43.049851 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.049874 kubelet[2951]: W0715 05:13:43.049858 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.049874 kubelet[2951]: E0715 05:13:43.049865 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:43.050168 kubelet[2951]: E0715 05:13:43.050138 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.050168 kubelet[2951]: W0715 05:13:43.050146 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.050168 kubelet[2951]: E0715 05:13:43.050154 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:43.059807 containerd[1622]: time="2025-07-15T05:13:43.059685689Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-dpcjw,Uid:46a67460-a7f8-42b1-baa2-f8110de2b839,Namespace:calico-system,Attempt:0,} returns sandbox id \"86defbb09a8864c797e9968e8d473e4ec35eed3a53e51b60dc86ac1e921408e0\"" Jul 15 05:13:43.147855 kubelet[2951]: E0715 05:13:43.147799 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.147855 kubelet[2951]: W0715 05:13:43.147814 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.147855 kubelet[2951]: E0715 05:13:43.147827 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:43.148081 kubelet[2951]: E0715 05:13:43.147931 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.148081 kubelet[2951]: W0715 05:13:43.147937 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.148081 kubelet[2951]: E0715 05:13:43.147944 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:43.148249 kubelet[2951]: E0715 05:13:43.148217 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.148249 kubelet[2951]: W0715 05:13:43.148225 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.148249 kubelet[2951]: E0715 05:13:43.148237 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:43.148369 kubelet[2951]: E0715 05:13:43.148303 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.148369 kubelet[2951]: W0715 05:13:43.148308 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.148369 kubelet[2951]: E0715 05:13:43.148316 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:43.148514 kubelet[2951]: E0715 05:13:43.148478 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.148514 kubelet[2951]: W0715 05:13:43.148484 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.148514 kubelet[2951]: E0715 05:13:43.148495 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:43.148659 kubelet[2951]: E0715 05:13:43.148654 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.148708 kubelet[2951]: W0715 05:13:43.148687 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.148739 kubelet[2951]: E0715 05:13:43.148734 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:43.148849 kubelet[2951]: E0715 05:13:43.148844 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.148884 kubelet[2951]: W0715 05:13:43.148878 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.148955 kubelet[2951]: E0715 05:13:43.148918 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:43.149123 kubelet[2951]: E0715 05:13:43.149066 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.149123 kubelet[2951]: W0715 05:13:43.149072 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.149123 kubelet[2951]: E0715 05:13:43.149081 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:43.149274 kubelet[2951]: E0715 05:13:43.149268 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.149314 kubelet[2951]: W0715 05:13:43.149308 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.149363 kubelet[2951]: E0715 05:13:43.149357 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:43.149455 kubelet[2951]: E0715 05:13:43.149445 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.149455 kubelet[2951]: W0715 05:13:43.149454 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.149517 kubelet[2951]: E0715 05:13:43.149464 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:43.149602 kubelet[2951]: E0715 05:13:43.149554 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.149602 kubelet[2951]: W0715 05:13:43.149563 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.149602 kubelet[2951]: E0715 05:13:43.149578 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:43.149694 kubelet[2951]: E0715 05:13:43.149631 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.149694 kubelet[2951]: W0715 05:13:43.149636 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.149694 kubelet[2951]: E0715 05:13:43.149655 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:43.149813 kubelet[2951]: E0715 05:13:43.149697 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.149813 kubelet[2951]: W0715 05:13:43.149701 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.149813 kubelet[2951]: E0715 05:13:43.149710 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:43.149952 kubelet[2951]: E0715 05:13:43.149890 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.149952 kubelet[2951]: W0715 05:13:43.149920 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.149952 kubelet[2951]: E0715 05:13:43.149932 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:43.150069 kubelet[2951]: E0715 05:13:43.150004 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.150069 kubelet[2951]: W0715 05:13:43.150009 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.150069 kubelet[2951]: E0715 05:13:43.150017 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:43.150179 kubelet[2951]: E0715 05:13:43.150080 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.150179 kubelet[2951]: W0715 05:13:43.150084 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.150179 kubelet[2951]: E0715 05:13:43.150091 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:43.150179 kubelet[2951]: E0715 05:13:43.150160 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.150179 kubelet[2951]: W0715 05:13:43.150164 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.150179 kubelet[2951]: E0715 05:13:43.150172 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:43.150339 kubelet[2951]: E0715 05:13:43.150255 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.150339 kubelet[2951]: W0715 05:13:43.150259 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.150339 kubelet[2951]: E0715 05:13:43.150266 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:43.150339 kubelet[2951]: E0715 05:13:43.150335 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.150339 kubelet[2951]: W0715 05:13:43.150340 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.150494 kubelet[2951]: E0715 05:13:43.150347 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:43.150494 kubelet[2951]: E0715 05:13:43.150408 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.150494 kubelet[2951]: W0715 05:13:43.150412 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.150494 kubelet[2951]: E0715 05:13:43.150419 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:43.150693 kubelet[2951]: E0715 05:13:43.150637 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.150693 kubelet[2951]: W0715 05:13:43.150644 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.150693 kubelet[2951]: E0715 05:13:43.150651 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:43.150809 kubelet[2951]: E0715 05:13:43.150780 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.150809 kubelet[2951]: W0715 05:13:43.150786 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.150809 kubelet[2951]: E0715 05:13:43.150791 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:43.150987 kubelet[2951]: E0715 05:13:43.150951 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.150987 kubelet[2951]: W0715 05:13:43.150957 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.150987 kubelet[2951]: E0715 05:13:43.150962 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:43.151205 kubelet[2951]: E0715 05:13:43.151143 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.151205 kubelet[2951]: W0715 05:13:43.151149 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.151205 kubelet[2951]: E0715 05:13:43.151154 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:43.156407 kubelet[2951]: E0715 05:13:43.156388 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.156407 kubelet[2951]: W0715 05:13:43.156402 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.156505 kubelet[2951]: E0715 05:13:43.156415 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:43.156535 kubelet[2951]: E0715 05:13:43.156524 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:43.156535 kubelet[2951]: W0715 05:13:43.156531 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:43.156574 kubelet[2951]: E0715 05:13:43.156536 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:44.253606 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2938861505.mount: Deactivated successfully. 
Jul 15 05:13:44.866727 containerd[1622]: time="2025-07-15T05:13:44.866652791Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:13:44.867050 containerd[1622]: time="2025-07-15T05:13:44.867029268Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.2: active requests=0, bytes read=35233364" Jul 15 05:13:44.867480 containerd[1622]: time="2025-07-15T05:13:44.867466412Z" level=info msg="ImageCreate event name:\"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:13:44.868550 containerd[1622]: time="2025-07-15T05:13:44.868536960Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:13:44.869005 containerd[1622]: time="2025-07-15T05:13:44.868987251Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.2\" with image id \"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\", size \"35233218\" in 1.922269088s" Jul 15 05:13:44.869047 containerd[1622]: time="2025-07-15T05:13:44.869005680Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\" returns image reference \"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\"" Jul 15 05:13:44.869830 containerd[1622]: time="2025-07-15T05:13:44.869817391Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\"" Jul 15 05:13:44.879104 containerd[1622]: time="2025-07-15T05:13:44.878233362Z" level=info msg="CreateContainer within sandbox \"b5f08039c7c96ede31d0cf39849f5b01337a6c74381a69e7a9a1691840c18ecf\" for 
container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jul 15 05:13:44.892028 containerd[1622]: time="2025-07-15T05:13:44.891999130Z" level=info msg="Container e5816bfe24617d9021a19c4a4742e1a69a14fe95fea1c43740f699e81578a3f6: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:13:44.893622 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3878015049.mount: Deactivated successfully. Jul 15 05:13:44.897440 containerd[1622]: time="2025-07-15T05:13:44.897364744Z" level=info msg="CreateContainer within sandbox \"b5f08039c7c96ede31d0cf39849f5b01337a6c74381a69e7a9a1691840c18ecf\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"e5816bfe24617d9021a19c4a4742e1a69a14fe95fea1c43740f699e81578a3f6\"" Jul 15 05:13:44.897943 containerd[1622]: time="2025-07-15T05:13:44.897925960Z" level=info msg="StartContainer for \"e5816bfe24617d9021a19c4a4742e1a69a14fe95fea1c43740f699e81578a3f6\"" Jul 15 05:13:44.899190 containerd[1622]: time="2025-07-15T05:13:44.898587901Z" level=info msg="connecting to shim e5816bfe24617d9021a19c4a4742e1a69a14fe95fea1c43740f699e81578a3f6" address="unix:///run/containerd/s/498682c39277bf1dd684837ed2e2f9805857213e76a1b58776b012b2443243b8" protocol=ttrpc version=3 Jul 15 05:13:44.921032 systemd[1]: Started cri-containerd-e5816bfe24617d9021a19c4a4742e1a69a14fe95fea1c43740f699e81578a3f6.scope - libcontainer container e5816bfe24617d9021a19c4a4742e1a69a14fe95fea1c43740f699e81578a3f6. 
Jul 15 05:13:44.975658 containerd[1622]: time="2025-07-15T05:13:44.975539005Z" level=info msg="StartContainer for \"e5816bfe24617d9021a19c4a4742e1a69a14fe95fea1c43740f699e81578a3f6\" returns successfully" Jul 15 05:13:45.336920 kubelet[2951]: E0715 05:13:45.336865 2951 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5cx6x" podUID="42f78827-b3e3-4b25-9464-bcf35efc21f2" Jul 15 05:13:45.504385 kubelet[2951]: E0715 05:13:45.504347 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:45.504385 kubelet[2951]: W0715 05:13:45.504381 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:45.504513 kubelet[2951]: E0715 05:13:45.504413 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:45.504604 kubelet[2951]: E0715 05:13:45.504591 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:45.504604 kubelet[2951]: W0715 05:13:45.504600 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:45.504653 kubelet[2951]: E0715 05:13:45.504605 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:45.504689 kubelet[2951]: E0715 05:13:45.504681 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:45.504689 kubelet[2951]: W0715 05:13:45.504686 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:45.504736 kubelet[2951]: E0715 05:13:45.504691 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:45.504766 kubelet[2951]: E0715 05:13:45.504760 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:45.504766 kubelet[2951]: W0715 05:13:45.504765 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:45.504808 kubelet[2951]: E0715 05:13:45.504770 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:45.504848 kubelet[2951]: E0715 05:13:45.504843 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:45.504848 kubelet[2951]: W0715 05:13:45.504847 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:45.504889 kubelet[2951]: E0715 05:13:45.504852 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:45.504935 kubelet[2951]: E0715 05:13:45.504925 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:45.504935 kubelet[2951]: W0715 05:13:45.504933 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:45.504987 kubelet[2951]: E0715 05:13:45.504938 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:45.505014 kubelet[2951]: E0715 05:13:45.505002 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:45.505014 kubelet[2951]: W0715 05:13:45.505006 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:45.505014 kubelet[2951]: E0715 05:13:45.505011 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:45.505082 kubelet[2951]: E0715 05:13:45.505076 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:45.505082 kubelet[2951]: W0715 05:13:45.505080 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:45.505133 kubelet[2951]: E0715 05:13:45.505085 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:45.505166 kubelet[2951]: E0715 05:13:45.505162 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:45.505190 kubelet[2951]: W0715 05:13:45.505166 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:45.505190 kubelet[2951]: E0715 05:13:45.505171 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:45.505241 kubelet[2951]: E0715 05:13:45.505234 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:45.505241 kubelet[2951]: W0715 05:13:45.505239 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:45.505283 kubelet[2951]: E0715 05:13:45.505244 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:45.505313 kubelet[2951]: E0715 05:13:45.505304 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:45.505313 kubelet[2951]: W0715 05:13:45.505310 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:45.505364 kubelet[2951]: E0715 05:13:45.505315 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:45.505389 kubelet[2951]: E0715 05:13:45.505380 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:45.505389 kubelet[2951]: W0715 05:13:45.505384 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:45.505433 kubelet[2951]: E0715 05:13:45.505388 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:45.505460 kubelet[2951]: E0715 05:13:45.505457 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:45.516310 kubelet[2951]: W0715 05:13:45.505461 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:45.516310 kubelet[2951]: E0715 05:13:45.505466 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:45.516310 kubelet[2951]: E0715 05:13:45.505530 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:45.516310 kubelet[2951]: W0715 05:13:45.505535 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:45.516310 kubelet[2951]: E0715 05:13:45.505540 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:45.516310 kubelet[2951]: E0715 05:13:45.505607 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:45.516310 kubelet[2951]: W0715 05:13:45.505612 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:45.516310 kubelet[2951]: E0715 05:13:45.505616 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:45.527431 kubelet[2951]: I0715 05:13:45.527073 2951 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6b8f5dd9d5-lwvnl" podStartSLOduration=1.600801366 podStartE2EDuration="3.527061573s" podCreationTimestamp="2025-07-15 05:13:42 +0000 UTC" firstStartedPulling="2025-07-15 05:13:42.943280873 +0000 UTC m=+20.722591730" lastFinishedPulling="2025-07-15 05:13:44.869541078 +0000 UTC m=+22.648851937" observedRunningTime="2025-07-15 05:13:45.526650508 +0000 UTC m=+23.305961374" watchObservedRunningTime="2025-07-15 05:13:45.527061573 +0000 UTC m=+23.306372435" Jul 15 05:13:45.600405 kubelet[2951]: E0715 05:13:45.599630 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:45.600405 kubelet[2951]: W0715 05:13:45.599651 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:45.600405 kubelet[2951]: E0715 05:13:45.599668 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:45.600405 kubelet[2951]: E0715 05:13:45.599821 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:45.600405 kubelet[2951]: W0715 05:13:45.599828 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:45.600405 kubelet[2951]: E0715 05:13:45.599836 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:45.600636 kubelet[2951]: E0715 05:13:45.600498 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:45.600636 kubelet[2951]: W0715 05:13:45.600503 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:45.600636 kubelet[2951]: E0715 05:13:45.600512 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:45.600636 kubelet[2951]: E0715 05:13:45.600617 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:45.600636 kubelet[2951]: W0715 05:13:45.600621 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:45.600636 kubelet[2951]: E0715 05:13:45.600626 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:45.600742 kubelet[2951]: E0715 05:13:45.600692 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:45.600742 kubelet[2951]: W0715 05:13:45.600697 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:45.600742 kubelet[2951]: E0715 05:13:45.600701 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:45.600800 kubelet[2951]: E0715 05:13:45.600760 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:45.600800 kubelet[2951]: W0715 05:13:45.600764 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:45.600800 kubelet[2951]: E0715 05:13:45.600768 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:45.605492 kubelet[2951]: E0715 05:13:45.600851 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:45.605492 kubelet[2951]: W0715 05:13:45.600855 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:45.605492 kubelet[2951]: E0715 05:13:45.600860 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:45.605492 kubelet[2951]: E0715 05:13:45.601020 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:45.605492 kubelet[2951]: W0715 05:13:45.601034 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:45.605492 kubelet[2951]: E0715 05:13:45.601042 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:45.605492 kubelet[2951]: E0715 05:13:45.601123 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:45.605492 kubelet[2951]: W0715 05:13:45.601128 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:45.605492 kubelet[2951]: E0715 05:13:45.601136 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:45.605492 kubelet[2951]: E0715 05:13:45.601204 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:45.605663 kubelet[2951]: W0715 05:13:45.601208 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:45.605663 kubelet[2951]: E0715 05:13:45.601213 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:45.605663 kubelet[2951]: E0715 05:13:45.601314 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:45.605663 kubelet[2951]: W0715 05:13:45.601319 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:45.605663 kubelet[2951]: E0715 05:13:45.601327 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:45.605663 kubelet[2951]: E0715 05:13:45.601401 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:45.605663 kubelet[2951]: W0715 05:13:45.601406 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:45.605663 kubelet[2951]: E0715 05:13:45.601413 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:45.605663 kubelet[2951]: E0715 05:13:45.601490 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:45.605663 kubelet[2951]: W0715 05:13:45.601495 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:45.605844 kubelet[2951]: E0715 05:13:45.601509 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:45.605844 kubelet[2951]: E0715 05:13:45.601585 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:45.605844 kubelet[2951]: W0715 05:13:45.601590 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:45.605844 kubelet[2951]: E0715 05:13:45.601595 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:45.605844 kubelet[2951]: E0715 05:13:45.601681 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:45.605844 kubelet[2951]: W0715 05:13:45.601686 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:45.605844 kubelet[2951]: E0715 05:13:45.601691 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:45.605844 kubelet[2951]: E0715 05:13:45.601772 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:45.605844 kubelet[2951]: W0715 05:13:45.601776 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:45.605844 kubelet[2951]: E0715 05:13:45.601782 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:45.606138 kubelet[2951]: E0715 05:13:45.601880 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:45.606138 kubelet[2951]: W0715 05:13:45.601885 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:45.606138 kubelet[2951]: E0715 05:13:45.601890 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:45.606138 kubelet[2951]: E0715 05:13:45.602068 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:45.606138 kubelet[2951]: W0715 05:13:45.602073 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:45.606138 kubelet[2951]: E0715 05:13:45.602078 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:46.496251 kubelet[2951]: I0715 05:13:46.496189 2951 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 05:13:46.512470 kubelet[2951]: E0715 05:13:46.512444 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:46.512470 kubelet[2951]: W0715 05:13:46.512462 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:46.512690 kubelet[2951]: E0715 05:13:46.512479 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:46.512982 kubelet[2951]: E0715 05:13:46.512949 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:46.512982 kubelet[2951]: W0715 05:13:46.512961 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:46.512982 kubelet[2951]: E0715 05:13:46.512970 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:46.513864 kubelet[2951]: E0715 05:13:46.513837 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:46.513864 kubelet[2951]: W0715 05:13:46.513849 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:46.513864 kubelet[2951]: E0715 05:13:46.513861 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:46.513990 kubelet[2951]: E0715 05:13:46.513982 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:46.513990 kubelet[2951]: W0715 05:13:46.513987 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:46.514047 kubelet[2951]: E0715 05:13:46.513992 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:46.514116 kubelet[2951]: E0715 05:13:46.514073 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:46.514116 kubelet[2951]: W0715 05:13:46.514078 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:46.514116 kubelet[2951]: E0715 05:13:46.514083 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:46.514198 kubelet[2951]: E0715 05:13:46.514162 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:46.514198 kubelet[2951]: W0715 05:13:46.514167 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:46.514198 kubelet[2951]: E0715 05:13:46.514174 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:46.514276 kubelet[2951]: E0715 05:13:46.514268 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:46.514276 kubelet[2951]: W0715 05:13:46.514275 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:46.514331 kubelet[2951]: E0715 05:13:46.514282 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:46.515187 kubelet[2951]: E0715 05:13:46.514369 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:46.515187 kubelet[2951]: W0715 05:13:46.514376 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:46.515187 kubelet[2951]: E0715 05:13:46.514381 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:46.515187 kubelet[2951]: E0715 05:13:46.514474 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:46.515187 kubelet[2951]: W0715 05:13:46.514479 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:46.515187 kubelet[2951]: E0715 05:13:46.514483 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:46.515187 kubelet[2951]: E0715 05:13:46.514569 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:46.515187 kubelet[2951]: W0715 05:13:46.514576 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:46.515187 kubelet[2951]: E0715 05:13:46.514581 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:46.515187 kubelet[2951]: E0715 05:13:46.514706 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:46.518302 kubelet[2951]: W0715 05:13:46.514712 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:46.518302 kubelet[2951]: E0715 05:13:46.514718 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:46.518302 kubelet[2951]: E0715 05:13:46.514830 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:46.518302 kubelet[2951]: W0715 05:13:46.514834 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:46.518302 kubelet[2951]: E0715 05:13:46.514839 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:46.518302 kubelet[2951]: E0715 05:13:46.514937 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:46.518302 kubelet[2951]: W0715 05:13:46.514942 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:46.518302 kubelet[2951]: E0715 05:13:46.514948 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:46.518302 kubelet[2951]: E0715 05:13:46.515030 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:46.518302 kubelet[2951]: W0715 05:13:46.515038 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:46.518968 kubelet[2951]: E0715 05:13:46.515045 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:46.518968 kubelet[2951]: E0715 05:13:46.515123 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:46.518968 kubelet[2951]: W0715 05:13:46.515129 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:46.518968 kubelet[2951]: E0715 05:13:46.515135 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:46.585283 containerd[1622]: time="2025-07-15T05:13:46.585209644Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:13:46.585713 containerd[1622]: time="2025-07-15T05:13:46.585698120Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2: active requests=0, bytes read=4446956" Jul 15 05:13:46.586071 containerd[1622]: time="2025-07-15T05:13:46.585978360Z" level=info msg="ImageCreate event name:\"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:13:46.587214 containerd[1622]: time="2025-07-15T05:13:46.587183414Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:13:46.587772 containerd[1622]: time="2025-07-15T05:13:46.587565284Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" with image id \"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\", size \"5939619\" in 1.717681042s" Jul 15 05:13:46.587772 containerd[1622]: time="2025-07-15T05:13:46.587590300Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" returns image reference \"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\"" Jul 15 05:13:46.589457 containerd[1622]: time="2025-07-15T05:13:46.589435355Z" level=info msg="CreateContainer within sandbox \"86defbb09a8864c797e9968e8d473e4ec35eed3a53e51b60dc86ac1e921408e0\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jul 15 05:13:46.604554 containerd[1622]: time="2025-07-15T05:13:46.603773703Z" level=info msg="Container 6fcf89e311816fb62cbcb1ddc64343c802fd8c8adcee60695438d1616875f198: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:13:46.606703 kubelet[2951]: E0715 05:13:46.606688 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:46.606754 kubelet[2951]: W0715 05:13:46.606702 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:46.606754 kubelet[2951]: E0715 05:13:46.606715 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:46.616380 kubelet[2951]: E0715 05:13:46.616359 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:46.616380 kubelet[2951]: W0715 05:13:46.616375 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:46.616576 kubelet[2951]: E0715 05:13:46.616393 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:46.616576 kubelet[2951]: E0715 05:13:46.616484 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:46.616576 kubelet[2951]: W0715 05:13:46.616489 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:46.616576 kubelet[2951]: E0715 05:13:46.616494 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:46.616751 kubelet[2951]: E0715 05:13:46.616676 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:46.616751 kubelet[2951]: W0715 05:13:46.616684 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:46.616751 kubelet[2951]: E0715 05:13:46.616699 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:46.616875 kubelet[2951]: E0715 05:13:46.616825 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:46.616875 kubelet[2951]: W0715 05:13:46.616832 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:46.616875 kubelet[2951]: E0715 05:13:46.616837 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:46.617014 kubelet[2951]: E0715 05:13:46.616983 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:46.617014 kubelet[2951]: W0715 05:13:46.616989 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:46.617014 kubelet[2951]: E0715 05:13:46.616999 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:46.617085 kubelet[2951]: E0715 05:13:46.617076 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:46.617085 kubelet[2951]: W0715 05:13:46.617081 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:46.617158 kubelet[2951]: E0715 05:13:46.617089 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:46.617196 kubelet[2951]: E0715 05:13:46.617158 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:46.617196 kubelet[2951]: W0715 05:13:46.617163 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:46.617196 kubelet[2951]: E0715 05:13:46.617170 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:46.617272 kubelet[2951]: E0715 05:13:46.617231 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:46.617272 kubelet[2951]: W0715 05:13:46.617235 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:46.617272 kubelet[2951]: E0715 05:13:46.617242 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:46.617355 kubelet[2951]: E0715 05:13:46.617324 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:46.617355 kubelet[2951]: W0715 05:13:46.617329 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:46.617355 kubelet[2951]: E0715 05:13:46.617335 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:46.617495 kubelet[2951]: E0715 05:13:46.617489 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:46.617615 kubelet[2951]: W0715 05:13:46.617553 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:46.617615 kubelet[2951]: E0715 05:13:46.617569 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:46.617746 kubelet[2951]: E0715 05:13:46.617705 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:46.617746 kubelet[2951]: W0715 05:13:46.617711 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:46.617746 kubelet[2951]: E0715 05:13:46.617721 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:46.617884 kubelet[2951]: E0715 05:13:46.617878 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:46.617971 kubelet[2951]: W0715 05:13:46.617923 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:46.617971 kubelet[2951]: E0715 05:13:46.617936 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:46.618244 kubelet[2951]: E0715 05:13:46.618125 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:46.618244 kubelet[2951]: W0715 05:13:46.618132 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:46.618244 kubelet[2951]: E0715 05:13:46.618140 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:46.618303 kubelet[2951]: E0715 05:13:46.618298 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:46.618328 kubelet[2951]: W0715 05:13:46.618304 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:46.618328 kubelet[2951]: E0715 05:13:46.618313 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:46.618545 kubelet[2951]: E0715 05:13:46.618536 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:46.618545 kubelet[2951]: W0715 05:13:46.618545 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:46.618591 kubelet[2951]: E0715 05:13:46.618553 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:46.618848 kubelet[2951]: E0715 05:13:46.618838 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:46.618848 kubelet[2951]: W0715 05:13:46.618847 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:46.618893 kubelet[2951]: E0715 05:13:46.618857 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:13:46.618999 kubelet[2951]: E0715 05:13:46.618990 2951 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:13:46.618999 kubelet[2951]: W0715 05:13:46.618998 2951 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:13:46.619037 kubelet[2951]: E0715 05:13:46.619003 2951 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:13:46.623911 containerd[1622]: time="2025-07-15T05:13:46.623869189Z" level=info msg="CreateContainer within sandbox \"86defbb09a8864c797e9968e8d473e4ec35eed3a53e51b60dc86ac1e921408e0\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"6fcf89e311816fb62cbcb1ddc64343c802fd8c8adcee60695438d1616875f198\"" Jul 15 05:13:46.624211 containerd[1622]: time="2025-07-15T05:13:46.624196401Z" level=info msg="StartContainer for \"6fcf89e311816fb62cbcb1ddc64343c802fd8c8adcee60695438d1616875f198\"" Jul 15 05:13:46.626118 containerd[1622]: time="2025-07-15T05:13:46.626091967Z" level=info msg="connecting to shim 6fcf89e311816fb62cbcb1ddc64343c802fd8c8adcee60695438d1616875f198" address="unix:///run/containerd/s/287f262e2d491e35f5e22f03e573d09adadf2521689a01692659c19a28d97c8b" protocol=ttrpc version=3 Jul 15 05:13:46.642997 systemd[1]: Started cri-containerd-6fcf89e311816fb62cbcb1ddc64343c802fd8c8adcee60695438d1616875f198.scope - libcontainer container 6fcf89e311816fb62cbcb1ddc64343c802fd8c8adcee60695438d1616875f198. Jul 15 05:13:46.673262 systemd[1]: cri-containerd-6fcf89e311816fb62cbcb1ddc64343c802fd8c8adcee60695438d1616875f198.scope: Deactivated successfully. 
Jul 15 05:13:46.681552 containerd[1622]: time="2025-07-15T05:13:46.681479736Z" level=info msg="StartContainer for \"6fcf89e311816fb62cbcb1ddc64343c802fd8c8adcee60695438d1616875f198\" returns successfully" Jul 15 05:13:46.690243 containerd[1622]: time="2025-07-15T05:13:46.690203422Z" level=info msg="received exit event container_id:\"6fcf89e311816fb62cbcb1ddc64343c802fd8c8adcee60695438d1616875f198\" id:\"6fcf89e311816fb62cbcb1ddc64343c802fd8c8adcee60695438d1616875f198\" pid:3644 exited_at:{seconds:1752556426 nanos:674719671}" Jul 15 05:13:46.693665 containerd[1622]: time="2025-07-15T05:13:46.693572313Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6fcf89e311816fb62cbcb1ddc64343c802fd8c8adcee60695438d1616875f198\" id:\"6fcf89e311816fb62cbcb1ddc64343c802fd8c8adcee60695438d1616875f198\" pid:3644 exited_at:{seconds:1752556426 nanos:674719671}" Jul 15 05:13:46.709181 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-6fcf89e311816fb62cbcb1ddc64343c802fd8c8adcee60695438d1616875f198-rootfs.mount: Deactivated successfully. 
Jul 15 05:13:47.317834 kubelet[2951]: E0715 05:13:47.317800 2951 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5cx6x" podUID="42f78827-b3e3-4b25-9464-bcf35efc21f2" Jul 15 05:13:47.494992 containerd[1622]: time="2025-07-15T05:13:47.494918114Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\"" Jul 15 05:13:49.318160 kubelet[2951]: E0715 05:13:49.317929 2951 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5cx6x" podUID="42f78827-b3e3-4b25-9464-bcf35efc21f2" Jul 15 05:13:50.521242 containerd[1622]: time="2025-07-15T05:13:50.521182155Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:13:50.521818 containerd[1622]: time="2025-07-15T05:13:50.521749287Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.2: active requests=0, bytes read=70436221" Jul 15 05:13:50.522081 containerd[1622]: time="2025-07-15T05:13:50.522065870Z" level=info msg="ImageCreate event name:\"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:13:50.523121 containerd[1622]: time="2025-07-15T05:13:50.523088353Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:13:50.523702 containerd[1622]: time="2025-07-15T05:13:50.523473637Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.2\" 
with image id \"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\", size \"71928924\" in 3.028527313s" Jul 15 05:13:50.523702 containerd[1622]: time="2025-07-15T05:13:50.523489655Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\" returns image reference \"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\"" Jul 15 05:13:50.525678 containerd[1622]: time="2025-07-15T05:13:50.525650717Z" level=info msg="CreateContainer within sandbox \"86defbb09a8864c797e9968e8d473e4ec35eed3a53e51b60dc86ac1e921408e0\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jul 15 05:13:50.531920 containerd[1622]: time="2025-07-15T05:13:50.531501331Z" level=info msg="Container 8298a4d9c96b175d9bbf8e5d8c4abd114906117e59f9742a8bb797bb7c496710: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:13:50.533919 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount867174430.mount: Deactivated successfully. 
Jul 15 05:13:50.540209 containerd[1622]: time="2025-07-15T05:13:50.540164731Z" level=info msg="CreateContainer within sandbox \"86defbb09a8864c797e9968e8d473e4ec35eed3a53e51b60dc86ac1e921408e0\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"8298a4d9c96b175d9bbf8e5d8c4abd114906117e59f9742a8bb797bb7c496710\"" Jul 15 05:13:50.544865 containerd[1622]: time="2025-07-15T05:13:50.544837155Z" level=info msg="StartContainer for \"8298a4d9c96b175d9bbf8e5d8c4abd114906117e59f9742a8bb797bb7c496710\"" Jul 15 05:13:50.546753 containerd[1622]: time="2025-07-15T05:13:50.546730314Z" level=info msg="connecting to shim 8298a4d9c96b175d9bbf8e5d8c4abd114906117e59f9742a8bb797bb7c496710" address="unix:///run/containerd/s/287f262e2d491e35f5e22f03e573d09adadf2521689a01692659c19a28d97c8b" protocol=ttrpc version=3 Jul 15 05:13:50.568061 systemd[1]: Started cri-containerd-8298a4d9c96b175d9bbf8e5d8c4abd114906117e59f9742a8bb797bb7c496710.scope - libcontainer container 8298a4d9c96b175d9bbf8e5d8c4abd114906117e59f9742a8bb797bb7c496710. Jul 15 05:13:50.596913 containerd[1622]: time="2025-07-15T05:13:50.596810243Z" level=info msg="StartContainer for \"8298a4d9c96b175d9bbf8e5d8c4abd114906117e59f9742a8bb797bb7c496710\" returns successfully" Jul 15 05:13:50.972090 kubelet[2951]: I0715 05:13:50.972000 2951 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 05:13:51.318264 kubelet[2951]: E0715 05:13:51.318163 2951 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5cx6x" podUID="42f78827-b3e3-4b25-9464-bcf35efc21f2" Jul 15 05:13:52.278570 systemd[1]: cri-containerd-8298a4d9c96b175d9bbf8e5d8c4abd114906117e59f9742a8bb797bb7c496710.scope: Deactivated successfully. 
Jul 15 05:13:52.278747 systemd[1]: cri-containerd-8298a4d9c96b175d9bbf8e5d8c4abd114906117e59f9742a8bb797bb7c496710.scope: Consumed 307ms CPU time, 163.5M memory peak, 12K read from disk, 171.2M written to disk. Jul 15 05:13:52.359283 containerd[1622]: time="2025-07-15T05:13:52.359204884Z" level=info msg="received exit event container_id:\"8298a4d9c96b175d9bbf8e5d8c4abd114906117e59f9742a8bb797bb7c496710\" id:\"8298a4d9c96b175d9bbf8e5d8c4abd114906117e59f9742a8bb797bb7c496710\" pid:3702 exited_at:{seconds:1752556432 nanos:359059993}" Jul 15 05:13:52.361977 containerd[1622]: time="2025-07-15T05:13:52.359874473Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8298a4d9c96b175d9bbf8e5d8c4abd114906117e59f9742a8bb797bb7c496710\" id:\"8298a4d9c96b175d9bbf8e5d8c4abd114906117e59f9742a8bb797bb7c496710\" pid:3702 exited_at:{seconds:1752556432 nanos:359059993}" Jul 15 05:13:52.396033 kubelet[2951]: I0715 05:13:52.396011 2951 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Jul 15 05:13:52.418827 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8298a4d9c96b175d9bbf8e5d8c4abd114906117e59f9742a8bb797bb7c496710-rootfs.mount: Deactivated successfully. Jul 15 05:13:52.467225 containerd[1622]: time="2025-07-15T05:13:52.467194470Z" level=error msg="collecting metrics for 8298a4d9c96b175d9bbf8e5d8c4abd114906117e59f9742a8bb797bb7c496710" error="ttrpc: closed" Jul 15 05:13:52.474127 systemd[1]: Created slice kubepods-besteffort-pod0e8612d6_0455_4a88_b312_2f91253d889c.slice - libcontainer container kubepods-besteffort-pod0e8612d6_0455_4a88_b312_2f91253d889c.slice. Jul 15 05:13:52.490704 systemd[1]: Created slice kubepods-burstable-pod1f9d996a_a917_4f78_a806_a2b8173db749.slice - libcontainer container kubepods-burstable-pod1f9d996a_a917_4f78_a806_a2b8173db749.slice. 
Jul 15 05:13:52.496871 systemd[1]: Created slice kubepods-besteffort-pod70fda2d5_6b0b_4f67_9af3_2ff1dbc1d5a8.slice - libcontainer container kubepods-besteffort-pod70fda2d5_6b0b_4f67_9af3_2ff1dbc1d5a8.slice. Jul 15 05:13:52.501135 systemd[1]: Created slice kubepods-burstable-podeb8c1a88_dd9b_4100_b3af_e3f0a218078d.slice - libcontainer container kubepods-burstable-podeb8c1a88_dd9b_4100_b3af_e3f0a218078d.slice. Jul 15 05:13:52.509088 systemd[1]: Created slice kubepods-besteffort-pod21b748ad_b2a9_4a0e_8b89_475f2acae024.slice - libcontainer container kubepods-besteffort-pod21b748ad_b2a9_4a0e_8b89_475f2acae024.slice. Jul 15 05:13:52.513487 systemd[1]: Created slice kubepods-besteffort-pod77f25f3e_17e6_48a5_bd0c_d1425f534cb3.slice - libcontainer container kubepods-besteffort-pod77f25f3e_17e6_48a5_bd0c_d1425f534cb3.slice. Jul 15 05:13:52.519890 systemd[1]: Created slice kubepods-besteffort-pod20e10742_00f4_4932_9581_20e919e16af1.slice - libcontainer container kubepods-besteffort-pod20e10742_00f4_4932_9581_20e919e16af1.slice. Jul 15 05:13:52.525095 systemd[1]: Created slice kubepods-besteffort-podf6387e58_6994_4983_b232_ec13cc1ee6a6.slice - libcontainer container kubepods-besteffort-podf6387e58_6994_4983_b232_ec13cc1ee6a6.slice. 
Jul 15 05:13:52.545185 containerd[1622]: time="2025-07-15T05:13:52.544993985Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\"" Jul 15 05:13:52.562259 kubelet[2951]: I0715 05:13:52.562176 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/f6387e58-6994-4983-b232-ec13cc1ee6a6-goldmane-key-pair\") pod \"goldmane-58fd7646b9-8nvjp\" (UID: \"f6387e58-6994-4983-b232-ec13cc1ee6a6\") " pod="calico-system/goldmane-58fd7646b9-8nvjp" Jul 15 05:13:52.563515 kubelet[2951]: I0715 05:13:52.562376 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw455\" (UniqueName: \"kubernetes.io/projected/1f9d996a-a917-4f78-a806-a2b8173db749-kube-api-access-hw455\") pod \"coredns-7c65d6cfc9-lf4wk\" (UID: \"1f9d996a-a917-4f78-a806-a2b8173db749\") " pod="kube-system/coredns-7c65d6cfc9-lf4wk" Jul 15 05:13:52.563515 kubelet[2951]: I0715 05:13:52.562558 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6387e58-6994-4983-b232-ec13cc1ee6a6-config\") pod \"goldmane-58fd7646b9-8nvjp\" (UID: \"f6387e58-6994-4983-b232-ec13cc1ee6a6\") " pod="calico-system/goldmane-58fd7646b9-8nvjp" Jul 15 05:13:52.563515 kubelet[2951]: I0715 05:13:52.562582 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/21b748ad-b2a9-4a0e-8b89-475f2acae024-calico-apiserver-certs\") pod \"calico-apiserver-849d984976-9xsth\" (UID: \"21b748ad-b2a9-4a0e-8b89-475f2acae024\") " pod="calico-apiserver/calico-apiserver-849d984976-9xsth" Jul 15 05:13:52.563515 kubelet[2951]: I0715 05:13:52.562593 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72d9t\" (UniqueName: 
\"kubernetes.io/projected/77f25f3e-17e6-48a5-bd0c-d1425f534cb3-kube-api-access-72d9t\") pod \"calico-apiserver-5596dfd6cb-c84wc\" (UID: \"77f25f3e-17e6-48a5-bd0c-d1425f534cb3\") " pod="calico-apiserver/calico-apiserver-5596dfd6cb-c84wc" Jul 15 05:13:52.563515 kubelet[2951]: I0715 05:13:52.562604 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rcq7\" (UniqueName: \"kubernetes.io/projected/f6387e58-6994-4983-b232-ec13cc1ee6a6-kube-api-access-8rcq7\") pod \"goldmane-58fd7646b9-8nvjp\" (UID: \"f6387e58-6994-4983-b232-ec13cc1ee6a6\") " pod="calico-system/goldmane-58fd7646b9-8nvjp" Jul 15 05:13:52.564000 kubelet[2951]: I0715 05:13:52.562613 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/70fda2d5-6b0b-4f67-9af3-2ff1dbc1d5a8-whisker-backend-key-pair\") pod \"whisker-76c4b86c6b-hq6cw\" (UID: \"70fda2d5-6b0b-4f67-9af3-2ff1dbc1d5a8\") " pod="calico-system/whisker-76c4b86c6b-hq6cw" Jul 15 05:13:52.564000 kubelet[2951]: I0715 05:13:52.562622 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcmsw\" (UniqueName: \"kubernetes.io/projected/70fda2d5-6b0b-4f67-9af3-2ff1dbc1d5a8-kube-api-access-dcmsw\") pod \"whisker-76c4b86c6b-hq6cw\" (UID: \"70fda2d5-6b0b-4f67-9af3-2ff1dbc1d5a8\") " pod="calico-system/whisker-76c4b86c6b-hq6cw" Jul 15 05:13:52.564000 kubelet[2951]: I0715 05:13:52.562630 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6387e58-6994-4983-b232-ec13cc1ee6a6-goldmane-ca-bundle\") pod \"goldmane-58fd7646b9-8nvjp\" (UID: \"f6387e58-6994-4983-b232-ec13cc1ee6a6\") " pod="calico-system/goldmane-58fd7646b9-8nvjp" Jul 15 05:13:52.564000 kubelet[2951]: I0715 05:13:52.562645 2951 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbx8h\" (UniqueName: \"kubernetes.io/projected/eb8c1a88-dd9b-4100-b3af-e3f0a218078d-kube-api-access-mbx8h\") pod \"coredns-7c65d6cfc9-2kxrm\" (UID: \"eb8c1a88-dd9b-4100-b3af-e3f0a218078d\") " pod="kube-system/coredns-7c65d6cfc9-2kxrm" Jul 15 05:13:52.564000 kubelet[2951]: I0715 05:13:52.562654 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e8612d6-0455-4a88-b312-2f91253d889c-tigera-ca-bundle\") pod \"calico-kube-controllers-64cfc7b9f4-zc956\" (UID: \"0e8612d6-0455-4a88-b312-2f91253d889c\") " pod="calico-system/calico-kube-controllers-64cfc7b9f4-zc956" Jul 15 05:13:52.564127 kubelet[2951]: I0715 05:13:52.562681 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4m4n\" (UniqueName: \"kubernetes.io/projected/21b748ad-b2a9-4a0e-8b89-475f2acae024-kube-api-access-s4m4n\") pod \"calico-apiserver-849d984976-9xsth\" (UID: \"21b748ad-b2a9-4a0e-8b89-475f2acae024\") " pod="calico-apiserver/calico-apiserver-849d984976-9xsth" Jul 15 05:13:52.564127 kubelet[2951]: I0715 05:13:52.562693 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/20e10742-00f4-4932-9581-20e919e16af1-calico-apiserver-certs\") pod \"calico-apiserver-849d984976-ml4gq\" (UID: \"20e10742-00f4-4932-9581-20e919e16af1\") " pod="calico-apiserver/calico-apiserver-849d984976-ml4gq" Jul 15 05:13:52.564127 kubelet[2951]: I0715 05:13:52.562703 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb8c1a88-dd9b-4100-b3af-e3f0a218078d-config-volume\") pod \"coredns-7c65d6cfc9-2kxrm\" (UID: \"eb8c1a88-dd9b-4100-b3af-e3f0a218078d\") " 
pod="kube-system/coredns-7c65d6cfc9-2kxrm" Jul 15 05:13:52.564127 kubelet[2951]: I0715 05:13:52.562713 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvm68\" (UniqueName: \"kubernetes.io/projected/0e8612d6-0455-4a88-b312-2f91253d889c-kube-api-access-jvm68\") pod \"calico-kube-controllers-64cfc7b9f4-zc956\" (UID: \"0e8612d6-0455-4a88-b312-2f91253d889c\") " pod="calico-system/calico-kube-controllers-64cfc7b9f4-zc956" Jul 15 05:13:52.564127 kubelet[2951]: I0715 05:13:52.562723 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70fda2d5-6b0b-4f67-9af3-2ff1dbc1d5a8-whisker-ca-bundle\") pod \"whisker-76c4b86c6b-hq6cw\" (UID: \"70fda2d5-6b0b-4f67-9af3-2ff1dbc1d5a8\") " pod="calico-system/whisker-76c4b86c6b-hq6cw" Jul 15 05:13:52.564263 kubelet[2951]: I0715 05:13:52.562732 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fgzb\" (UniqueName: \"kubernetes.io/projected/20e10742-00f4-4932-9581-20e919e16af1-kube-api-access-6fgzb\") pod \"calico-apiserver-849d984976-ml4gq\" (UID: \"20e10742-00f4-4932-9581-20e919e16af1\") " pod="calico-apiserver/calico-apiserver-849d984976-ml4gq" Jul 15 05:13:52.564263 kubelet[2951]: I0715 05:13:52.562755 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1f9d996a-a917-4f78-a806-a2b8173db749-config-volume\") pod \"coredns-7c65d6cfc9-lf4wk\" (UID: \"1f9d996a-a917-4f78-a806-a2b8173db749\") " pod="kube-system/coredns-7c65d6cfc9-lf4wk" Jul 15 05:13:52.564263 kubelet[2951]: I0715 05:13:52.562770 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: 
\"kubernetes.io/secret/77f25f3e-17e6-48a5-bd0c-d1425f534cb3-calico-apiserver-certs\") pod \"calico-apiserver-5596dfd6cb-c84wc\" (UID: \"77f25f3e-17e6-48a5-bd0c-d1425f534cb3\") " pod="calico-apiserver/calico-apiserver-5596dfd6cb-c84wc" Jul 15 05:13:52.792078 containerd[1622]: time="2025-07-15T05:13:52.791880540Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-64cfc7b9f4-zc956,Uid:0e8612d6-0455-4a88-b312-2f91253d889c,Namespace:calico-system,Attempt:0,}" Jul 15 05:13:52.796702 containerd[1622]: time="2025-07-15T05:13:52.796474039Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-lf4wk,Uid:1f9d996a-a917-4f78-a806-a2b8173db749,Namespace:kube-system,Attempt:0,}" Jul 15 05:13:52.801049 containerd[1622]: time="2025-07-15T05:13:52.801031329Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-76c4b86c6b-hq6cw,Uid:70fda2d5-6b0b-4f67-9af3-2ff1dbc1d5a8,Namespace:calico-system,Attempt:0,}" Jul 15 05:13:52.814407 containerd[1622]: time="2025-07-15T05:13:52.814384809Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-2kxrm,Uid:eb8c1a88-dd9b-4100-b3af-e3f0a218078d,Namespace:kube-system,Attempt:0,}" Jul 15 05:13:52.834644 containerd[1622]: time="2025-07-15T05:13:52.834607221Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-8nvjp,Uid:f6387e58-6994-4983-b232-ec13cc1ee6a6,Namespace:calico-system,Attempt:0,}" Jul 15 05:13:52.835573 containerd[1622]: time="2025-07-15T05:13:52.835319194Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5596dfd6cb-c84wc,Uid:77f25f3e-17e6-48a5-bd0c-d1425f534cb3,Namespace:calico-apiserver,Attempt:0,}" Jul 15 05:13:52.835609 containerd[1622]: time="2025-07-15T05:13:52.835484091Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-849d984976-9xsth,Uid:21b748ad-b2a9-4a0e-8b89-475f2acae024,Namespace:calico-apiserver,Attempt:0,}" Jul 15 05:13:52.836597 containerd[1622]: 
time="2025-07-15T05:13:52.835759085Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-849d984976-ml4gq,Uid:20e10742-00f4-4932-9581-20e919e16af1,Namespace:calico-apiserver,Attempt:0,}" Jul 15 05:13:53.103173 containerd[1622]: time="2025-07-15T05:13:53.102884114Z" level=error msg="Failed to destroy network for sandbox \"650234eada99bc30a7fcfbe11d6f15517785a6220d76858385a7f2a5271c3902\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:13:53.116444 containerd[1622]: time="2025-07-15T05:13:53.104831018Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-2kxrm,Uid:eb8c1a88-dd9b-4100-b3af-e3f0a218078d,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"650234eada99bc30a7fcfbe11d6f15517785a6220d76858385a7f2a5271c3902\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:13:53.117197 containerd[1622]: time="2025-07-15T05:13:53.105534083Z" level=error msg="Failed to destroy network for sandbox \"9979352c2ff33a7701a8f4b49d0bfffc1851ce33bba7cb69b74796aecab96667\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:13:53.117197 containerd[1622]: time="2025-07-15T05:13:53.116960834Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-8nvjp,Uid:f6387e58-6994-4983-b232-ec13cc1ee6a6,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9979352c2ff33a7701a8f4b49d0bfffc1851ce33bba7cb69b74796aecab96667\": plugin type=\"calico\" 
failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:13:53.120098 containerd[1622]: time="2025-07-15T05:13:53.107303327Z" level=error msg="Failed to destroy network for sandbox \"c6511070c2ee74689cc70e3c0c9d6f52db812cf93fecbd0f20849a38754357be\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:13:53.120973 containerd[1622]: time="2025-07-15T05:13:53.111193947Z" level=error msg="Failed to destroy network for sandbox \"c048f5678428c3bda9479478592055281c1cecfca0fa7c757315afb09d61555e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:13:53.121059 containerd[1622]: time="2025-07-15T05:13:53.121031708Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-64cfc7b9f4-zc956,Uid:0e8612d6-0455-4a88-b312-2f91253d889c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c6511070c2ee74689cc70e3c0c9d6f52db812cf93fecbd0f20849a38754357be\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:13:53.121146 containerd[1622]: time="2025-07-15T05:13:53.112813758Z" level=error msg="Failed to destroy network for sandbox \"ff175ff42ac0bffcae8047c1587ebdffe2d0e9def691146690d53accdbbbe8e3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:13:53.121414 containerd[1622]: 
time="2025-07-15T05:13:53.113096411Z" level=error msg="Failed to destroy network for sandbox \"f24005755038f1e0ed41eaef98f156bae59e552182ed67215b3389b4f8862e84\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:13:53.121551 containerd[1622]: time="2025-07-15T05:13:53.121538472Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-849d984976-9xsth,Uid:21b748ad-b2a9-4a0e-8b89-475f2acae024,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c048f5678428c3bda9479478592055281c1cecfca0fa7c757315afb09d61555e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:13:53.121824 containerd[1622]: time="2025-07-15T05:13:53.121811074Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-lf4wk,Uid:1f9d996a-a917-4f78-a806-a2b8173db749,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ff175ff42ac0bffcae8047c1587ebdffe2d0e9def691146690d53accdbbbe8e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:13:53.122645 containerd[1622]: time="2025-07-15T05:13:53.114005722Z" level=error msg="Failed to destroy network for sandbox \"8d9c63d0c3dc55e91d5d7bddf615b3c155b6a64910fc7bd74763ad14c13b0ffe\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:13:53.122645 containerd[1622]: time="2025-07-15T05:13:53.114995144Z" level=error 
msg="Failed to destroy network for sandbox \"64a32748c82d8786c19f627d64a3d059cc6064a96606edac0941dfd92f64d64a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:13:53.122809 containerd[1622]: time="2025-07-15T05:13:53.122709474Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-76c4b86c6b-hq6cw,Uid:70fda2d5-6b0b-4f67-9af3-2ff1dbc1d5a8,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f24005755038f1e0ed41eaef98f156bae59e552182ed67215b3389b4f8862e84\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:13:53.123288 containerd[1622]: time="2025-07-15T05:13:53.123012961Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5596dfd6cb-c84wc,Uid:77f25f3e-17e6-48a5-bd0c-d1425f534cb3,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d9c63d0c3dc55e91d5d7bddf615b3c155b6a64910fc7bd74763ad14c13b0ffe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:13:53.123377 containerd[1622]: time="2025-07-15T05:13:53.123345548Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-849d984976-ml4gq,Uid:20e10742-00f4-4932-9581-20e919e16af1,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"64a32748c82d8786c19f627d64a3d059cc6064a96606edac0941dfd92f64d64a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:13:53.123715 kubelet[2951]: E0715 05:13:53.123682 2951 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c6511070c2ee74689cc70e3c0c9d6f52db812cf93fecbd0f20849a38754357be\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:13:53.123808 kubelet[2951]: E0715 05:13:53.123783 2951 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d9c63d0c3dc55e91d5d7bddf615b3c155b6a64910fc7bd74763ad14c13b0ffe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:13:53.126994 kubelet[2951]: E0715 05:13:53.126890 2951 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c6511070c2ee74689cc70e3c0c9d6f52db812cf93fecbd0f20849a38754357be\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-64cfc7b9f4-zc956" Jul 15 05:13:53.126994 kubelet[2951]: E0715 05:13:53.126923 2951 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c6511070c2ee74689cc70e3c0c9d6f52db812cf93fecbd0f20849a38754357be\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-64cfc7b9f4-zc956" Jul 15 05:13:53.127717 
kubelet[2951]: E0715 05:13:53.127431 2951 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d9c63d0c3dc55e91d5d7bddf615b3c155b6a64910fc7bd74763ad14c13b0ffe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5596dfd6cb-c84wc" Jul 15 05:13:53.127717 kubelet[2951]: E0715 05:13:53.127454 2951 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d9c63d0c3dc55e91d5d7bddf615b3c155b6a64910fc7bd74763ad14c13b0ffe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5596dfd6cb-c84wc" Jul 15 05:13:53.127771 kubelet[2951]: E0715 05:13:53.127711 2951 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5596dfd6cb-c84wc_calico-apiserver(77f25f3e-17e6-48a5-bd0c-d1425f534cb3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5596dfd6cb-c84wc_calico-apiserver(77f25f3e-17e6-48a5-bd0c-d1425f534cb3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8d9c63d0c3dc55e91d5d7bddf615b3c155b6a64910fc7bd74763ad14c13b0ffe\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5596dfd6cb-c84wc" podUID="77f25f3e-17e6-48a5-bd0c-d1425f534cb3" Jul 15 05:13:53.127771 kubelet[2951]: E0715 05:13:53.127747 2951 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to 
setup network for sandbox \"c048f5678428c3bda9479478592055281c1cecfca0fa7c757315afb09d61555e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:13:53.127771 kubelet[2951]: E0715 05:13:53.127762 2951 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c048f5678428c3bda9479478592055281c1cecfca0fa7c757315afb09d61555e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-849d984976-9xsth" Jul 15 05:13:53.127847 kubelet[2951]: E0715 05:13:53.127770 2951 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c048f5678428c3bda9479478592055281c1cecfca0fa7c757315afb09d61555e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-849d984976-9xsth" Jul 15 05:13:53.127847 kubelet[2951]: E0715 05:13:53.127783 2951 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-849d984976-9xsth_calico-apiserver(21b748ad-b2a9-4a0e-8b89-475f2acae024)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-849d984976-9xsth_calico-apiserver(21b748ad-b2a9-4a0e-8b89-475f2acae024)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c048f5678428c3bda9479478592055281c1cecfca0fa7c757315afb09d61555e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-849d984976-9xsth" podUID="21b748ad-b2a9-4a0e-8b89-475f2acae024" Jul 15 05:13:53.127847 kubelet[2951]: E0715 05:13:53.127804 2951 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ff175ff42ac0bffcae8047c1587ebdffe2d0e9def691146690d53accdbbbe8e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:13:53.127937 kubelet[2951]: E0715 05:13:53.127814 2951 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ff175ff42ac0bffcae8047c1587ebdffe2d0e9def691146690d53accdbbbe8e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-lf4wk" Jul 15 05:13:53.127937 kubelet[2951]: E0715 05:13:53.127824 2951 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ff175ff42ac0bffcae8047c1587ebdffe2d0e9def691146690d53accdbbbe8e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-lf4wk" Jul 15 05:13:53.127937 kubelet[2951]: E0715 05:13:53.127838 2951 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-lf4wk_kube-system(1f9d996a-a917-4f78-a806-a2b8173db749)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-lf4wk_kube-system(1f9d996a-a917-4f78-a806-a2b8173db749)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"ff175ff42ac0bffcae8047c1587ebdffe2d0e9def691146690d53accdbbbe8e3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-lf4wk" podUID="1f9d996a-a917-4f78-a806-a2b8173db749" Jul 15 05:13:53.128007 kubelet[2951]: E0715 05:13:53.127853 2951 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f24005755038f1e0ed41eaef98f156bae59e552182ed67215b3389b4f8862e84\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:13:53.128007 kubelet[2951]: E0715 05:13:53.127862 2951 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f24005755038f1e0ed41eaef98f156bae59e552182ed67215b3389b4f8862e84\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-76c4b86c6b-hq6cw" Jul 15 05:13:53.128007 kubelet[2951]: E0715 05:13:53.127870 2951 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f24005755038f1e0ed41eaef98f156bae59e552182ed67215b3389b4f8862e84\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-76c4b86c6b-hq6cw" Jul 15 05:13:53.128068 kubelet[2951]: E0715 05:13:53.127882 2951 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"whisker-76c4b86c6b-hq6cw_calico-system(70fda2d5-6b0b-4f67-9af3-2ff1dbc1d5a8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-76c4b86c6b-hq6cw_calico-system(70fda2d5-6b0b-4f67-9af3-2ff1dbc1d5a8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f24005755038f1e0ed41eaef98f156bae59e552182ed67215b3389b4f8862e84\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-76c4b86c6b-hq6cw" podUID="70fda2d5-6b0b-4f67-9af3-2ff1dbc1d5a8" Jul 15 05:13:53.128404 kubelet[2951]: E0715 05:13:53.128389 2951 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-64cfc7b9f4-zc956_calico-system(0e8612d6-0455-4a88-b312-2f91253d889c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-64cfc7b9f4-zc956_calico-system(0e8612d6-0455-4a88-b312-2f91253d889c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c6511070c2ee74689cc70e3c0c9d6f52db812cf93fecbd0f20849a38754357be\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-64cfc7b9f4-zc956" podUID="0e8612d6-0455-4a88-b312-2f91253d889c" Jul 15 05:13:53.128476 kubelet[2951]: E0715 05:13:53.128465 2951 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"650234eada99bc30a7fcfbe11d6f15517785a6220d76858385a7f2a5271c3902\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:13:53.128521 kubelet[2951]: E0715 05:13:53.128513 
2951 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"650234eada99bc30a7fcfbe11d6f15517785a6220d76858385a7f2a5271c3902\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-2kxrm" Jul 15 05:13:53.128562 kubelet[2951]: E0715 05:13:53.128554 2951 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"650234eada99bc30a7fcfbe11d6f15517785a6220d76858385a7f2a5271c3902\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-2kxrm" Jul 15 05:13:53.128614 kubelet[2951]: E0715 05:13:53.128603 2951 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-2kxrm_kube-system(eb8c1a88-dd9b-4100-b3af-e3f0a218078d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-2kxrm_kube-system(eb8c1a88-dd9b-4100-b3af-e3f0a218078d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"650234eada99bc30a7fcfbe11d6f15517785a6220d76858385a7f2a5271c3902\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-2kxrm" podUID="eb8c1a88-dd9b-4100-b3af-e3f0a218078d" Jul 15 05:13:53.128685 kubelet[2951]: E0715 05:13:53.128675 2951 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9979352c2ff33a7701a8f4b49d0bfffc1851ce33bba7cb69b74796aecab96667\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:13:53.128734 kubelet[2951]: E0715 05:13:53.128723 2951 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9979352c2ff33a7701a8f4b49d0bfffc1851ce33bba7cb69b74796aecab96667\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-58fd7646b9-8nvjp" Jul 15 05:13:53.128877 kubelet[2951]: E0715 05:13:53.128767 2951 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9979352c2ff33a7701a8f4b49d0bfffc1851ce33bba7cb69b74796aecab96667\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-58fd7646b9-8nvjp" Jul 15 05:13:53.128877 kubelet[2951]: E0715 05:13:53.128784 2951 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-58fd7646b9-8nvjp_calico-system(f6387e58-6994-4983-b232-ec13cc1ee6a6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-58fd7646b9-8nvjp_calico-system(f6387e58-6994-4983-b232-ec13cc1ee6a6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9979352c2ff33a7701a8f4b49d0bfffc1851ce33bba7cb69b74796aecab96667\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-58fd7646b9-8nvjp" podUID="f6387e58-6994-4983-b232-ec13cc1ee6a6" Jul 15 05:13:53.128877 
kubelet[2951]: E0715 05:13:53.128825 2951 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"64a32748c82d8786c19f627d64a3d059cc6064a96606edac0941dfd92f64d64a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:13:53.128962 kubelet[2951]: E0715 05:13:53.128837 2951 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"64a32748c82d8786c19f627d64a3d059cc6064a96606edac0941dfd92f64d64a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-849d984976-ml4gq" Jul 15 05:13:53.128962 kubelet[2951]: E0715 05:13:53.128847 2951 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"64a32748c82d8786c19f627d64a3d059cc6064a96606edac0941dfd92f64d64a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-849d984976-ml4gq" Jul 15 05:13:53.128962 kubelet[2951]: E0715 05:13:53.128862 2951 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-849d984976-ml4gq_calico-apiserver(20e10742-00f4-4932-9581-20e919e16af1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-849d984976-ml4gq_calico-apiserver(20e10742-00f4-4932-9581-20e919e16af1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"64a32748c82d8786c19f627d64a3d059cc6064a96606edac0941dfd92f64d64a\\\": plugin 
type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-849d984976-ml4gq" podUID="20e10742-00f4-4932-9581-20e919e16af1" Jul 15 05:13:53.331874 systemd[1]: Created slice kubepods-besteffort-pod42f78827_b3e3_4b25_9464_bcf35efc21f2.slice - libcontainer container kubepods-besteffort-pod42f78827_b3e3_4b25_9464_bcf35efc21f2.slice. Jul 15 05:13:53.337712 containerd[1622]: time="2025-07-15T05:13:53.337678889Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5cx6x,Uid:42f78827-b3e3-4b25-9464-bcf35efc21f2,Namespace:calico-system,Attempt:0,}" Jul 15 05:13:53.372154 containerd[1622]: time="2025-07-15T05:13:53.372120192Z" level=error msg="Failed to destroy network for sandbox \"af1cca249e545c20a999e92112b44d35a660357cfabb30fbcfb9ca8b370da4b2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:13:53.372867 containerd[1622]: time="2025-07-15T05:13:53.372797345Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5cx6x,Uid:42f78827-b3e3-4b25-9464-bcf35efc21f2,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"af1cca249e545c20a999e92112b44d35a660357cfabb30fbcfb9ca8b370da4b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:13:53.372958 kubelet[2951]: E0715 05:13:53.372939 2951 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"af1cca249e545c20a999e92112b44d35a660357cfabb30fbcfb9ca8b370da4b2\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:13:53.372997 kubelet[2951]: E0715 05:13:53.372972 2951 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"af1cca249e545c20a999e92112b44d35a660357cfabb30fbcfb9ca8b370da4b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-5cx6x" Jul 15 05:13:53.372997 kubelet[2951]: E0715 05:13:53.372984 2951 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"af1cca249e545c20a999e92112b44d35a660357cfabb30fbcfb9ca8b370da4b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-5cx6x" Jul 15 05:13:53.373052 kubelet[2951]: E0715 05:13:53.373012 2951 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-5cx6x_calico-system(42f78827-b3e3-4b25-9464-bcf35efc21f2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-5cx6x_calico-system(42f78827-b3e3-4b25-9464-bcf35efc21f2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"af1cca249e545c20a999e92112b44d35a660357cfabb30fbcfb9ca8b370da4b2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-5cx6x" podUID="42f78827-b3e3-4b25-9464-bcf35efc21f2" Jul 15 05:13:57.062039 systemd[1]: 
var-lib-containerd-tmpmounts-containerd\x2dmount266057617.mount: Deactivated successfully. Jul 15 05:13:57.181109 containerd[1622]: time="2025-07-15T05:13:57.180887840Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.2\" with image id \"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\", size \"158500025\" in 4.634650127s" Jul 15 05:13:57.181109 containerd[1622]: time="2025-07-15T05:13:57.180938061Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\" returns image reference \"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\"" Jul 15 05:13:57.181109 containerd[1622]: time="2025-07-15T05:13:57.166114122Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.2: active requests=0, bytes read=158500163" Jul 15 05:13:57.204081 containerd[1622]: time="2025-07-15T05:13:57.202983425Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:13:57.215039 containerd[1622]: time="2025-07-15T05:13:57.214984356Z" level=info msg="ImageCreate event name:\"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:13:57.215943 containerd[1622]: time="2025-07-15T05:13:57.215316100Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:13:57.228117 containerd[1622]: time="2025-07-15T05:13:57.228063832Z" level=info msg="CreateContainer within sandbox \"86defbb09a8864c797e9968e8d473e4ec35eed3a53e51b60dc86ac1e921408e0\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jul 15 
05:13:57.257677 containerd[1622]: time="2025-07-15T05:13:57.257615230Z" level=info msg="Container f31ef7aec4b28a73996fb73611f674898db45a804b4ec558f0aa5880a61ae998: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:13:57.259787 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3793437428.mount: Deactivated successfully. Jul 15 05:13:57.302919 containerd[1622]: time="2025-07-15T05:13:57.302872673Z" level=info msg="CreateContainer within sandbox \"86defbb09a8864c797e9968e8d473e4ec35eed3a53e51b60dc86ac1e921408e0\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"f31ef7aec4b28a73996fb73611f674898db45a804b4ec558f0aa5880a61ae998\"" Jul 15 05:13:57.303706 containerd[1622]: time="2025-07-15T05:13:57.303694337Z" level=info msg="StartContainer for \"f31ef7aec4b28a73996fb73611f674898db45a804b4ec558f0aa5880a61ae998\"" Jul 15 05:13:57.309158 containerd[1622]: time="2025-07-15T05:13:57.309134182Z" level=info msg="connecting to shim f31ef7aec4b28a73996fb73611f674898db45a804b4ec558f0aa5880a61ae998" address="unix:///run/containerd/s/287f262e2d491e35f5e22f03e573d09adadf2521689a01692659c19a28d97c8b" protocol=ttrpc version=3 Jul 15 05:13:57.439133 systemd[1]: Started cri-containerd-f31ef7aec4b28a73996fb73611f674898db45a804b4ec558f0aa5880a61ae998.scope - libcontainer container f31ef7aec4b28a73996fb73611f674898db45a804b4ec558f0aa5880a61ae998. Jul 15 05:13:57.552690 containerd[1622]: time="2025-07-15T05:13:57.552650806Z" level=info msg="StartContainer for \"f31ef7aec4b28a73996fb73611f674898db45a804b4ec558f0aa5880a61ae998\" returns successfully" Jul 15 05:13:57.628041 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jul 15 05:13:57.628750 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Jul 15 05:13:57.997535 kubelet[2951]: I0715 05:13:57.997495 2951 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcmsw\" (UniqueName: \"kubernetes.io/projected/70fda2d5-6b0b-4f67-9af3-2ff1dbc1d5a8-kube-api-access-dcmsw\") pod \"70fda2d5-6b0b-4f67-9af3-2ff1dbc1d5a8\" (UID: \"70fda2d5-6b0b-4f67-9af3-2ff1dbc1d5a8\") " Jul 15 05:13:57.997535 kubelet[2951]: I0715 05:13:57.997524 2951 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/70fda2d5-6b0b-4f67-9af3-2ff1dbc1d5a8-whisker-backend-key-pair\") pod \"70fda2d5-6b0b-4f67-9af3-2ff1dbc1d5a8\" (UID: \"70fda2d5-6b0b-4f67-9af3-2ff1dbc1d5a8\") " Jul 15 05:13:57.998149 kubelet[2951]: I0715 05:13:57.997552 2951 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70fda2d5-6b0b-4f67-9af3-2ff1dbc1d5a8-whisker-ca-bundle\") pod \"70fda2d5-6b0b-4f67-9af3-2ff1dbc1d5a8\" (UID: \"70fda2d5-6b0b-4f67-9af3-2ff1dbc1d5a8\") " Jul 15 05:13:58.001723 kubelet[2951]: I0715 05:13:58.001703 2951 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70fda2d5-6b0b-4f67-9af3-2ff1dbc1d5a8-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "70fda2d5-6b0b-4f67-9af3-2ff1dbc1d5a8" (UID: "70fda2d5-6b0b-4f67-9af3-2ff1dbc1d5a8"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jul 15 05:13:58.006623 kubelet[2951]: I0715 05:13:58.006600 2951 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70fda2d5-6b0b-4f67-9af3-2ff1dbc1d5a8-kube-api-access-dcmsw" (OuterVolumeSpecName: "kube-api-access-dcmsw") pod "70fda2d5-6b0b-4f67-9af3-2ff1dbc1d5a8" (UID: "70fda2d5-6b0b-4f67-9af3-2ff1dbc1d5a8"). InnerVolumeSpecName "kube-api-access-dcmsw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jul 15 05:13:58.006623 kubelet[2951]: I0715 05:13:58.006602 2951 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70fda2d5-6b0b-4f67-9af3-2ff1dbc1d5a8-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "70fda2d5-6b0b-4f67-9af3-2ff1dbc1d5a8" (UID: "70fda2d5-6b0b-4f67-9af3-2ff1dbc1d5a8"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" Jul 15 05:13:58.044806 systemd[1]: var-lib-kubelet-pods-70fda2d5\x2d6b0b\x2d4f67\x2d9af3\x2d2ff1dbc1d5a8-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2ddcmsw.mount: Deactivated successfully. Jul 15 05:13:58.044867 systemd[1]: var-lib-kubelet-pods-70fda2d5\x2d6b0b\x2d4f67\x2d9af3\x2d2ff1dbc1d5a8-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Jul 15 05:13:58.097901 kubelet[2951]: I0715 05:13:58.097873 2951 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcmsw\" (UniqueName: \"kubernetes.io/projected/70fda2d5-6b0b-4f67-9af3-2ff1dbc1d5a8-kube-api-access-dcmsw\") on node \"localhost\" DevicePath \"\"" Jul 15 05:13:58.098008 kubelet[2951]: I0715 05:13:58.097908 2951 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/70fda2d5-6b0b-4f67-9af3-2ff1dbc1d5a8-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Jul 15 05:13:58.098008 kubelet[2951]: I0715 05:13:58.097917 2951 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70fda2d5-6b0b-4f67-9af3-2ff1dbc1d5a8-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Jul 15 05:13:58.323204 systemd[1]: Removed slice kubepods-besteffort-pod70fda2d5_6b0b_4f67_9af3_2ff1dbc1d5a8.slice - libcontainer container kubepods-besteffort-pod70fda2d5_6b0b_4f67_9af3_2ff1dbc1d5a8.slice. 
Jul 15 05:13:58.587254 kubelet[2951]: I0715 05:13:58.587082 2951 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-dpcjw" podStartSLOduration=2.449512552 podStartE2EDuration="16.587068795s" podCreationTimestamp="2025-07-15 05:13:42 +0000 UTC" firstStartedPulling="2025-07-15 05:13:43.06132008 +0000 UTC m=+20.840630937" lastFinishedPulling="2025-07-15 05:13:57.198876321 +0000 UTC m=+34.978187180" observedRunningTime="2025-07-15 05:13:58.586581444 +0000 UTC m=+36.365892306" watchObservedRunningTime="2025-07-15 05:13:58.587068795 +0000 UTC m=+36.366379650" Jul 15 05:13:58.688527 systemd[1]: Created slice kubepods-besteffort-podc24e108b_2014_4b27_90f7_36b20c6e1f40.slice - libcontainer container kubepods-besteffort-podc24e108b_2014_4b27_90f7_36b20c6e1f40.slice. Jul 15 05:13:58.760758 containerd[1622]: time="2025-07-15T05:13:58.760731812Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f31ef7aec4b28a73996fb73611f674898db45a804b4ec558f0aa5880a61ae998\" id:\"4bb6b89977d429d01a0329988a642bc1c886e76f44663a88c8329d85378ca228\" pid:4071 exit_status:1 exited_at:{seconds:1752556438 nanos:760454645}" Jul 15 05:13:58.809543 kubelet[2951]: I0715 05:13:58.809489 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c24e108b-2014-4b27-90f7-36b20c6e1f40-whisker-ca-bundle\") pod \"whisker-698cd4f4f8-qvkck\" (UID: \"c24e108b-2014-4b27-90f7-36b20c6e1f40\") " pod="calico-system/whisker-698cd4f4f8-qvkck" Jul 15 05:13:58.809543 kubelet[2951]: I0715 05:13:58.809521 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/c24e108b-2014-4b27-90f7-36b20c6e1f40-whisker-backend-key-pair\") pod \"whisker-698cd4f4f8-qvkck\" (UID: \"c24e108b-2014-4b27-90f7-36b20c6e1f40\") " pod="calico-system/whisker-698cd4f4f8-qvkck" 
Jul 15 05:13:58.809685 kubelet[2951]: I0715 05:13:58.809533 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb2s5\" (UniqueName: \"kubernetes.io/projected/c24e108b-2014-4b27-90f7-36b20c6e1f40-kube-api-access-nb2s5\") pod \"whisker-698cd4f4f8-qvkck\" (UID: \"c24e108b-2014-4b27-90f7-36b20c6e1f40\") " pod="calico-system/whisker-698cd4f4f8-qvkck" Jul 15 05:13:58.994906 containerd[1622]: time="2025-07-15T05:13:58.994841078Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-698cd4f4f8-qvkck,Uid:c24e108b-2014-4b27-90f7-36b20c6e1f40,Namespace:calico-system,Attempt:0,}" Jul 15 05:13:59.471475 systemd-networkd[1532]: vxlan.calico: Link UP Jul 15 05:13:59.471482 systemd-networkd[1532]: vxlan.calico: Gained carrier Jul 15 05:13:59.654420 containerd[1622]: time="2025-07-15T05:13:59.654396709Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f31ef7aec4b28a73996fb73611f674898db45a804b4ec558f0aa5880a61ae998\" id:\"5bb06ae728cbd7009830d7e22839b208a2de042c08a2439eac0650bff10fc15d\" pid:4284 exit_status:1 exited_at:{seconds:1752556439 nanos:654204221}" Jul 15 05:13:59.700203 systemd-networkd[1532]: calic19b60b6d32: Link UP Jul 15 05:13:59.700699 systemd-networkd[1532]: calic19b60b6d32: Gained carrier Jul 15 05:13:59.714290 containerd[1622]: 2025-07-15 05:13:59.069 [INFO][4175] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 15 05:13:59.714290 containerd[1622]: 2025-07-15 05:13:59.204 [INFO][4175] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--698cd4f4f8--qvkck-eth0 whisker-698cd4f4f8- calico-system c24e108b-2014-4b27-90f7-36b20c6e1f40 877 0 2025-07-15 05:13:58 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:698cd4f4f8 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] 
[]} {k8s localhost whisker-698cd4f4f8-qvkck eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calic19b60b6d32 [] [] }} ContainerID="c95388bd9655267ca47db2586ad5068b86f35f75082f30619a2f7714c86365e6" Namespace="calico-system" Pod="whisker-698cd4f4f8-qvkck" WorkloadEndpoint="localhost-k8s-whisker--698cd4f4f8--qvkck-" Jul 15 05:13:59.714290 containerd[1622]: 2025-07-15 05:13:59.205 [INFO][4175] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c95388bd9655267ca47db2586ad5068b86f35f75082f30619a2f7714c86365e6" Namespace="calico-system" Pod="whisker-698cd4f4f8-qvkck" WorkloadEndpoint="localhost-k8s-whisker--698cd4f4f8--qvkck-eth0" Jul 15 05:13:59.714290 containerd[1622]: 2025-07-15 05:13:59.642 [INFO][4197] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c95388bd9655267ca47db2586ad5068b86f35f75082f30619a2f7714c86365e6" HandleID="k8s-pod-network.c95388bd9655267ca47db2586ad5068b86f35f75082f30619a2f7714c86365e6" Workload="localhost-k8s-whisker--698cd4f4f8--qvkck-eth0" Jul 15 05:13:59.714728 containerd[1622]: 2025-07-15 05:13:59.649 [INFO][4197] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c95388bd9655267ca47db2586ad5068b86f35f75082f30619a2f7714c86365e6" HandleID="k8s-pod-network.c95388bd9655267ca47db2586ad5068b86f35f75082f30619a2f7714c86365e6" Workload="localhost-k8s-whisker--698cd4f4f8--qvkck-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000103740), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-698cd4f4f8-qvkck", "timestamp":"2025-07-15 05:13:59.642469296 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:13:59.714728 containerd[1622]: 2025-07-15 05:13:59.649 [INFO][4197] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Jul 15 05:13:59.714728 containerd[1622]: 2025-07-15 05:13:59.650 [INFO][4197] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 15 05:13:59.714728 containerd[1622]: 2025-07-15 05:13:59.651 [INFO][4197] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jul 15 05:13:59.714728 containerd[1622]: 2025-07-15 05:13:59.672 [INFO][4197] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c95388bd9655267ca47db2586ad5068b86f35f75082f30619a2f7714c86365e6" host="localhost" Jul 15 05:13:59.714728 containerd[1622]: 2025-07-15 05:13:59.680 [INFO][4197] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Jul 15 05:13:59.714728 containerd[1622]: 2025-07-15 05:13:59.683 [INFO][4197] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Jul 15 05:13:59.714728 containerd[1622]: 2025-07-15 05:13:59.684 [INFO][4197] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jul 15 05:13:59.714728 containerd[1622]: 2025-07-15 05:13:59.685 [INFO][4197] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jul 15 05:13:59.714728 containerd[1622]: 2025-07-15 05:13:59.685 [INFO][4197] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.c95388bd9655267ca47db2586ad5068b86f35f75082f30619a2f7714c86365e6" host="localhost" Jul 15 05:13:59.715539 containerd[1622]: 2025-07-15 05:13:59.686 [INFO][4197] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c95388bd9655267ca47db2586ad5068b86f35f75082f30619a2f7714c86365e6 Jul 15 05:13:59.715539 containerd[1622]: 2025-07-15 05:13:59.688 [INFO][4197] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.c95388bd9655267ca47db2586ad5068b86f35f75082f30619a2f7714c86365e6" host="localhost" Jul 15 05:13:59.715539 containerd[1622]: 2025-07-15 05:13:59.692 [INFO][4197] 
ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.c95388bd9655267ca47db2586ad5068b86f35f75082f30619a2f7714c86365e6" host="localhost" Jul 15 05:13:59.715539 containerd[1622]: 2025-07-15 05:13:59.693 [INFO][4197] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.c95388bd9655267ca47db2586ad5068b86f35f75082f30619a2f7714c86365e6" host="localhost" Jul 15 05:13:59.715539 containerd[1622]: 2025-07-15 05:13:59.693 [INFO][4197] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:13:59.715539 containerd[1622]: 2025-07-15 05:13:59.693 [INFO][4197] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="c95388bd9655267ca47db2586ad5068b86f35f75082f30619a2f7714c86365e6" HandleID="k8s-pod-network.c95388bd9655267ca47db2586ad5068b86f35f75082f30619a2f7714c86365e6" Workload="localhost-k8s-whisker--698cd4f4f8--qvkck-eth0" Jul 15 05:13:59.717012 containerd[1622]: 2025-07-15 05:13:59.695 [INFO][4175] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c95388bd9655267ca47db2586ad5068b86f35f75082f30619a2f7714c86365e6" Namespace="calico-system" Pod="whisker-698cd4f4f8-qvkck" WorkloadEndpoint="localhost-k8s-whisker--698cd4f4f8--qvkck-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--698cd4f4f8--qvkck-eth0", GenerateName:"whisker-698cd4f4f8-", Namespace:"calico-system", SelfLink:"", UID:"c24e108b-2014-4b27-90f7-36b20c6e1f40", ResourceVersion:"877", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 13, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"698cd4f4f8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-698cd4f4f8-qvkck", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calic19b60b6d32", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:13:59.717012 containerd[1622]: 2025-07-15 05:13:59.695 [INFO][4175] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="c95388bd9655267ca47db2586ad5068b86f35f75082f30619a2f7714c86365e6" Namespace="calico-system" Pod="whisker-698cd4f4f8-qvkck" WorkloadEndpoint="localhost-k8s-whisker--698cd4f4f8--qvkck-eth0" Jul 15 05:13:59.717124 containerd[1622]: 2025-07-15 05:13:59.695 [INFO][4175] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic19b60b6d32 ContainerID="c95388bd9655267ca47db2586ad5068b86f35f75082f30619a2f7714c86365e6" Namespace="calico-system" Pod="whisker-698cd4f4f8-qvkck" WorkloadEndpoint="localhost-k8s-whisker--698cd4f4f8--qvkck-eth0" Jul 15 05:13:59.717124 containerd[1622]: 2025-07-15 05:13:59.705 [INFO][4175] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c95388bd9655267ca47db2586ad5068b86f35f75082f30619a2f7714c86365e6" Namespace="calico-system" Pod="whisker-698cd4f4f8-qvkck" WorkloadEndpoint="localhost-k8s-whisker--698cd4f4f8--qvkck-eth0" Jul 15 05:13:59.717787 containerd[1622]: 2025-07-15 05:13:59.705 [INFO][4175] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c95388bd9655267ca47db2586ad5068b86f35f75082f30619a2f7714c86365e6" 
Namespace="calico-system" Pod="whisker-698cd4f4f8-qvkck" WorkloadEndpoint="localhost-k8s-whisker--698cd4f4f8--qvkck-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--698cd4f4f8--qvkck-eth0", GenerateName:"whisker-698cd4f4f8-", Namespace:"calico-system", SelfLink:"", UID:"c24e108b-2014-4b27-90f7-36b20c6e1f40", ResourceVersion:"877", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 13, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"698cd4f4f8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"c95388bd9655267ca47db2586ad5068b86f35f75082f30619a2f7714c86365e6", Pod:"whisker-698cd4f4f8-qvkck", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calic19b60b6d32", MAC:"36:bc:52:d6:f2:75", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:13:59.717839 containerd[1622]: 2025-07-15 05:13:59.711 [INFO][4175] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c95388bd9655267ca47db2586ad5068b86f35f75082f30619a2f7714c86365e6" Namespace="calico-system" Pod="whisker-698cd4f4f8-qvkck" WorkloadEndpoint="localhost-k8s-whisker--698cd4f4f8--qvkck-eth0" Jul 15 05:13:59.838739 containerd[1622]: 
time="2025-07-15T05:13:59.838228206Z" level=info msg="connecting to shim c95388bd9655267ca47db2586ad5068b86f35f75082f30619a2f7714c86365e6" address="unix:///run/containerd/s/2058ee3d185c7fdb24d744d44c0e481af65e418f042d27f652d2b909e255d672" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:13:59.860045 systemd[1]: Started cri-containerd-c95388bd9655267ca47db2586ad5068b86f35f75082f30619a2f7714c86365e6.scope - libcontainer container c95388bd9655267ca47db2586ad5068b86f35f75082f30619a2f7714c86365e6. Jul 15 05:13:59.871376 systemd-resolved[1533]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jul 15 05:13:59.911165 containerd[1622]: time="2025-07-15T05:13:59.911140638Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-698cd4f4f8-qvkck,Uid:c24e108b-2014-4b27-90f7-36b20c6e1f40,Namespace:calico-system,Attempt:0,} returns sandbox id \"c95388bd9655267ca47db2586ad5068b86f35f75082f30619a2f7714c86365e6\"" Jul 15 05:13:59.995262 containerd[1622]: time="2025-07-15T05:13:59.995234959Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\"" Jul 15 05:14:00.333000 kubelet[2951]: I0715 05:14:00.332969 2951 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70fda2d5-6b0b-4f67-9af3-2ff1dbc1d5a8" path="/var/lib/kubelet/pods/70fda2d5-6b0b-4f67-9af3-2ff1dbc1d5a8/volumes" Jul 15 05:14:01.306915 containerd[1622]: time="2025-07-15T05:14:01.306843949Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:01.307310 containerd[1622]: time="2025-07-15T05:14:01.307271140Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.2: active requests=0, bytes read=4661207" Jul 15 05:14:01.307676 containerd[1622]: time="2025-07-15T05:14:01.307646922Z" level=info msg="ImageCreate event name:\"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:01.308878 containerd[1622]: time="2025-07-15T05:14:01.308848273Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:01.309525 containerd[1622]: time="2025-07-15T05:14:01.309264093Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.2\" with image id \"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\", size \"6153902\" in 1.314004747s" Jul 15 05:14:01.309525 containerd[1622]: time="2025-07-15T05:14:01.309283401Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\" returns image reference \"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\"" Jul 15 05:14:01.310959 containerd[1622]: time="2025-07-15T05:14:01.310937343Z" level=info msg="CreateContainer within sandbox \"c95388bd9655267ca47db2586ad5068b86f35f75082f30619a2f7714c86365e6\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Jul 15 05:14:01.316821 containerd[1622]: time="2025-07-15T05:14:01.316793880Z" level=info msg="Container b0c747a8e3b10c441702a2d43454120327365436b51672a0298a120e8e31f093: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:14:01.317167 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1689346923.mount: Deactivated successfully. 
Jul 15 05:14:01.320999 containerd[1622]: time="2025-07-15T05:14:01.320960815Z" level=info msg="CreateContainer within sandbox \"c95388bd9655267ca47db2586ad5068b86f35f75082f30619a2f7714c86365e6\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"b0c747a8e3b10c441702a2d43454120327365436b51672a0298a120e8e31f093\"" Jul 15 05:14:01.321656 containerd[1622]: time="2025-07-15T05:14:01.321394429Z" level=info msg="StartContainer for \"b0c747a8e3b10c441702a2d43454120327365436b51672a0298a120e8e31f093\"" Jul 15 05:14:01.322019 containerd[1622]: time="2025-07-15T05:14:01.321991631Z" level=info msg="connecting to shim b0c747a8e3b10c441702a2d43454120327365436b51672a0298a120e8e31f093" address="unix:///run/containerd/s/2058ee3d185c7fdb24d744d44c0e481af65e418f042d27f652d2b909e255d672" protocol=ttrpc version=3 Jul 15 05:14:01.340024 systemd[1]: Started cri-containerd-b0c747a8e3b10c441702a2d43454120327365436b51672a0298a120e8e31f093.scope - libcontainer container b0c747a8e3b10c441702a2d43454120327365436b51672a0298a120e8e31f093. Jul 15 05:14:01.387574 containerd[1622]: time="2025-07-15T05:14:01.387542465Z" level=info msg="StartContainer for \"b0c747a8e3b10c441702a2d43454120327365436b51672a0298a120e8e31f093\" returns successfully" Jul 15 05:14:01.443230 containerd[1622]: time="2025-07-15T05:14:01.443199205Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\"" Jul 15 05:14:01.464057 systemd-networkd[1532]: calic19b60b6d32: Gained IPv6LL Jul 15 05:14:01.528060 systemd-networkd[1532]: vxlan.calico: Gained IPv6LL Jul 15 05:14:03.366858 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3234163548.mount: Deactivated successfully. 
Jul 15 05:14:03.374785 containerd[1622]: time="2025-07-15T05:14:03.374561475Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:03.375064 containerd[1622]: time="2025-07-15T05:14:03.374958241Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.2: active requests=0, bytes read=33083477" Jul 15 05:14:03.376454 containerd[1622]: time="2025-07-15T05:14:03.376434463Z" level=info msg="ImageCreate event name:\"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:03.377846 containerd[1622]: time="2025-07-15T05:14:03.377813346Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:03.378700 containerd[1622]: time="2025-07-15T05:14:03.378387147Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" with image id \"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\", size \"33083307\" in 1.935164293s" Jul 15 05:14:03.378700 containerd[1622]: time="2025-07-15T05:14:03.378409584Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" returns image reference \"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\"" Jul 15 05:14:03.380789 containerd[1622]: time="2025-07-15T05:14:03.380745990Z" level=info msg="CreateContainer within sandbox \"c95388bd9655267ca47db2586ad5068b86f35f75082f30619a2f7714c86365e6\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Jul 15 05:14:03.385781 
containerd[1622]: time="2025-07-15T05:14:03.384554382Z" level=info msg="Container 1fcb7354095120fca77d774b81a6a8f5e8d05d1cf30afee0574f670bdf634c9e: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:14:03.389174 containerd[1622]: time="2025-07-15T05:14:03.389146244Z" level=info msg="CreateContainer within sandbox \"c95388bd9655267ca47db2586ad5068b86f35f75082f30619a2f7714c86365e6\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"1fcb7354095120fca77d774b81a6a8f5e8d05d1cf30afee0574f670bdf634c9e\"" Jul 15 05:14:03.389536 containerd[1622]: time="2025-07-15T05:14:03.389524508Z" level=info msg="StartContainer for \"1fcb7354095120fca77d774b81a6a8f5e8d05d1cf30afee0574f670bdf634c9e\"" Jul 15 05:14:03.396039 containerd[1622]: time="2025-07-15T05:14:03.391061331Z" level=info msg="connecting to shim 1fcb7354095120fca77d774b81a6a8f5e8d05d1cf30afee0574f670bdf634c9e" address="unix:///run/containerd/s/2058ee3d185c7fdb24d744d44c0e481af65e418f042d27f652d2b909e255d672" protocol=ttrpc version=3 Jul 15 05:14:03.416068 systemd[1]: Started cri-containerd-1fcb7354095120fca77d774b81a6a8f5e8d05d1cf30afee0574f670bdf634c9e.scope - libcontainer container 1fcb7354095120fca77d774b81a6a8f5e8d05d1cf30afee0574f670bdf634c9e. 
Jul 15 05:14:03.459026 containerd[1622]: time="2025-07-15T05:14:03.459000817Z" level=info msg="StartContainer for \"1fcb7354095120fca77d774b81a6a8f5e8d05d1cf30afee0574f670bdf634c9e\" returns successfully" Jul 15 05:14:04.319274 containerd[1622]: time="2025-07-15T05:14:04.319029670Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5cx6x,Uid:42f78827-b3e3-4b25-9464-bcf35efc21f2,Namespace:calico-system,Attempt:0,}" Jul 15 05:14:04.320063 containerd[1622]: time="2025-07-15T05:14:04.319030022Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5596dfd6cb-c84wc,Uid:77f25f3e-17e6-48a5-bd0c-d1425f534cb3,Namespace:calico-apiserver,Attempt:0,}" Jul 15 05:14:04.320063 containerd[1622]: time="2025-07-15T05:14:04.319944880Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-849d984976-9xsth,Uid:21b748ad-b2a9-4a0e-8b89-475f2acae024,Namespace:calico-apiserver,Attempt:0,}" Jul 15 05:14:04.478169 systemd-networkd[1532]: cali5e81a1d106b: Link UP Jul 15 05:14:04.478780 systemd-networkd[1532]: cali5e81a1d106b: Gained carrier Jul 15 05:14:04.505206 containerd[1622]: 2025-07-15 05:14:04.364 [INFO][4467] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--5cx6x-eth0 csi-node-driver- calico-system 42f78827-b3e3-4b25-9464-bcf35efc21f2 701 0 2025-07-15 05:13:42 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:57bd658777 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-5cx6x eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali5e81a1d106b [] [] }} ContainerID="caa6352d83201d3fdceab970cf01694ca47ba893ecadb911f0a18e0b0c42c533" Namespace="calico-system" Pod="csi-node-driver-5cx6x" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--5cx6x-" Jul 15 05:14:04.505206 containerd[1622]: 2025-07-15 05:14:04.365 [INFO][4467] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="caa6352d83201d3fdceab970cf01694ca47ba893ecadb911f0a18e0b0c42c533" Namespace="calico-system" Pod="csi-node-driver-5cx6x" WorkloadEndpoint="localhost-k8s-csi--node--driver--5cx6x-eth0" Jul 15 05:14:04.505206 containerd[1622]: 2025-07-15 05:14:04.399 [INFO][4497] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="caa6352d83201d3fdceab970cf01694ca47ba893ecadb911f0a18e0b0c42c533" HandleID="k8s-pod-network.caa6352d83201d3fdceab970cf01694ca47ba893ecadb911f0a18e0b0c42c533" Workload="localhost-k8s-csi--node--driver--5cx6x-eth0" Jul 15 05:14:04.505734 containerd[1622]: 2025-07-15 05:14:04.400 [INFO][4497] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="caa6352d83201d3fdceab970cf01694ca47ba893ecadb911f0a18e0b0c42c533" HandleID="k8s-pod-network.caa6352d83201d3fdceab970cf01694ca47ba893ecadb911f0a18e0b0c42c533" Workload="localhost-k8s-csi--node--driver--5cx6x-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5730), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-5cx6x", "timestamp":"2025-07-15 05:14:04.399676927 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:14:04.505734 containerd[1622]: 2025-07-15 05:14:04.400 [INFO][4497] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:14:04.505734 containerd[1622]: 2025-07-15 05:14:04.400 [INFO][4497] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 05:14:04.505734 containerd[1622]: 2025-07-15 05:14:04.400 [INFO][4497] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jul 15 05:14:04.505734 containerd[1622]: 2025-07-15 05:14:04.406 [INFO][4497] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.caa6352d83201d3fdceab970cf01694ca47ba893ecadb911f0a18e0b0c42c533" host="localhost" Jul 15 05:14:04.505734 containerd[1622]: 2025-07-15 05:14:04.451 [INFO][4497] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Jul 15 05:14:04.505734 containerd[1622]: 2025-07-15 05:14:04.454 [INFO][4497] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Jul 15 05:14:04.505734 containerd[1622]: 2025-07-15 05:14:04.455 [INFO][4497] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jul 15 05:14:04.505734 containerd[1622]: 2025-07-15 05:14:04.456 [INFO][4497] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jul 15 05:14:04.505734 containerd[1622]: 2025-07-15 05:14:04.456 [INFO][4497] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.caa6352d83201d3fdceab970cf01694ca47ba893ecadb911f0a18e0b0c42c533" host="localhost" Jul 15 05:14:04.506053 containerd[1622]: 2025-07-15 05:14:04.457 [INFO][4497] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.caa6352d83201d3fdceab970cf01694ca47ba893ecadb911f0a18e0b0c42c533 Jul 15 05:14:04.506053 containerd[1622]: 2025-07-15 05:14:04.465 [INFO][4497] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.caa6352d83201d3fdceab970cf01694ca47ba893ecadb911f0a18e0b0c42c533" host="localhost" Jul 15 05:14:04.506053 containerd[1622]: 2025-07-15 05:14:04.473 [INFO][4497] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 
handle="k8s-pod-network.caa6352d83201d3fdceab970cf01694ca47ba893ecadb911f0a18e0b0c42c533" host="localhost" Jul 15 05:14:04.506053 containerd[1622]: 2025-07-15 05:14:04.473 [INFO][4497] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.caa6352d83201d3fdceab970cf01694ca47ba893ecadb911f0a18e0b0c42c533" host="localhost" Jul 15 05:14:04.506053 containerd[1622]: 2025-07-15 05:14:04.473 [INFO][4497] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:14:04.506053 containerd[1622]: 2025-07-15 05:14:04.473 [INFO][4497] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="caa6352d83201d3fdceab970cf01694ca47ba893ecadb911f0a18e0b0c42c533" HandleID="k8s-pod-network.caa6352d83201d3fdceab970cf01694ca47ba893ecadb911f0a18e0b0c42c533" Workload="localhost-k8s-csi--node--driver--5cx6x-eth0" Jul 15 05:14:04.515672 containerd[1622]: 2025-07-15 05:14:04.475 [INFO][4467] cni-plugin/k8s.go 418: Populated endpoint ContainerID="caa6352d83201d3fdceab970cf01694ca47ba893ecadb911f0a18e0b0c42c533" Namespace="calico-system" Pod="csi-node-driver-5cx6x" WorkloadEndpoint="localhost-k8s-csi--node--driver--5cx6x-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--5cx6x-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"42f78827-b3e3-4b25-9464-bcf35efc21f2", ResourceVersion:"701", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 13, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"57bd658777", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-5cx6x", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali5e81a1d106b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:14:04.515746 containerd[1622]: 2025-07-15 05:14:04.475 [INFO][4467] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="caa6352d83201d3fdceab970cf01694ca47ba893ecadb911f0a18e0b0c42c533" Namespace="calico-system" Pod="csi-node-driver-5cx6x" WorkloadEndpoint="localhost-k8s-csi--node--driver--5cx6x-eth0" Jul 15 05:14:04.515746 containerd[1622]: 2025-07-15 05:14:04.475 [INFO][4467] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5e81a1d106b ContainerID="caa6352d83201d3fdceab970cf01694ca47ba893ecadb911f0a18e0b0c42c533" Namespace="calico-system" Pod="csi-node-driver-5cx6x" WorkloadEndpoint="localhost-k8s-csi--node--driver--5cx6x-eth0" Jul 15 05:14:04.515746 containerd[1622]: 2025-07-15 05:14:04.479 [INFO][4467] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="caa6352d83201d3fdceab970cf01694ca47ba893ecadb911f0a18e0b0c42c533" Namespace="calico-system" Pod="csi-node-driver-5cx6x" WorkloadEndpoint="localhost-k8s-csi--node--driver--5cx6x-eth0" Jul 15 05:14:04.515817 containerd[1622]: 2025-07-15 05:14:04.479 [INFO][4467] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="caa6352d83201d3fdceab970cf01694ca47ba893ecadb911f0a18e0b0c42c533" 
Namespace="calico-system" Pod="csi-node-driver-5cx6x" WorkloadEndpoint="localhost-k8s-csi--node--driver--5cx6x-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--5cx6x-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"42f78827-b3e3-4b25-9464-bcf35efc21f2", ResourceVersion:"701", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 13, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"57bd658777", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"caa6352d83201d3fdceab970cf01694ca47ba893ecadb911f0a18e0b0c42c533", Pod:"csi-node-driver-5cx6x", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali5e81a1d106b", MAC:"da:54:a9:be:95:d1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:14:04.515912 containerd[1622]: 2025-07-15 05:14:04.503 [INFO][4467] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="caa6352d83201d3fdceab970cf01694ca47ba893ecadb911f0a18e0b0c42c533" Namespace="calico-system" Pod="csi-node-driver-5cx6x" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--5cx6x-eth0" Jul 15 05:14:04.604821 kubelet[2951]: I0715 05:14:04.575169 2951 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-698cd4f4f8-qvkck" podStartSLOduration=3.081459487 podStartE2EDuration="6.502491464s" podCreationTimestamp="2025-07-15 05:13:58 +0000 UTC" firstStartedPulling="2025-07-15 05:13:59.957933552 +0000 UTC m=+37.737244410" lastFinishedPulling="2025-07-15 05:14:03.37896553 +0000 UTC m=+41.158276387" observedRunningTime="2025-07-15 05:14:03.597431134 +0000 UTC m=+41.376742013" watchObservedRunningTime="2025-07-15 05:14:04.502491464 +0000 UTC m=+42.281802325" Jul 15 05:14:04.632636 systemd-networkd[1532]: calidab1d4645df: Link UP Jul 15 05:14:04.633659 systemd-networkd[1532]: calidab1d4645df: Gained carrier Jul 15 05:14:04.664468 containerd[1622]: 2025-07-15 05:14:04.369 [INFO][4465] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--849d984976--9xsth-eth0 calico-apiserver-849d984976- calico-apiserver 21b748ad-b2a9-4a0e-8b89-475f2acae024 814 0 2025-07-15 05:13:40 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:849d984976 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-849d984976-9xsth eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calidab1d4645df [] [] }} ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" Namespace="calico-apiserver" Pod="calico-apiserver-849d984976-9xsth" WorkloadEndpoint="localhost-k8s-calico--apiserver--849d984976--9xsth-" Jul 15 05:14:04.664468 containerd[1622]: 2025-07-15 05:14:04.369 [INFO][4465] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" Namespace="calico-apiserver" Pod="calico-apiserver-849d984976-9xsth" WorkloadEndpoint="localhost-k8s-calico--apiserver--849d984976--9xsth-eth0" Jul 15 05:14:04.664468 containerd[1622]: 2025-07-15 05:14:04.399 [INFO][4503] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" HandleID="k8s-pod-network.c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" Workload="localhost-k8s-calico--apiserver--849d984976--9xsth-eth0" Jul 15 05:14:04.664695 containerd[1622]: 2025-07-15 05:14:04.400 [INFO][4503] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" HandleID="k8s-pod-network.c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" Workload="localhost-k8s-calico--apiserver--849d984976--9xsth-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f870), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-849d984976-9xsth", "timestamp":"2025-07-15 05:14:04.399259847 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:14:04.664695 containerd[1622]: 2025-07-15 05:14:04.400 [INFO][4503] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:14:04.664695 containerd[1622]: 2025-07-15 05:14:04.473 [INFO][4503] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 05:14:04.664695 containerd[1622]: 2025-07-15 05:14:04.473 [INFO][4503] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jul 15 05:14:04.664695 containerd[1622]: 2025-07-15 05:14:04.507 [INFO][4503] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" host="localhost" Jul 15 05:14:04.664695 containerd[1622]: 2025-07-15 05:14:04.521 [INFO][4503] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Jul 15 05:14:04.664695 containerd[1622]: 2025-07-15 05:14:04.556 [INFO][4503] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Jul 15 05:14:04.664695 containerd[1622]: 2025-07-15 05:14:04.558 [INFO][4503] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jul 15 05:14:04.664695 containerd[1622]: 2025-07-15 05:14:04.559 [INFO][4503] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jul 15 05:14:04.664695 containerd[1622]: 2025-07-15 05:14:04.559 [INFO][4503] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" host="localhost" Jul 15 05:14:04.666436 containerd[1622]: 2025-07-15 05:14:04.560 [INFO][4503] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458 Jul 15 05:14:04.666436 containerd[1622]: 2025-07-15 05:14:04.594 [INFO][4503] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" host="localhost" Jul 15 05:14:04.666436 containerd[1622]: 2025-07-15 05:14:04.628 [INFO][4503] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 
handle="k8s-pod-network.c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" host="localhost" Jul 15 05:14:04.666436 containerd[1622]: 2025-07-15 05:14:04.628 [INFO][4503] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" host="localhost" Jul 15 05:14:04.666436 containerd[1622]: 2025-07-15 05:14:04.628 [INFO][4503] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:14:04.666436 containerd[1622]: 2025-07-15 05:14:04.628 [INFO][4503] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" HandleID="k8s-pod-network.c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" Workload="localhost-k8s-calico--apiserver--849d984976--9xsth-eth0" Jul 15 05:14:04.666789 containerd[1622]: 2025-07-15 05:14:04.630 [INFO][4465] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" Namespace="calico-apiserver" Pod="calico-apiserver-849d984976-9xsth" WorkloadEndpoint="localhost-k8s-calico--apiserver--849d984976--9xsth-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--849d984976--9xsth-eth0", GenerateName:"calico-apiserver-849d984976-", Namespace:"calico-apiserver", SelfLink:"", UID:"21b748ad-b2a9-4a0e-8b89-475f2acae024", ResourceVersion:"814", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 13, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"849d984976", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-849d984976-9xsth", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calidab1d4645df", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:14:04.667416 containerd[1622]: 2025-07-15 05:14:04.630 [INFO][4465] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" Namespace="calico-apiserver" Pod="calico-apiserver-849d984976-9xsth" WorkloadEndpoint="localhost-k8s-calico--apiserver--849d984976--9xsth-eth0" Jul 15 05:14:04.667416 containerd[1622]: 2025-07-15 05:14:04.630 [INFO][4465] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidab1d4645df ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" Namespace="calico-apiserver" Pod="calico-apiserver-849d984976-9xsth" WorkloadEndpoint="localhost-k8s-calico--apiserver--849d984976--9xsth-eth0" Jul 15 05:14:04.667416 containerd[1622]: 2025-07-15 05:14:04.634 [INFO][4465] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" Namespace="calico-apiserver" Pod="calico-apiserver-849d984976-9xsth" WorkloadEndpoint="localhost-k8s-calico--apiserver--849d984976--9xsth-eth0" Jul 15 05:14:04.667499 containerd[1622]: 2025-07-15 05:14:04.634 [INFO][4465] cni-plugin/k8s.go 446: Added Mac, interface name, and active 
container ID to endpoint ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" Namespace="calico-apiserver" Pod="calico-apiserver-849d984976-9xsth" WorkloadEndpoint="localhost-k8s-calico--apiserver--849d984976--9xsth-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--849d984976--9xsth-eth0", GenerateName:"calico-apiserver-849d984976-", Namespace:"calico-apiserver", SelfLink:"", UID:"21b748ad-b2a9-4a0e-8b89-475f2acae024", ResourceVersion:"814", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 13, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"849d984976", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458", Pod:"calico-apiserver-849d984976-9xsth", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calidab1d4645df", MAC:"fa:cf:06:d2:a1:7f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:14:04.667556 containerd[1622]: 2025-07-15 05:14:04.660 [INFO][4465] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" Namespace="calico-apiserver" Pod="calico-apiserver-849d984976-9xsth" WorkloadEndpoint="localhost-k8s-calico--apiserver--849d984976--9xsth-eth0" Jul 15 05:14:04.683615 containerd[1622]: time="2025-07-15T05:14:04.683556550Z" level=info msg="connecting to shim caa6352d83201d3fdceab970cf01694ca47ba893ecadb911f0a18e0b0c42c533" address="unix:///run/containerd/s/74141b00043189a4a640f59ef7cb1e31e2d39ce9569660251a9b42e3bbf237a9" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:14:04.698129 systemd-networkd[1532]: calia2eda2a87cf: Link UP Jul 15 05:14:04.699408 systemd-networkd[1532]: calia2eda2a87cf: Gained carrier Jul 15 05:14:04.714406 containerd[1622]: 2025-07-15 05:14:04.384 [INFO][4476] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--5596dfd6cb--c84wc-eth0 calico-apiserver-5596dfd6cb- calico-apiserver 77f25f3e-17e6-48a5-bd0c-d1425f534cb3 818 0 2025-07-15 05:13:40 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5596dfd6cb projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-5596dfd6cb-c84wc eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calia2eda2a87cf [] [] }} ContainerID="db23cb679e1a2c9755af633931c1f3fcaff77dc5a2f0066c9f3651f1fd58a21a" Namespace="calico-apiserver" Pod="calico-apiserver-5596dfd6cb-c84wc" WorkloadEndpoint="localhost-k8s-calico--apiserver--5596dfd6cb--c84wc-" Jul 15 05:14:04.714406 containerd[1622]: 2025-07-15 05:14:04.384 [INFO][4476] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="db23cb679e1a2c9755af633931c1f3fcaff77dc5a2f0066c9f3651f1fd58a21a" Namespace="calico-apiserver" Pod="calico-apiserver-5596dfd6cb-c84wc" 
WorkloadEndpoint="localhost-k8s-calico--apiserver--5596dfd6cb--c84wc-eth0" Jul 15 05:14:04.714406 containerd[1622]: 2025-07-15 05:14:04.451 [INFO][4512] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="db23cb679e1a2c9755af633931c1f3fcaff77dc5a2f0066c9f3651f1fd58a21a" HandleID="k8s-pod-network.db23cb679e1a2c9755af633931c1f3fcaff77dc5a2f0066c9f3651f1fd58a21a" Workload="localhost-k8s-calico--apiserver--5596dfd6cb--c84wc-eth0" Jul 15 05:14:04.714583 containerd[1622]: 2025-07-15 05:14:04.451 [INFO][4512] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="db23cb679e1a2c9755af633931c1f3fcaff77dc5a2f0066c9f3651f1fd58a21a" HandleID="k8s-pod-network.db23cb679e1a2c9755af633931c1f3fcaff77dc5a2f0066c9f3651f1fd58a21a" Workload="localhost-k8s-calico--apiserver--5596dfd6cb--c84wc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ad4a0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-5596dfd6cb-c84wc", "timestamp":"2025-07-15 05:14:04.451073522 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:14:04.714583 containerd[1622]: 2025-07-15 05:14:04.451 [INFO][4512] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:14:04.714583 containerd[1622]: 2025-07-15 05:14:04.628 [INFO][4512] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 05:14:04.714583 containerd[1622]: 2025-07-15 05:14:04.628 [INFO][4512] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jul 15 05:14:04.714583 containerd[1622]: 2025-07-15 05:14:04.640 [INFO][4512] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.db23cb679e1a2c9755af633931c1f3fcaff77dc5a2f0066c9f3651f1fd58a21a" host="localhost" Jul 15 05:14:04.714583 containerd[1622]: 2025-07-15 05:14:04.660 [INFO][4512] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Jul 15 05:14:04.714583 containerd[1622]: 2025-07-15 05:14:04.666 [INFO][4512] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Jul 15 05:14:04.714583 containerd[1622]: 2025-07-15 05:14:04.668 [INFO][4512] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jul 15 05:14:04.714583 containerd[1622]: 2025-07-15 05:14:04.674 [INFO][4512] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jul 15 05:14:04.714583 containerd[1622]: 2025-07-15 05:14:04.674 [INFO][4512] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.db23cb679e1a2c9755af633931c1f3fcaff77dc5a2f0066c9f3651f1fd58a21a" host="localhost" Jul 15 05:14:04.714776 containerd[1622]: 2025-07-15 05:14:04.675 [INFO][4512] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.db23cb679e1a2c9755af633931c1f3fcaff77dc5a2f0066c9f3651f1fd58a21a Jul 15 05:14:04.714776 containerd[1622]: 2025-07-15 05:14:04.679 [INFO][4512] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.db23cb679e1a2c9755af633931c1f3fcaff77dc5a2f0066c9f3651f1fd58a21a" host="localhost" Jul 15 05:14:04.714776 containerd[1622]: 2025-07-15 05:14:04.685 [INFO][4512] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 
handle="k8s-pod-network.db23cb679e1a2c9755af633931c1f3fcaff77dc5a2f0066c9f3651f1fd58a21a" host="localhost" Jul 15 05:14:04.714776 containerd[1622]: 2025-07-15 05:14:04.685 [INFO][4512] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.db23cb679e1a2c9755af633931c1f3fcaff77dc5a2f0066c9f3651f1fd58a21a" host="localhost" Jul 15 05:14:04.714776 containerd[1622]: 2025-07-15 05:14:04.685 [INFO][4512] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:14:04.714776 containerd[1622]: 2025-07-15 05:14:04.686 [INFO][4512] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="db23cb679e1a2c9755af633931c1f3fcaff77dc5a2f0066c9f3651f1fd58a21a" HandleID="k8s-pod-network.db23cb679e1a2c9755af633931c1f3fcaff77dc5a2f0066c9f3651f1fd58a21a" Workload="localhost-k8s-calico--apiserver--5596dfd6cb--c84wc-eth0" Jul 15 05:14:04.714890 containerd[1622]: 2025-07-15 05:14:04.694 [INFO][4476] cni-plugin/k8s.go 418: Populated endpoint ContainerID="db23cb679e1a2c9755af633931c1f3fcaff77dc5a2f0066c9f3651f1fd58a21a" Namespace="calico-apiserver" Pod="calico-apiserver-5596dfd6cb-c84wc" WorkloadEndpoint="localhost-k8s-calico--apiserver--5596dfd6cb--c84wc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5596dfd6cb--c84wc-eth0", GenerateName:"calico-apiserver-5596dfd6cb-", Namespace:"calico-apiserver", SelfLink:"", UID:"77f25f3e-17e6-48a5-bd0c-d1425f534cb3", ResourceVersion:"818", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 13, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5596dfd6cb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-5596dfd6cb-c84wc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia2eda2a87cf", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:14:04.716046 containerd[1622]: 2025-07-15 05:14:04.694 [INFO][4476] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="db23cb679e1a2c9755af633931c1f3fcaff77dc5a2f0066c9f3651f1fd58a21a" Namespace="calico-apiserver" Pod="calico-apiserver-5596dfd6cb-c84wc" WorkloadEndpoint="localhost-k8s-calico--apiserver--5596dfd6cb--c84wc-eth0" Jul 15 05:14:04.716046 containerd[1622]: 2025-07-15 05:14:04.694 [INFO][4476] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia2eda2a87cf ContainerID="db23cb679e1a2c9755af633931c1f3fcaff77dc5a2f0066c9f3651f1fd58a21a" Namespace="calico-apiserver" Pod="calico-apiserver-5596dfd6cb-c84wc" WorkloadEndpoint="localhost-k8s-calico--apiserver--5596dfd6cb--c84wc-eth0" Jul 15 05:14:04.716046 containerd[1622]: 2025-07-15 05:14:04.699 [INFO][4476] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="db23cb679e1a2c9755af633931c1f3fcaff77dc5a2f0066c9f3651f1fd58a21a" Namespace="calico-apiserver" Pod="calico-apiserver-5596dfd6cb-c84wc" WorkloadEndpoint="localhost-k8s-calico--apiserver--5596dfd6cb--c84wc-eth0" Jul 15 05:14:04.716446 containerd[1622]: 2025-07-15 05:14:04.700 [INFO][4476] cni-plugin/k8s.go 446: Added Mac, interface name, and active 
container ID to endpoint ContainerID="db23cb679e1a2c9755af633931c1f3fcaff77dc5a2f0066c9f3651f1fd58a21a" Namespace="calico-apiserver" Pod="calico-apiserver-5596dfd6cb-c84wc" WorkloadEndpoint="localhost-k8s-calico--apiserver--5596dfd6cb--c84wc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5596dfd6cb--c84wc-eth0", GenerateName:"calico-apiserver-5596dfd6cb-", Namespace:"calico-apiserver", SelfLink:"", UID:"77f25f3e-17e6-48a5-bd0c-d1425f534cb3", ResourceVersion:"818", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 13, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5596dfd6cb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"db23cb679e1a2c9755af633931c1f3fcaff77dc5a2f0066c9f3651f1fd58a21a", Pod:"calico-apiserver-5596dfd6cb-c84wc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia2eda2a87cf", MAC:"7e:7f:ce:55:01:b1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:14:04.716508 containerd[1622]: 2025-07-15 05:14:04.711 [INFO][4476] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="db23cb679e1a2c9755af633931c1f3fcaff77dc5a2f0066c9f3651f1fd58a21a" Namespace="calico-apiserver" Pod="calico-apiserver-5596dfd6cb-c84wc" WorkloadEndpoint="localhost-k8s-calico--apiserver--5596dfd6cb--c84wc-eth0" Jul 15 05:14:04.725229 systemd[1]: Started cri-containerd-caa6352d83201d3fdceab970cf01694ca47ba893ecadb911f0a18e0b0c42c533.scope - libcontainer container caa6352d83201d3fdceab970cf01694ca47ba893ecadb911f0a18e0b0c42c533. Jul 15 05:14:04.738549 systemd-resolved[1533]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jul 15 05:14:04.752886 containerd[1622]: time="2025-07-15T05:14:04.752862021Z" level=info msg="connecting to shim c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" address="unix:///run/containerd/s/edb69ba23473d27bc24c96d78413e4328620ba69ad034abdd5ed6ed95dd2ca9f" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:14:04.768463 systemd[1]: Started cri-containerd-c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458.scope - libcontainer container c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458. 
Jul 15 05:14:04.783578 systemd-resolved[1533]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jul 15 05:14:04.786914 containerd[1622]: time="2025-07-15T05:14:04.786869085Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5cx6x,Uid:42f78827-b3e3-4b25-9464-bcf35efc21f2,Namespace:calico-system,Attempt:0,} returns sandbox id \"caa6352d83201d3fdceab970cf01694ca47ba893ecadb911f0a18e0b0c42c533\"" Jul 15 05:14:04.815085 containerd[1622]: time="2025-07-15T05:14:04.814971722Z" level=info msg="connecting to shim db23cb679e1a2c9755af633931c1f3fcaff77dc5a2f0066c9f3651f1fd58a21a" address="unix:///run/containerd/s/295a2013b3fb428e017fa31fdb52a64fa77d25194d5b1c49533da1e77127d279" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:14:04.828918 containerd[1622]: time="2025-07-15T05:14:04.828792222Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-849d984976-9xsth,Uid:21b748ad-b2a9-4a0e-8b89-475f2acae024,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458\"" Jul 15 05:14:04.833134 systemd[1]: Started cri-containerd-db23cb679e1a2c9755af633931c1f3fcaff77dc5a2f0066c9f3651f1fd58a21a.scope - libcontainer container db23cb679e1a2c9755af633931c1f3fcaff77dc5a2f0066c9f3651f1fd58a21a. 
Jul 15 05:14:04.844994 containerd[1622]: time="2025-07-15T05:14:04.844065695Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\"" Jul 15 05:14:04.853103 systemd-resolved[1533]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jul 15 05:14:04.879816 containerd[1622]: time="2025-07-15T05:14:04.879768251Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5596dfd6cb-c84wc,Uid:77f25f3e-17e6-48a5-bd0c-d1425f534cb3,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"db23cb679e1a2c9755af633931c1f3fcaff77dc5a2f0066c9f3651f1fd58a21a\"" Jul 15 05:14:05.318233 containerd[1622]: time="2025-07-15T05:14:05.317839313Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-lf4wk,Uid:1f9d996a-a917-4f78-a806-a2b8173db749,Namespace:kube-system,Attempt:0,}" Jul 15 05:14:05.318840 containerd[1622]: time="2025-07-15T05:14:05.317868107Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-8nvjp,Uid:f6387e58-6994-4983-b232-ec13cc1ee6a6,Namespace:calico-system,Attempt:0,}" Jul 15 05:14:05.426361 systemd-networkd[1532]: cali519f63d01d4: Link UP Jul 15 05:14:05.427403 systemd-networkd[1532]: cali519f63d01d4: Gained carrier Jul 15 05:14:05.461749 containerd[1622]: 2025-07-15 05:14:05.362 [INFO][4700] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--58fd7646b9--8nvjp-eth0 goldmane-58fd7646b9- calico-system f6387e58-6994-4983-b232-ec13cc1ee6a6 815 0 2025-07-15 05:13:41 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:58fd7646b9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-58fd7646b9-8nvjp eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali519f63d01d4 [] [] }} 
ContainerID="ce54265a1e97cd61b09b661bb894883283b00d23a83af0bf22b9f3ea28982586" Namespace="calico-system" Pod="goldmane-58fd7646b9-8nvjp" WorkloadEndpoint="localhost-k8s-goldmane--58fd7646b9--8nvjp-" Jul 15 05:14:05.461749 containerd[1622]: 2025-07-15 05:14:05.362 [INFO][4700] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ce54265a1e97cd61b09b661bb894883283b00d23a83af0bf22b9f3ea28982586" Namespace="calico-system" Pod="goldmane-58fd7646b9-8nvjp" WorkloadEndpoint="localhost-k8s-goldmane--58fd7646b9--8nvjp-eth0" Jul 15 05:14:05.461749 containerd[1622]: 2025-07-15 05:14:05.385 [INFO][4722] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ce54265a1e97cd61b09b661bb894883283b00d23a83af0bf22b9f3ea28982586" HandleID="k8s-pod-network.ce54265a1e97cd61b09b661bb894883283b00d23a83af0bf22b9f3ea28982586" Workload="localhost-k8s-goldmane--58fd7646b9--8nvjp-eth0" Jul 15 05:14:05.462333 containerd[1622]: 2025-07-15 05:14:05.385 [INFO][4722] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ce54265a1e97cd61b09b661bb894883283b00d23a83af0bf22b9f3ea28982586" HandleID="k8s-pod-network.ce54265a1e97cd61b09b661bb894883283b00d23a83af0bf22b9f3ea28982586" Workload="localhost-k8s-goldmane--58fd7646b9--8nvjp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d56c0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-58fd7646b9-8nvjp", "timestamp":"2025-07-15 05:14:05.385334442 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:14:05.462333 containerd[1622]: 2025-07-15 05:14:05.385 [INFO][4722] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Jul 15 05:14:05.462333 containerd[1622]: 2025-07-15 05:14:05.385 [INFO][4722] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 15 05:14:05.462333 containerd[1622]: 2025-07-15 05:14:05.385 [INFO][4722] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jul 15 05:14:05.462333 containerd[1622]: 2025-07-15 05:14:05.390 [INFO][4722] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ce54265a1e97cd61b09b661bb894883283b00d23a83af0bf22b9f3ea28982586" host="localhost" Jul 15 05:14:05.462333 containerd[1622]: 2025-07-15 05:14:05.392 [INFO][4722] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Jul 15 05:14:05.462333 containerd[1622]: 2025-07-15 05:14:05.395 [INFO][4722] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Jul 15 05:14:05.462333 containerd[1622]: 2025-07-15 05:14:05.396 [INFO][4722] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jul 15 05:14:05.462333 containerd[1622]: 2025-07-15 05:14:05.399 [INFO][4722] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jul 15 05:14:05.462333 containerd[1622]: 2025-07-15 05:14:05.399 [INFO][4722] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.ce54265a1e97cd61b09b661bb894883283b00d23a83af0bf22b9f3ea28982586" host="localhost" Jul 15 05:14:05.462739 containerd[1622]: 2025-07-15 05:14:05.400 [INFO][4722] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.ce54265a1e97cd61b09b661bb894883283b00d23a83af0bf22b9f3ea28982586 Jul 15 05:14:05.462739 containerd[1622]: 2025-07-15 05:14:05.402 [INFO][4722] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.ce54265a1e97cd61b09b661bb894883283b00d23a83af0bf22b9f3ea28982586" host="localhost" Jul 15 05:14:05.462739 containerd[1622]: 2025-07-15 05:14:05.422 [INFO][4722] 
ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.ce54265a1e97cd61b09b661bb894883283b00d23a83af0bf22b9f3ea28982586" host="localhost" Jul 15 05:14:05.462739 containerd[1622]: 2025-07-15 05:14:05.422 [INFO][4722] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.ce54265a1e97cd61b09b661bb894883283b00d23a83af0bf22b9f3ea28982586" host="localhost" Jul 15 05:14:05.462739 containerd[1622]: 2025-07-15 05:14:05.422 [INFO][4722] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:14:05.462739 containerd[1622]: 2025-07-15 05:14:05.422 [INFO][4722] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="ce54265a1e97cd61b09b661bb894883283b00d23a83af0bf22b9f3ea28982586" HandleID="k8s-pod-network.ce54265a1e97cd61b09b661bb894883283b00d23a83af0bf22b9f3ea28982586" Workload="localhost-k8s-goldmane--58fd7646b9--8nvjp-eth0" Jul 15 05:14:05.462866 containerd[1622]: 2025-07-15 05:14:05.424 [INFO][4700] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ce54265a1e97cd61b09b661bb894883283b00d23a83af0bf22b9f3ea28982586" Namespace="calico-system" Pod="goldmane-58fd7646b9-8nvjp" WorkloadEndpoint="localhost-k8s-goldmane--58fd7646b9--8nvjp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--58fd7646b9--8nvjp-eth0", GenerateName:"goldmane-58fd7646b9-", Namespace:"calico-system", SelfLink:"", UID:"f6387e58-6994-4983-b232-ec13cc1ee6a6", ResourceVersion:"815", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 13, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"58fd7646b9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-58fd7646b9-8nvjp", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali519f63d01d4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:14:05.462866 containerd[1622]: 2025-07-15 05:14:05.424 [INFO][4700] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="ce54265a1e97cd61b09b661bb894883283b00d23a83af0bf22b9f3ea28982586" Namespace="calico-system" Pod="goldmane-58fd7646b9-8nvjp" WorkloadEndpoint="localhost-k8s-goldmane--58fd7646b9--8nvjp-eth0" Jul 15 05:14:05.463288 containerd[1622]: 2025-07-15 05:14:05.424 [INFO][4700] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali519f63d01d4 ContainerID="ce54265a1e97cd61b09b661bb894883283b00d23a83af0bf22b9f3ea28982586" Namespace="calico-system" Pod="goldmane-58fd7646b9-8nvjp" WorkloadEndpoint="localhost-k8s-goldmane--58fd7646b9--8nvjp-eth0" Jul 15 05:14:05.463288 containerd[1622]: 2025-07-15 05:14:05.436 [INFO][4700] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ce54265a1e97cd61b09b661bb894883283b00d23a83af0bf22b9f3ea28982586" Namespace="calico-system" Pod="goldmane-58fd7646b9-8nvjp" WorkloadEndpoint="localhost-k8s-goldmane--58fd7646b9--8nvjp-eth0" Jul 15 05:14:05.463335 containerd[1622]: 2025-07-15 05:14:05.437 [INFO][4700] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ce54265a1e97cd61b09b661bb894883283b00d23a83af0bf22b9f3ea28982586" 
Namespace="calico-system" Pod="goldmane-58fd7646b9-8nvjp" WorkloadEndpoint="localhost-k8s-goldmane--58fd7646b9--8nvjp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--58fd7646b9--8nvjp-eth0", GenerateName:"goldmane-58fd7646b9-", Namespace:"calico-system", SelfLink:"", UID:"f6387e58-6994-4983-b232-ec13cc1ee6a6", ResourceVersion:"815", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 13, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"58fd7646b9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"ce54265a1e97cd61b09b661bb894883283b00d23a83af0bf22b9f3ea28982586", Pod:"goldmane-58fd7646b9-8nvjp", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali519f63d01d4", MAC:"be:30:17:77:7f:23", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:14:05.463392 containerd[1622]: 2025-07-15 05:14:05.459 [INFO][4700] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ce54265a1e97cd61b09b661bb894883283b00d23a83af0bf22b9f3ea28982586" Namespace="calico-system" Pod="goldmane-58fd7646b9-8nvjp" WorkloadEndpoint="localhost-k8s-goldmane--58fd7646b9--8nvjp-eth0" Jul 15 05:14:05.553945 systemd-networkd[1532]: 
calibb15da1f978: Link UP Jul 15 05:14:05.554704 systemd-networkd[1532]: calibb15da1f978: Gained carrier Jul 15 05:14:05.571961 containerd[1622]: 2025-07-15 05:14:05.358 [INFO][4697] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7c65d6cfc9--lf4wk-eth0 coredns-7c65d6cfc9- kube-system 1f9d996a-a917-4f78-a806-a2b8173db749 813 0 2025-07-15 05:13:29 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7c65d6cfc9-lf4wk eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calibb15da1f978 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="a8c486551fe15d53c36814dd0477d22a83907c0289eef91cea09547bf776c5d1" Namespace="kube-system" Pod="coredns-7c65d6cfc9-lf4wk" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--lf4wk-" Jul 15 05:14:05.571961 containerd[1622]: 2025-07-15 05:14:05.358 [INFO][4697] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a8c486551fe15d53c36814dd0477d22a83907c0289eef91cea09547bf776c5d1" Namespace="kube-system" Pod="coredns-7c65d6cfc9-lf4wk" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--lf4wk-eth0" Jul 15 05:14:05.571961 containerd[1622]: 2025-07-15 05:14:05.399 [INFO][4720] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a8c486551fe15d53c36814dd0477d22a83907c0289eef91cea09547bf776c5d1" HandleID="k8s-pod-network.a8c486551fe15d53c36814dd0477d22a83907c0289eef91cea09547bf776c5d1" Workload="localhost-k8s-coredns--7c65d6cfc9--lf4wk-eth0" Jul 15 05:14:05.572602 containerd[1622]: 2025-07-15 05:14:05.399 [INFO][4720] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a8c486551fe15d53c36814dd0477d22a83907c0289eef91cea09547bf776c5d1" 
HandleID="k8s-pod-network.a8c486551fe15d53c36814dd0477d22a83907c0289eef91cea09547bf776c5d1" Workload="localhost-k8s-coredns--7c65d6cfc9--lf4wk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d1860), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7c65d6cfc9-lf4wk", "timestamp":"2025-07-15 05:14:05.39945717 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:14:05.572602 containerd[1622]: 2025-07-15 05:14:05.399 [INFO][4720] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:14:05.572602 containerd[1622]: 2025-07-15 05:14:05.422 [INFO][4720] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 15 05:14:05.572602 containerd[1622]: 2025-07-15 05:14:05.422 [INFO][4720] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jul 15 05:14:05.572602 containerd[1622]: 2025-07-15 05:14:05.491 [INFO][4720] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a8c486551fe15d53c36814dd0477d22a83907c0289eef91cea09547bf776c5d1" host="localhost" Jul 15 05:14:05.572602 containerd[1622]: 2025-07-15 05:14:05.496 [INFO][4720] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Jul 15 05:14:05.572602 containerd[1622]: 2025-07-15 05:14:05.524 [INFO][4720] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Jul 15 05:14:05.572602 containerd[1622]: 2025-07-15 05:14:05.525 [INFO][4720] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jul 15 05:14:05.572602 containerd[1622]: 2025-07-15 05:14:05.527 [INFO][4720] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jul 15 05:14:05.572602 containerd[1622]: 2025-07-15 05:14:05.527 
[INFO][4720] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.a8c486551fe15d53c36814dd0477d22a83907c0289eef91cea09547bf776c5d1" host="localhost" Jul 15 05:14:05.573182 containerd[1622]: 2025-07-15 05:14:05.528 [INFO][4720] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a8c486551fe15d53c36814dd0477d22a83907c0289eef91cea09547bf776c5d1 Jul 15 05:14:05.573182 containerd[1622]: 2025-07-15 05:14:05.533 [INFO][4720] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.a8c486551fe15d53c36814dd0477d22a83907c0289eef91cea09547bf776c5d1" host="localhost" Jul 15 05:14:05.573182 containerd[1622]: 2025-07-15 05:14:05.547 [INFO][4720] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.a8c486551fe15d53c36814dd0477d22a83907c0289eef91cea09547bf776c5d1" host="localhost" Jul 15 05:14:05.573182 containerd[1622]: 2025-07-15 05:14:05.547 [INFO][4720] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.a8c486551fe15d53c36814dd0477d22a83907c0289eef91cea09547bf776c5d1" host="localhost" Jul 15 05:14:05.573182 containerd[1622]: 2025-07-15 05:14:05.547 [INFO][4720] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 15 05:14:05.573182 containerd[1622]: 2025-07-15 05:14:05.547 [INFO][4720] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="a8c486551fe15d53c36814dd0477d22a83907c0289eef91cea09547bf776c5d1" HandleID="k8s-pod-network.a8c486551fe15d53c36814dd0477d22a83907c0289eef91cea09547bf776c5d1" Workload="localhost-k8s-coredns--7c65d6cfc9--lf4wk-eth0" Jul 15 05:14:05.573294 containerd[1622]: 2025-07-15 05:14:05.550 [INFO][4697] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a8c486551fe15d53c36814dd0477d22a83907c0289eef91cea09547bf776c5d1" Namespace="kube-system" Pod="coredns-7c65d6cfc9-lf4wk" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--lf4wk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--lf4wk-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"1f9d996a-a917-4f78-a806-a2b8173db749", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 13, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7c65d6cfc9-lf4wk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calibb15da1f978", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:14:05.573797 containerd[1622]: 2025-07-15 05:14:05.550 [INFO][4697] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="a8c486551fe15d53c36814dd0477d22a83907c0289eef91cea09547bf776c5d1" Namespace="kube-system" Pod="coredns-7c65d6cfc9-lf4wk" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--lf4wk-eth0" Jul 15 05:14:05.573797 containerd[1622]: 2025-07-15 05:14:05.550 [INFO][4697] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibb15da1f978 ContainerID="a8c486551fe15d53c36814dd0477d22a83907c0289eef91cea09547bf776c5d1" Namespace="kube-system" Pod="coredns-7c65d6cfc9-lf4wk" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--lf4wk-eth0" Jul 15 05:14:05.573797 containerd[1622]: 2025-07-15 05:14:05.554 [INFO][4697] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a8c486551fe15d53c36814dd0477d22a83907c0289eef91cea09547bf776c5d1" Namespace="kube-system" Pod="coredns-7c65d6cfc9-lf4wk" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--lf4wk-eth0" Jul 15 05:14:05.573861 containerd[1622]: 2025-07-15 05:14:05.555 [INFO][4697] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a8c486551fe15d53c36814dd0477d22a83907c0289eef91cea09547bf776c5d1" Namespace="kube-system" Pod="coredns-7c65d6cfc9-lf4wk" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--lf4wk-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--lf4wk-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"1f9d996a-a917-4f78-a806-a2b8173db749", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 13, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"a8c486551fe15d53c36814dd0477d22a83907c0289eef91cea09547bf776c5d1", Pod:"coredns-7c65d6cfc9-lf4wk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calibb15da1f978", MAC:"2a:c9:f5:5b:ae:27", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:14:05.573861 containerd[1622]: 2025-07-15 05:14:05.568 [INFO][4697] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="a8c486551fe15d53c36814dd0477d22a83907c0289eef91cea09547bf776c5d1" Namespace="kube-system" Pod="coredns-7c65d6cfc9-lf4wk" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--lf4wk-eth0" Jul 15 05:14:05.580158 containerd[1622]: time="2025-07-15T05:14:05.580089072Z" level=info msg="connecting to shim ce54265a1e97cd61b09b661bb894883283b00d23a83af0bf22b9f3ea28982586" address="unix:///run/containerd/s/ac746f37401f8f5b51ca7f122b4e215182aeba796c8dd243f799c68cb9f6934f" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:14:05.607724 containerd[1622]: time="2025-07-15T05:14:05.607659246Z" level=info msg="connecting to shim a8c486551fe15d53c36814dd0477d22a83907c0289eef91cea09547bf776c5d1" address="unix:///run/containerd/s/f7ae3f93d16ccaf32ec51dc1a03005e861377c87247c3b8bf45bea3393715d78" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:14:05.615591 systemd[1]: Started cri-containerd-ce54265a1e97cd61b09b661bb894883283b00d23a83af0bf22b9f3ea28982586.scope - libcontainer container ce54265a1e97cd61b09b661bb894883283b00d23a83af0bf22b9f3ea28982586. Jul 15 05:14:05.631118 systemd[1]: Started cri-containerd-a8c486551fe15d53c36814dd0477d22a83907c0289eef91cea09547bf776c5d1.scope - libcontainer container a8c486551fe15d53c36814dd0477d22a83907c0289eef91cea09547bf776c5d1. 
Jul 15 05:14:05.636784 systemd-resolved[1533]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jul 15 05:14:05.640168 systemd-resolved[1533]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jul 15 05:14:05.689906 containerd[1622]: time="2025-07-15T05:14:05.689873416Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-lf4wk,Uid:1f9d996a-a917-4f78-a806-a2b8173db749,Namespace:kube-system,Attempt:0,} returns sandbox id \"a8c486551fe15d53c36814dd0477d22a83907c0289eef91cea09547bf776c5d1\"" Jul 15 05:14:05.691169 containerd[1622]: time="2025-07-15T05:14:05.689971422Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-8nvjp,Uid:f6387e58-6994-4983-b232-ec13cc1ee6a6,Namespace:calico-system,Attempt:0,} returns sandbox id \"ce54265a1e97cd61b09b661bb894883283b00d23a83af0bf22b9f3ea28982586\"" Jul 15 05:14:05.692248 containerd[1622]: time="2025-07-15T05:14:05.692152339Z" level=info msg="CreateContainer within sandbox \"a8c486551fe15d53c36814dd0477d22a83907c0289eef91cea09547bf776c5d1\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jul 15 05:14:05.727926 containerd[1622]: time="2025-07-15T05:14:05.727869069Z" level=info msg="Container 24cc2d1e60a88c943ee36a078cb7ce7a43afd978ab0e697889677cca226c7b90: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:14:05.730458 containerd[1622]: time="2025-07-15T05:14:05.730438293Z" level=info msg="CreateContainer within sandbox \"a8c486551fe15d53c36814dd0477d22a83907c0289eef91cea09547bf776c5d1\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"24cc2d1e60a88c943ee36a078cb7ce7a43afd978ab0e697889677cca226c7b90\"" Jul 15 05:14:05.730893 containerd[1622]: time="2025-07-15T05:14:05.730869336Z" level=info msg="StartContainer for \"24cc2d1e60a88c943ee36a078cb7ce7a43afd978ab0e697889677cca226c7b90\"" Jul 15 05:14:05.731525 containerd[1622]: 
time="2025-07-15T05:14:05.731507581Z" level=info msg="connecting to shim 24cc2d1e60a88c943ee36a078cb7ce7a43afd978ab0e697889677cca226c7b90" address="unix:///run/containerd/s/f7ae3f93d16ccaf32ec51dc1a03005e861377c87247c3b8bf45bea3393715d78" protocol=ttrpc version=3 Jul 15 05:14:05.749068 systemd[1]: Started cri-containerd-24cc2d1e60a88c943ee36a078cb7ce7a43afd978ab0e697889677cca226c7b90.scope - libcontainer container 24cc2d1e60a88c943ee36a078cb7ce7a43afd978ab0e697889677cca226c7b90. Jul 15 05:14:05.779619 containerd[1622]: time="2025-07-15T05:14:05.779562977Z" level=info msg="StartContainer for \"24cc2d1e60a88c943ee36a078cb7ce7a43afd978ab0e697889677cca226c7b90\" returns successfully" Jul 15 05:14:05.816045 systemd-networkd[1532]: calidab1d4645df: Gained IPv6LL Jul 15 05:14:06.072054 systemd-networkd[1532]: calia2eda2a87cf: Gained IPv6LL Jul 15 05:14:06.252923 containerd[1622]: time="2025-07-15T05:14:06.252864749Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:06.254101 containerd[1622]: time="2025-07-15T05:14:06.254075456Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.2: active requests=0, bytes read=8759190" Jul 15 05:14:06.255777 containerd[1622]: time="2025-07-15T05:14:06.255752972Z" level=info msg="ImageCreate event name:\"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:06.256724 containerd[1622]: time="2025-07-15T05:14:06.256705184Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:06.257195 containerd[1622]: time="2025-07-15T05:14:06.257174798Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.2\" with image id 
\"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\", size \"10251893\" in 1.412990667s" Jul 15 05:14:06.257195 containerd[1622]: time="2025-07-15T05:14:06.257192548Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\" returns image reference \"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\"" Jul 15 05:14:06.259029 containerd[1622]: time="2025-07-15T05:14:06.258958494Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Jul 15 05:14:06.260985 containerd[1622]: time="2025-07-15T05:14:06.260927638Z" level=info msg="CreateContainer within sandbox \"caa6352d83201d3fdceab970cf01694ca47ba893ecadb911f0a18e0b0c42c533\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Jul 15 05:14:06.294818 containerd[1622]: time="2025-07-15T05:14:06.294746834Z" level=info msg="Container fdb8859f854ca4f022617274dc0ee88d3ccbdc3543c19e088a36a248e734bb97: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:14:06.318471 containerd[1622]: time="2025-07-15T05:14:06.318427854Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-2kxrm,Uid:eb8c1a88-dd9b-4100-b3af-e3f0a218078d,Namespace:kube-system,Attempt:0,}" Jul 15 05:14:06.324428 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1197971833.mount: Deactivated successfully. 
Jul 15 05:14:06.328022 systemd-networkd[1532]: cali5e81a1d106b: Gained IPv6LL Jul 15 05:14:06.360191 containerd[1622]: time="2025-07-15T05:14:06.360159750Z" level=info msg="CreateContainer within sandbox \"caa6352d83201d3fdceab970cf01694ca47ba893ecadb911f0a18e0b0c42c533\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"fdb8859f854ca4f022617274dc0ee88d3ccbdc3543c19e088a36a248e734bb97\"" Jul 15 05:14:06.361399 containerd[1622]: time="2025-07-15T05:14:06.361301230Z" level=info msg="StartContainer for \"fdb8859f854ca4f022617274dc0ee88d3ccbdc3543c19e088a36a248e734bb97\"" Jul 15 05:14:06.366973 containerd[1622]: time="2025-07-15T05:14:06.366949739Z" level=info msg="connecting to shim fdb8859f854ca4f022617274dc0ee88d3ccbdc3543c19e088a36a248e734bb97" address="unix:///run/containerd/s/74141b00043189a4a640f59ef7cb1e31e2d39ce9569660251a9b42e3bbf237a9" protocol=ttrpc version=3 Jul 15 05:14:06.394023 systemd[1]: Started cri-containerd-fdb8859f854ca4f022617274dc0ee88d3ccbdc3543c19e088a36a248e734bb97.scope - libcontainer container fdb8859f854ca4f022617274dc0ee88d3ccbdc3543c19e088a36a248e734bb97. 
Jul 15 05:14:06.487233 containerd[1622]: time="2025-07-15T05:14:06.487171903Z" level=info msg="StartContainer for \"fdb8859f854ca4f022617274dc0ee88d3ccbdc3543c19e088a36a248e734bb97\" returns successfully" Jul 15 05:14:06.520066 systemd-networkd[1532]: cali519f63d01d4: Gained IPv6LL Jul 15 05:14:06.532196 systemd-networkd[1532]: cali8378aa3c29e: Link UP Jul 15 05:14:06.532678 systemd-networkd[1532]: cali8378aa3c29e: Gained carrier Jul 15 05:14:06.559595 containerd[1622]: 2025-07-15 05:14:06.434 [INFO][4878] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7c65d6cfc9--2kxrm-eth0 coredns-7c65d6cfc9- kube-system eb8c1a88-dd9b-4100-b3af-e3f0a218078d 812 0 2025-07-15 05:13:29 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7c65d6cfc9-2kxrm eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali8378aa3c29e [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="20621bb21cb121b3e22af0dda460b5ce8dd58498f642ba5036d466d0857689ec" Namespace="kube-system" Pod="coredns-7c65d6cfc9-2kxrm" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--2kxrm-" Jul 15 05:14:06.559595 containerd[1622]: 2025-07-15 05:14:06.434 [INFO][4878] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="20621bb21cb121b3e22af0dda460b5ce8dd58498f642ba5036d466d0857689ec" Namespace="kube-system" Pod="coredns-7c65d6cfc9-2kxrm" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--2kxrm-eth0" Jul 15 05:14:06.559595 containerd[1622]: 2025-07-15 05:14:06.464 [INFO][4910] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="20621bb21cb121b3e22af0dda460b5ce8dd58498f642ba5036d466d0857689ec" HandleID="k8s-pod-network.20621bb21cb121b3e22af0dda460b5ce8dd58498f642ba5036d466d0857689ec" 
Workload="localhost-k8s-coredns--7c65d6cfc9--2kxrm-eth0" Jul 15 05:14:06.559595 containerd[1622]: 2025-07-15 05:14:06.464 [INFO][4910] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="20621bb21cb121b3e22af0dda460b5ce8dd58498f642ba5036d466d0857689ec" HandleID="k8s-pod-network.20621bb21cb121b3e22af0dda460b5ce8dd58498f642ba5036d466d0857689ec" Workload="localhost-k8s-coredns--7c65d6cfc9--2kxrm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f250), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7c65d6cfc9-2kxrm", "timestamp":"2025-07-15 05:14:06.464395532 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:14:06.559595 containerd[1622]: 2025-07-15 05:14:06.464 [INFO][4910] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:14:06.559595 containerd[1622]: 2025-07-15 05:14:06.464 [INFO][4910] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 05:14:06.559595 containerd[1622]: 2025-07-15 05:14:06.464 [INFO][4910] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jul 15 05:14:06.559595 containerd[1622]: 2025-07-15 05:14:06.469 [INFO][4910] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.20621bb21cb121b3e22af0dda460b5ce8dd58498f642ba5036d466d0857689ec" host="localhost" Jul 15 05:14:06.559595 containerd[1622]: 2025-07-15 05:14:06.472 [INFO][4910] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Jul 15 05:14:06.559595 containerd[1622]: 2025-07-15 05:14:06.481 [INFO][4910] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Jul 15 05:14:06.559595 containerd[1622]: 2025-07-15 05:14:06.488 [INFO][4910] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jul 15 05:14:06.559595 containerd[1622]: 2025-07-15 05:14:06.489 [INFO][4910] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jul 15 05:14:06.559595 containerd[1622]: 2025-07-15 05:14:06.489 [INFO][4910] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.20621bb21cb121b3e22af0dda460b5ce8dd58498f642ba5036d466d0857689ec" host="localhost" Jul 15 05:14:06.559595 containerd[1622]: 2025-07-15 05:14:06.490 [INFO][4910] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.20621bb21cb121b3e22af0dda460b5ce8dd58498f642ba5036d466d0857689ec Jul 15 05:14:06.559595 containerd[1622]: 2025-07-15 05:14:06.509 [INFO][4910] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.20621bb21cb121b3e22af0dda460b5ce8dd58498f642ba5036d466d0857689ec" host="localhost" Jul 15 05:14:06.559595 containerd[1622]: 2025-07-15 05:14:06.528 [INFO][4910] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 
handle="k8s-pod-network.20621bb21cb121b3e22af0dda460b5ce8dd58498f642ba5036d466d0857689ec" host="localhost" Jul 15 05:14:06.559595 containerd[1622]: 2025-07-15 05:14:06.528 [INFO][4910] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.20621bb21cb121b3e22af0dda460b5ce8dd58498f642ba5036d466d0857689ec" host="localhost" Jul 15 05:14:06.559595 containerd[1622]: 2025-07-15 05:14:06.528 [INFO][4910] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:14:06.559595 containerd[1622]: 2025-07-15 05:14:06.528 [INFO][4910] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="20621bb21cb121b3e22af0dda460b5ce8dd58498f642ba5036d466d0857689ec" HandleID="k8s-pod-network.20621bb21cb121b3e22af0dda460b5ce8dd58498f642ba5036d466d0857689ec" Workload="localhost-k8s-coredns--7c65d6cfc9--2kxrm-eth0" Jul 15 05:14:06.560932 containerd[1622]: 2025-07-15 05:14:06.530 [INFO][4878] cni-plugin/k8s.go 418: Populated endpoint ContainerID="20621bb21cb121b3e22af0dda460b5ce8dd58498f642ba5036d466d0857689ec" Namespace="kube-system" Pod="coredns-7c65d6cfc9-2kxrm" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--2kxrm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--2kxrm-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"eb8c1a88-dd9b-4100-b3af-e3f0a218078d", ResourceVersion:"812", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 13, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7c65d6cfc9-2kxrm", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8378aa3c29e", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:14:06.560932 containerd[1622]: 2025-07-15 05:14:06.530 [INFO][4878] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="20621bb21cb121b3e22af0dda460b5ce8dd58498f642ba5036d466d0857689ec" Namespace="kube-system" Pod="coredns-7c65d6cfc9-2kxrm" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--2kxrm-eth0" Jul 15 05:14:06.560932 containerd[1622]: 2025-07-15 05:14:06.530 [INFO][4878] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8378aa3c29e ContainerID="20621bb21cb121b3e22af0dda460b5ce8dd58498f642ba5036d466d0857689ec" Namespace="kube-system" Pod="coredns-7c65d6cfc9-2kxrm" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--2kxrm-eth0" Jul 15 05:14:06.560932 containerd[1622]: 2025-07-15 05:14:06.533 [INFO][4878] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="20621bb21cb121b3e22af0dda460b5ce8dd58498f642ba5036d466d0857689ec" Namespace="kube-system" Pod="coredns-7c65d6cfc9-2kxrm" 
WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--2kxrm-eth0" Jul 15 05:14:06.560932 containerd[1622]: 2025-07-15 05:14:06.533 [INFO][4878] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="20621bb21cb121b3e22af0dda460b5ce8dd58498f642ba5036d466d0857689ec" Namespace="kube-system" Pod="coredns-7c65d6cfc9-2kxrm" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--2kxrm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--2kxrm-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"eb8c1a88-dd9b-4100-b3af-e3f0a218078d", ResourceVersion:"812", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 13, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"20621bb21cb121b3e22af0dda460b5ce8dd58498f642ba5036d466d0857689ec", Pod:"coredns-7c65d6cfc9-2kxrm", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8378aa3c29e", MAC:"82:04:2b:26:88:7a", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:14:06.560932 containerd[1622]: 2025-07-15 05:14:06.556 [INFO][4878] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="20621bb21cb121b3e22af0dda460b5ce8dd58498f642ba5036d466d0857689ec" Namespace="kube-system" Pod="coredns-7c65d6cfc9-2kxrm" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--2kxrm-eth0" Jul 15 05:14:06.673063 kubelet[2951]: I0715 05:14:06.672992 2951 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-lf4wk" podStartSLOduration=37.672946755 podStartE2EDuration="37.672946755s" podCreationTimestamp="2025-07-15 05:13:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 05:14:06.672676091 +0000 UTC m=+44.451986956" watchObservedRunningTime="2025-07-15 05:14:06.672946755 +0000 UTC m=+44.452257617" Jul 15 05:14:07.032156 systemd-networkd[1532]: calibb15da1f978: Gained IPv6LL Jul 15 05:14:07.471191 containerd[1622]: time="2025-07-15T05:14:07.318620666Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-849d984976-ml4gq,Uid:20e10742-00f4-4932-9581-20e919e16af1,Namespace:calico-apiserver,Attempt:0,}" Jul 15 05:14:07.731073 containerd[1622]: time="2025-07-15T05:14:07.730810818Z" level=info msg="connecting to shim 20621bb21cb121b3e22af0dda460b5ce8dd58498f642ba5036d466d0857689ec" address="unix:///run/containerd/s/c71155bd8b411603f29b170d3511388bfd7923389a25524742483a78bc73eb95" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:14:07.772061 systemd[1]: Started cri-containerd-20621bb21cb121b3e22af0dda460b5ce8dd58498f642ba5036d466d0857689ec.scope - libcontainer container 
20621bb21cb121b3e22af0dda460b5ce8dd58498f642ba5036d466d0857689ec. Jul 15 05:14:07.784139 systemd-resolved[1533]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jul 15 05:14:07.819736 containerd[1622]: time="2025-07-15T05:14:07.819705648Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-2kxrm,Uid:eb8c1a88-dd9b-4100-b3af-e3f0a218078d,Namespace:kube-system,Attempt:0,} returns sandbox id \"20621bb21cb121b3e22af0dda460b5ce8dd58498f642ba5036d466d0857689ec\"" Jul 15 05:14:07.835136 containerd[1622]: time="2025-07-15T05:14:07.835107339Z" level=info msg="CreateContainer within sandbox \"20621bb21cb121b3e22af0dda460b5ce8dd58498f642ba5036d466d0857689ec\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jul 15 05:14:07.859574 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount474781555.mount: Deactivated successfully. Jul 15 05:14:07.860398 containerd[1622]: time="2025-07-15T05:14:07.859756411Z" level=info msg="Container 4e98af0370ff2b55fa35371e37893cf980aa892f1d7a2da993929194f8ab95bc: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:14:07.866443 containerd[1622]: time="2025-07-15T05:14:07.866414274Z" level=info msg="CreateContainer within sandbox \"20621bb21cb121b3e22af0dda460b5ce8dd58498f642ba5036d466d0857689ec\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"4e98af0370ff2b55fa35371e37893cf980aa892f1d7a2da993929194f8ab95bc\"" Jul 15 05:14:07.867353 containerd[1622]: time="2025-07-15T05:14:07.866893337Z" level=info msg="StartContainer for \"4e98af0370ff2b55fa35371e37893cf980aa892f1d7a2da993929194f8ab95bc\"" Jul 15 05:14:07.868640 containerd[1622]: time="2025-07-15T05:14:07.868533609Z" level=info msg="connecting to shim 4e98af0370ff2b55fa35371e37893cf980aa892f1d7a2da993929194f8ab95bc" address="unix:///run/containerd/s/c71155bd8b411603f29b170d3511388bfd7923389a25524742483a78bc73eb95" protocol=ttrpc version=3 Jul 15 05:14:07.890071 systemd[1]: Started 
cri-containerd-4e98af0370ff2b55fa35371e37893cf980aa892f1d7a2da993929194f8ab95bc.scope - libcontainer container 4e98af0370ff2b55fa35371e37893cf980aa892f1d7a2da993929194f8ab95bc. Jul 15 05:14:07.929280 systemd-networkd[1532]: cali8378aa3c29e: Gained IPv6LL Jul 15 05:14:07.951035 systemd-networkd[1532]: calia8cef0ffee4: Link UP Jul 15 05:14:07.951138 systemd-networkd[1532]: calia8cef0ffee4: Gained carrier Jul 15 05:14:07.973534 containerd[1622]: time="2025-07-15T05:14:07.953789652Z" level=info msg="StartContainer for \"4e98af0370ff2b55fa35371e37893cf980aa892f1d7a2da993929194f8ab95bc\" returns successfully" Jul 15 05:14:08.022456 containerd[1622]: 2025-07-15 05:14:07.795 [INFO][4938] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--849d984976--ml4gq-eth0 calico-apiserver-849d984976- calico-apiserver 20e10742-00f4-4932-9581-20e919e16af1 817 0 2025-07-15 05:13:40 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:849d984976 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-849d984976-ml4gq eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calia8cef0ffee4 [] [] }} ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" Namespace="calico-apiserver" Pod="calico-apiserver-849d984976-ml4gq" WorkloadEndpoint="localhost-k8s-calico--apiserver--849d984976--ml4gq-" Jul 15 05:14:08.022456 containerd[1622]: 2025-07-15 05:14:07.795 [INFO][4938] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" Namespace="calico-apiserver" Pod="calico-apiserver-849d984976-ml4gq" WorkloadEndpoint="localhost-k8s-calico--apiserver--849d984976--ml4gq-eth0" Jul 15 05:14:08.022456 
containerd[1622]: 2025-07-15 05:14:07.828 [INFO][4989] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" HandleID="k8s-pod-network.962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" Workload="localhost-k8s-calico--apiserver--849d984976--ml4gq-eth0" Jul 15 05:14:08.022456 containerd[1622]: 2025-07-15 05:14:07.828 [INFO][4989] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" HandleID="k8s-pod-network.962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" Workload="localhost-k8s-calico--apiserver--849d984976--ml4gq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d4ff0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-849d984976-ml4gq", "timestamp":"2025-07-15 05:14:07.828068121 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:14:08.022456 containerd[1622]: 2025-07-15 05:14:07.828 [INFO][4989] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:14:08.022456 containerd[1622]: 2025-07-15 05:14:07.828 [INFO][4989] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 05:14:08.022456 containerd[1622]: 2025-07-15 05:14:07.828 [INFO][4989] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jul 15 05:14:08.022456 containerd[1622]: 2025-07-15 05:14:07.853 [INFO][4989] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" host="localhost" Jul 15 05:14:08.022456 containerd[1622]: 2025-07-15 05:14:07.865 [INFO][4989] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Jul 15 05:14:08.022456 containerd[1622]: 2025-07-15 05:14:07.885 [INFO][4989] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Jul 15 05:14:08.022456 containerd[1622]: 2025-07-15 05:14:07.892 [INFO][4989] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jul 15 05:14:08.022456 containerd[1622]: 2025-07-15 05:14:07.895 [INFO][4989] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jul 15 05:14:08.022456 containerd[1622]: 2025-07-15 05:14:07.895 [INFO][4989] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" host="localhost" Jul 15 05:14:08.022456 containerd[1622]: 2025-07-15 05:14:07.897 [INFO][4989] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912 Jul 15 05:14:08.022456 containerd[1622]: 2025-07-15 05:14:07.925 [INFO][4989] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" host="localhost" Jul 15 05:14:08.022456 containerd[1622]: 2025-07-15 05:14:07.947 [INFO][4989] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 
handle="k8s-pod-network.962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" host="localhost" Jul 15 05:14:08.022456 containerd[1622]: 2025-07-15 05:14:07.947 [INFO][4989] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" host="localhost" Jul 15 05:14:08.022456 containerd[1622]: 2025-07-15 05:14:07.947 [INFO][4989] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:14:08.022456 containerd[1622]: 2025-07-15 05:14:07.947 [INFO][4989] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" HandleID="k8s-pod-network.962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" Workload="localhost-k8s-calico--apiserver--849d984976--ml4gq-eth0" Jul 15 05:14:08.025123 containerd[1622]: 2025-07-15 05:14:07.948 [INFO][4938] cni-plugin/k8s.go 418: Populated endpoint ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" Namespace="calico-apiserver" Pod="calico-apiserver-849d984976-ml4gq" WorkloadEndpoint="localhost-k8s-calico--apiserver--849d984976--ml4gq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--849d984976--ml4gq-eth0", GenerateName:"calico-apiserver-849d984976-", Namespace:"calico-apiserver", SelfLink:"", UID:"20e10742-00f4-4932-9581-20e919e16af1", ResourceVersion:"817", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 13, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"849d984976", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-849d984976-ml4gq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia8cef0ffee4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:14:08.025123 containerd[1622]: 2025-07-15 05:14:07.948 [INFO][4938] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" Namespace="calico-apiserver" Pod="calico-apiserver-849d984976-ml4gq" WorkloadEndpoint="localhost-k8s-calico--apiserver--849d984976--ml4gq-eth0" Jul 15 05:14:08.025123 containerd[1622]: 2025-07-15 05:14:07.948 [INFO][4938] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia8cef0ffee4 ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" Namespace="calico-apiserver" Pod="calico-apiserver-849d984976-ml4gq" WorkloadEndpoint="localhost-k8s-calico--apiserver--849d984976--ml4gq-eth0" Jul 15 05:14:08.025123 containerd[1622]: 2025-07-15 05:14:07.950 [INFO][4938] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" Namespace="calico-apiserver" Pod="calico-apiserver-849d984976-ml4gq" WorkloadEndpoint="localhost-k8s-calico--apiserver--849d984976--ml4gq-eth0" Jul 15 05:14:08.025123 containerd[1622]: 2025-07-15 05:14:07.950 [INFO][4938] cni-plugin/k8s.go 446: Added Mac, interface name, and active 
container ID to endpoint ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" Namespace="calico-apiserver" Pod="calico-apiserver-849d984976-ml4gq" WorkloadEndpoint="localhost-k8s-calico--apiserver--849d984976--ml4gq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--849d984976--ml4gq-eth0", GenerateName:"calico-apiserver-849d984976-", Namespace:"calico-apiserver", SelfLink:"", UID:"20e10742-00f4-4932-9581-20e919e16af1", ResourceVersion:"817", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 13, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"849d984976", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912", Pod:"calico-apiserver-849d984976-ml4gq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia8cef0ffee4", MAC:"ba:42:93:3f:61:62", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:14:08.025123 containerd[1622]: 2025-07-15 05:14:08.020 [INFO][4938] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" Namespace="calico-apiserver" Pod="calico-apiserver-849d984976-ml4gq" WorkloadEndpoint="localhost-k8s-calico--apiserver--849d984976--ml4gq-eth0" Jul 15 05:14:08.090563 containerd[1622]: time="2025-07-15T05:14:08.090135730Z" level=info msg="connecting to shim 962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" address="unix:///run/containerd/s/aaa30c4364e9ecc10d5163fd939f9325e4a1cced038ae4fda6d98dc407fb37ed" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:14:08.117009 systemd[1]: Started cri-containerd-962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912.scope - libcontainer container 962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912. Jul 15 05:14:08.124999 systemd-resolved[1533]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jul 15 05:14:08.152237 containerd[1622]: time="2025-07-15T05:14:08.152186987Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-849d984976-ml4gq,Uid:20e10742-00f4-4932-9581-20e919e16af1,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912\"" Jul 15 05:14:08.318737 containerd[1622]: time="2025-07-15T05:14:08.318577213Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-64cfc7b9f4-zc956,Uid:0e8612d6-0455-4a88-b312-2f91253d889c,Namespace:calico-system,Attempt:0,}" Jul 15 05:14:08.431823 systemd-networkd[1532]: cali71ab6322162: Link UP Jul 15 05:14:08.432557 systemd-networkd[1532]: cali71ab6322162: Gained carrier Jul 15 05:14:08.458124 containerd[1622]: 2025-07-15 05:14:08.374 [INFO][5088] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--64cfc7b9f4--zc956-eth0 calico-kube-controllers-64cfc7b9f4- calico-system 0e8612d6-0455-4a88-b312-2f91253d889c 808 0 
2025-07-15 05:13:42 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:64cfc7b9f4 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-64cfc7b9f4-zc956 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali71ab6322162 [] [] }} ContainerID="8694c8f4e5a63a2361d3c5c20062f6b20b1da1c672d1ce5f40994899c27d541d" Namespace="calico-system" Pod="calico-kube-controllers-64cfc7b9f4-zc956" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--64cfc7b9f4--zc956-" Jul 15 05:14:08.458124 containerd[1622]: 2025-07-15 05:14:08.374 [INFO][5088] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8694c8f4e5a63a2361d3c5c20062f6b20b1da1c672d1ce5f40994899c27d541d" Namespace="calico-system" Pod="calico-kube-controllers-64cfc7b9f4-zc956" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--64cfc7b9f4--zc956-eth0" Jul 15 05:14:08.458124 containerd[1622]: 2025-07-15 05:14:08.394 [INFO][5101] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8694c8f4e5a63a2361d3c5c20062f6b20b1da1c672d1ce5f40994899c27d541d" HandleID="k8s-pod-network.8694c8f4e5a63a2361d3c5c20062f6b20b1da1c672d1ce5f40994899c27d541d" Workload="localhost-k8s-calico--kube--controllers--64cfc7b9f4--zc956-eth0" Jul 15 05:14:08.458124 containerd[1622]: 2025-07-15 05:14:08.394 [INFO][5101] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8694c8f4e5a63a2361d3c5c20062f6b20b1da1c672d1ce5f40994899c27d541d" HandleID="k8s-pod-network.8694c8f4e5a63a2361d3c5c20062f6b20b1da1c672d1ce5f40994899c27d541d" Workload="localhost-k8s-calico--kube--controllers--64cfc7b9f4--zc956-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d4ff0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", 
"pod":"calico-kube-controllers-64cfc7b9f4-zc956", "timestamp":"2025-07-15 05:14:08.394032844 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:14:08.458124 containerd[1622]: 2025-07-15 05:14:08.394 [INFO][5101] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:14:08.458124 containerd[1622]: 2025-07-15 05:14:08.394 [INFO][5101] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 15 05:14:08.458124 containerd[1622]: 2025-07-15 05:14:08.394 [INFO][5101] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jul 15 05:14:08.458124 containerd[1622]: 2025-07-15 05:14:08.400 [INFO][5101] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8694c8f4e5a63a2361d3c5c20062f6b20b1da1c672d1ce5f40994899c27d541d" host="localhost" Jul 15 05:14:08.458124 containerd[1622]: 2025-07-15 05:14:08.403 [INFO][5101] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Jul 15 05:14:08.458124 containerd[1622]: 2025-07-15 05:14:08.410 [INFO][5101] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Jul 15 05:14:08.458124 containerd[1622]: 2025-07-15 05:14:08.411 [INFO][5101] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jul 15 05:14:08.458124 containerd[1622]: 2025-07-15 05:14:08.413 [INFO][5101] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jul 15 05:14:08.458124 containerd[1622]: 2025-07-15 05:14:08.413 [INFO][5101] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.8694c8f4e5a63a2361d3c5c20062f6b20b1da1c672d1ce5f40994899c27d541d" host="localhost" Jul 15 05:14:08.458124 containerd[1622]: 2025-07-15 05:14:08.414 [INFO][5101] 
ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8694c8f4e5a63a2361d3c5c20062f6b20b1da1c672d1ce5f40994899c27d541d Jul 15 05:14:08.458124 containerd[1622]: 2025-07-15 05:14:08.418 [INFO][5101] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.8694c8f4e5a63a2361d3c5c20062f6b20b1da1c672d1ce5f40994899c27d541d" host="localhost" Jul 15 05:14:08.458124 containerd[1622]: 2025-07-15 05:14:08.428 [INFO][5101] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.137/26] block=192.168.88.128/26 handle="k8s-pod-network.8694c8f4e5a63a2361d3c5c20062f6b20b1da1c672d1ce5f40994899c27d541d" host="localhost" Jul 15 05:14:08.458124 containerd[1622]: 2025-07-15 05:14:08.428 [INFO][5101] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.137/26] handle="k8s-pod-network.8694c8f4e5a63a2361d3c5c20062f6b20b1da1c672d1ce5f40994899c27d541d" host="localhost" Jul 15 05:14:08.458124 containerd[1622]: 2025-07-15 05:14:08.428 [INFO][5101] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 15 05:14:08.458124 containerd[1622]: 2025-07-15 05:14:08.428 [INFO][5101] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.137/26] IPv6=[] ContainerID="8694c8f4e5a63a2361d3c5c20062f6b20b1da1c672d1ce5f40994899c27d541d" HandleID="k8s-pod-network.8694c8f4e5a63a2361d3c5c20062f6b20b1da1c672d1ce5f40994899c27d541d" Workload="localhost-k8s-calico--kube--controllers--64cfc7b9f4--zc956-eth0" Jul 15 05:14:08.475330 containerd[1622]: 2025-07-15 05:14:08.430 [INFO][5088] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8694c8f4e5a63a2361d3c5c20062f6b20b1da1c672d1ce5f40994899c27d541d" Namespace="calico-system" Pod="calico-kube-controllers-64cfc7b9f4-zc956" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--64cfc7b9f4--zc956-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--64cfc7b9f4--zc956-eth0", GenerateName:"calico-kube-controllers-64cfc7b9f4-", Namespace:"calico-system", SelfLink:"", UID:"0e8612d6-0455-4a88-b312-2f91253d889c", ResourceVersion:"808", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 13, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"64cfc7b9f4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-64cfc7b9f4-zc956", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.137/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali71ab6322162", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:14:08.475330 containerd[1622]: 2025-07-15 05:14:08.430 [INFO][5088] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.137/32] ContainerID="8694c8f4e5a63a2361d3c5c20062f6b20b1da1c672d1ce5f40994899c27d541d" Namespace="calico-system" Pod="calico-kube-controllers-64cfc7b9f4-zc956" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--64cfc7b9f4--zc956-eth0" Jul 15 05:14:08.475330 containerd[1622]: 2025-07-15 05:14:08.430 [INFO][5088] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali71ab6322162 ContainerID="8694c8f4e5a63a2361d3c5c20062f6b20b1da1c672d1ce5f40994899c27d541d" Namespace="calico-system" Pod="calico-kube-controllers-64cfc7b9f4-zc956" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--64cfc7b9f4--zc956-eth0" Jul 15 05:14:08.475330 containerd[1622]: 2025-07-15 05:14:08.432 [INFO][5088] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8694c8f4e5a63a2361d3c5c20062f6b20b1da1c672d1ce5f40994899c27d541d" Namespace="calico-system" Pod="calico-kube-controllers-64cfc7b9f4-zc956" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--64cfc7b9f4--zc956-eth0" Jul 15 05:14:08.475330 containerd[1622]: 2025-07-15 05:14:08.432 [INFO][5088] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8694c8f4e5a63a2361d3c5c20062f6b20b1da1c672d1ce5f40994899c27d541d" Namespace="calico-system" Pod="calico-kube-controllers-64cfc7b9f4-zc956" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--64cfc7b9f4--zc956-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, 
ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--64cfc7b9f4--zc956-eth0", GenerateName:"calico-kube-controllers-64cfc7b9f4-", Namespace:"calico-system", SelfLink:"", UID:"0e8612d6-0455-4a88-b312-2f91253d889c", ResourceVersion:"808", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 13, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"64cfc7b9f4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"8694c8f4e5a63a2361d3c5c20062f6b20b1da1c672d1ce5f40994899c27d541d", Pod:"calico-kube-controllers-64cfc7b9f4-zc956", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.137/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali71ab6322162", MAC:"c2:da:45:c0:07:a9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:14:08.475330 containerd[1622]: 2025-07-15 05:14:08.456 [INFO][5088] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8694c8f4e5a63a2361d3c5c20062f6b20b1da1c672d1ce5f40994899c27d541d" Namespace="calico-system" Pod="calico-kube-controllers-64cfc7b9f4-zc956" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--64cfc7b9f4--zc956-eth0" Jul 15 05:14:08.550233 containerd[1622]: time="2025-07-15T05:14:08.550142793Z" level=info msg="connecting to shim 
8694c8f4e5a63a2361d3c5c20062f6b20b1da1c672d1ce5f40994899c27d541d" address="unix:///run/containerd/s/f0c17114a6a77e8b68cea97d27a8038c0114ccc2600c918aa407359166d24a1f" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:14:08.570073 systemd[1]: Started cri-containerd-8694c8f4e5a63a2361d3c5c20062f6b20b1da1c672d1ce5f40994899c27d541d.scope - libcontainer container 8694c8f4e5a63a2361d3c5c20062f6b20b1da1c672d1ce5f40994899c27d541d. Jul 15 05:14:08.582375 systemd-resolved[1533]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jul 15 05:14:08.631088 containerd[1622]: time="2025-07-15T05:14:08.630990513Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-64cfc7b9f4-zc956,Uid:0e8612d6-0455-4a88-b312-2f91253d889c,Namespace:calico-system,Attempt:0,} returns sandbox id \"8694c8f4e5a63a2361d3c5c20062f6b20b1da1c672d1ce5f40994899c27d541d\"" Jul 15 05:14:08.712918 kubelet[2951]: I0715 05:14:08.712857 2951 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-2kxrm" podStartSLOduration=39.668127418 podStartE2EDuration="39.668127418s" podCreationTimestamp="2025-07-15 05:13:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 05:14:08.667754133 +0000 UTC m=+46.447064998" watchObservedRunningTime="2025-07-15 05:14:08.668127418 +0000 UTC m=+46.447438279" Jul 15 05:14:09.784087 systemd-networkd[1532]: calia8cef0ffee4: Gained IPv6LL Jul 15 05:14:09.848063 systemd-networkd[1532]: cali71ab6322162: Gained IPv6LL Jul 15 05:14:11.768423 containerd[1622]: time="2025-07-15T05:14:11.768384368Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:11.769341 containerd[1622]: time="2025-07-15T05:14:11.769225542Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=47317977" Jul 15 05:14:11.769753 containerd[1622]: time="2025-07-15T05:14:11.769733805Z" level=info msg="ImageCreate event name:\"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:11.771049 containerd[1622]: time="2025-07-15T05:14:11.771015403Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:11.771863 containerd[1622]: time="2025-07-15T05:14:11.771548387Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 5.512565095s" Jul 15 05:14:11.771863 containerd[1622]: time="2025-07-15T05:14:11.771572016Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Jul 15 05:14:11.775166 containerd[1622]: time="2025-07-15T05:14:11.775135728Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Jul 15 05:14:11.775975 containerd[1622]: time="2025-07-15T05:14:11.775828638Z" level=info msg="CreateContainer within sandbox \"c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jul 15 05:14:11.789642 containerd[1622]: time="2025-07-15T05:14:11.789612928Z" level=info msg="Container b1c0842aedc21c9f9a0866a4e9ce74736b6d9161e081f05c710db1c6c1ba4c7c: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:14:11.847256 containerd[1622]: 
time="2025-07-15T05:14:11.847172519Z" level=info msg="CreateContainer within sandbox \"c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"b1c0842aedc21c9f9a0866a4e9ce74736b6d9161e081f05c710db1c6c1ba4c7c\"" Jul 15 05:14:11.847511 containerd[1622]: time="2025-07-15T05:14:11.847497329Z" level=info msg="StartContainer for \"b1c0842aedc21c9f9a0866a4e9ce74736b6d9161e081f05c710db1c6c1ba4c7c\"" Jul 15 05:14:11.849090 containerd[1622]: time="2025-07-15T05:14:11.849045254Z" level=info msg="connecting to shim b1c0842aedc21c9f9a0866a4e9ce74736b6d9161e081f05c710db1c6c1ba4c7c" address="unix:///run/containerd/s/edb69ba23473d27bc24c96d78413e4328620ba69ad034abdd5ed6ed95dd2ca9f" protocol=ttrpc version=3 Jul 15 05:14:11.908092 systemd[1]: Started cri-containerd-b1c0842aedc21c9f9a0866a4e9ce74736b6d9161e081f05c710db1c6c1ba4c7c.scope - libcontainer container b1c0842aedc21c9f9a0866a4e9ce74736b6d9161e081f05c710db1c6c1ba4c7c. 
Jul 15 05:14:11.971178 containerd[1622]: time="2025-07-15T05:14:11.971137584Z" level=info msg="StartContainer for \"b1c0842aedc21c9f9a0866a4e9ce74736b6d9161e081f05c710db1c6c1ba4c7c\" returns successfully" Jul 15 05:14:12.382587 containerd[1622]: time="2025-07-15T05:14:12.382137368Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:12.383301 containerd[1622]: time="2025-07-15T05:14:12.383284689Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=77" Jul 15 05:14:12.384370 containerd[1622]: time="2025-07-15T05:14:12.384341684Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 609.1758ms" Jul 15 05:14:12.384458 containerd[1622]: time="2025-07-15T05:14:12.384443312Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Jul 15 05:14:12.385871 containerd[1622]: time="2025-07-15T05:14:12.385360754Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\"" Jul 15 05:14:12.389478 containerd[1622]: time="2025-07-15T05:14:12.389165474Z" level=info msg="CreateContainer within sandbox \"db23cb679e1a2c9755af633931c1f3fcaff77dc5a2f0066c9f3651f1fd58a21a\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jul 15 05:14:12.408480 containerd[1622]: time="2025-07-15T05:14:12.408460370Z" level=info msg="Container df310f99f6bca4e6b95904a11390f364dac5ea7e520f2979c95c23d1d26fc6ed: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:14:12.454820 containerd[1622]: 
time="2025-07-15T05:14:12.454794833Z" level=info msg="CreateContainer within sandbox \"db23cb679e1a2c9755af633931c1f3fcaff77dc5a2f0066c9f3651f1fd58a21a\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"df310f99f6bca4e6b95904a11390f364dac5ea7e520f2979c95c23d1d26fc6ed\"" Jul 15 05:14:12.459605 containerd[1622]: time="2025-07-15T05:14:12.459440693Z" level=info msg="StartContainer for \"df310f99f6bca4e6b95904a11390f364dac5ea7e520f2979c95c23d1d26fc6ed\"" Jul 15 05:14:12.460272 containerd[1622]: time="2025-07-15T05:14:12.460260191Z" level=info msg="connecting to shim df310f99f6bca4e6b95904a11390f364dac5ea7e520f2979c95c23d1d26fc6ed" address="unix:///run/containerd/s/295a2013b3fb428e017fa31fdb52a64fa77d25194d5b1c49533da1e77127d279" protocol=ttrpc version=3 Jul 15 05:14:12.486053 systemd[1]: Started cri-containerd-df310f99f6bca4e6b95904a11390f364dac5ea7e520f2979c95c23d1d26fc6ed.scope - libcontainer container df310f99f6bca4e6b95904a11390f364dac5ea7e520f2979c95c23d1d26fc6ed. 
Jul 15 05:14:12.533379 containerd[1622]: time="2025-07-15T05:14:12.533311893Z" level=info msg="StartContainer for \"df310f99f6bca4e6b95904a11390f364dac5ea7e520f2979c95c23d1d26fc6ed\" returns successfully" Jul 15 05:14:12.757108 kubelet[2951]: I0715 05:14:12.757006 2951 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-849d984976-9xsth" podStartSLOduration=25.831400463 podStartE2EDuration="32.756993262s" podCreationTimestamp="2025-07-15 05:13:40 +0000 UTC" firstStartedPulling="2025-07-15 05:14:04.846599066 +0000 UTC m=+42.625909925" lastFinishedPulling="2025-07-15 05:14:11.772191864 +0000 UTC m=+49.551502724" observedRunningTime="2025-07-15 05:14:12.733574022 +0000 UTC m=+50.512884884" watchObservedRunningTime="2025-07-15 05:14:12.756993262 +0000 UTC m=+50.536304121" Jul 15 05:14:12.824318 kubelet[2951]: I0715 05:14:12.823149 2951 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5596dfd6cb-c84wc" podStartSLOduration=25.318764982 podStartE2EDuration="32.823137652s" podCreationTimestamp="2025-07-15 05:13:40 +0000 UTC" firstStartedPulling="2025-07-15 05:14:04.880636517 +0000 UTC m=+42.659947373" lastFinishedPulling="2025-07-15 05:14:12.385009182 +0000 UTC m=+50.164320043" observedRunningTime="2025-07-15 05:14:12.756584696 +0000 UTC m=+50.535895562" watchObservedRunningTime="2025-07-15 05:14:12.823137652 +0000 UTC m=+50.602448519" Jul 15 05:14:13.733935 kubelet[2951]: I0715 05:14:13.733139 2951 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 05:14:13.734447 kubelet[2951]: I0715 05:14:13.734427 2951 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 05:14:19.133679 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1338567962.mount: Deactivated successfully. 
Jul 15 05:14:21.198510 containerd[1622]: time="2025-07-15T05:14:21.198468727Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:21.213275 containerd[1622]: time="2025-07-15T05:14:21.213246571Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.2: active requests=0, bytes read=66352308" Jul 15 05:14:21.255312 containerd[1622]: time="2025-07-15T05:14:21.255250833Z" level=info msg="ImageCreate event name:\"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:21.290440 containerd[1622]: time="2025-07-15T05:14:21.290392386Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:21.298131 containerd[1622]: time="2025-07-15T05:14:21.298046220Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" with image id \"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\", size \"66352154\" in 8.911965839s" Jul 15 05:14:21.298131 containerd[1622]: time="2025-07-15T05:14:21.298070264Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" returns image reference \"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\"" Jul 15 05:14:21.731754 containerd[1622]: time="2025-07-15T05:14:21.731725926Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\"" Jul 15 05:14:21.768911 containerd[1622]: time="2025-07-15T05:14:21.768639690Z" level=info msg="CreateContainer within sandbox 
\"ce54265a1e97cd61b09b661bb894883283b00d23a83af0bf22b9f3ea28982586\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Jul 15 05:14:21.793647 containerd[1622]: time="2025-07-15T05:14:21.793617316Z" level=info msg="Container 30a59435d2bb70bc8869b4952278fc23db1834a92ae5cda2966072452c6e2311: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:14:21.796675 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount913612008.mount: Deactivated successfully. Jul 15 05:14:21.809370 containerd[1622]: time="2025-07-15T05:14:21.809066547Z" level=info msg="CreateContainer within sandbox \"ce54265a1e97cd61b09b661bb894883283b00d23a83af0bf22b9f3ea28982586\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"30a59435d2bb70bc8869b4952278fc23db1834a92ae5cda2966072452c6e2311\"" Jul 15 05:14:21.809524 containerd[1622]: time="2025-07-15T05:14:21.809513726Z" level=info msg="StartContainer for \"30a59435d2bb70bc8869b4952278fc23db1834a92ae5cda2966072452c6e2311\"" Jul 15 05:14:21.815606 containerd[1622]: time="2025-07-15T05:14:21.815557836Z" level=info msg="connecting to shim 30a59435d2bb70bc8869b4952278fc23db1834a92ae5cda2966072452c6e2311" address="unix:///run/containerd/s/ac746f37401f8f5b51ca7f122b4e215182aeba796c8dd243f799c68cb9f6934f" protocol=ttrpc version=3 Jul 15 05:14:21.946014 systemd[1]: Started cri-containerd-30a59435d2bb70bc8869b4952278fc23db1834a92ae5cda2966072452c6e2311.scope - libcontainer container 30a59435d2bb70bc8869b4952278fc23db1834a92ae5cda2966072452c6e2311. 
Jul 15 05:14:22.035800 containerd[1622]: time="2025-07-15T05:14:22.035739797Z" level=info msg="StartContainer for \"30a59435d2bb70bc8869b4952278fc23db1834a92ae5cda2966072452c6e2311\" returns successfully" Jul 15 05:14:23.161066 containerd[1622]: time="2025-07-15T05:14:23.161025493Z" level=info msg="TaskExit event in podsandbox handler container_id:\"30a59435d2bb70bc8869b4952278fc23db1834a92ae5cda2966072452c6e2311\" id:\"a7c1ffb53c1e41b44d0a4c3070d9998358d2c69138416eb86d57c56a10c19ebe\" pid:5336 exit_status:1 exited_at:{seconds:1752556463 nanos:116645481}" Jul 15 05:14:23.310103 containerd[1622]: time="2025-07-15T05:14:23.310076765Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:23.316447 containerd[1622]: time="2025-07-15T05:14:23.314397300Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2: active requests=0, bytes read=14703784" Jul 15 05:14:23.322867 containerd[1622]: time="2025-07-15T05:14:23.319570199Z" level=info msg="ImageCreate event name:\"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:23.325408 containerd[1622]: time="2025-07-15T05:14:23.325394451Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" with image id \"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\", size \"16196439\" in 1.593643593s" Jul 15 05:14:23.333456 containerd[1622]: time="2025-07-15T05:14:23.325457896Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" returns image reference 
\"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\""
Jul 15 05:14:23.333456 containerd[1622]: time="2025-07-15T05:14:23.326174825Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\""
Jul 15 05:14:23.333456 containerd[1622]: time="2025-07-15T05:14:23.329975000Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:14:23.338816 containerd[1622]: time="2025-07-15T05:14:23.338801925Z" level=info msg="CreateContainer within sandbox \"caa6352d83201d3fdceab970cf01694ca47ba893ecadb911f0a18e0b0c42c533\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Jul 15 05:14:23.365031 containerd[1622]: time="2025-07-15T05:14:23.365007369Z" level=info msg="Container 0bcbc7692373f50f41181926a1ec419fe2ef9abd1ef415ae00049bf9fa4f3ee3: CDI devices from CRI Config.CDIDevices: []"
Jul 15 05:14:23.369211 containerd[1622]: time="2025-07-15T05:14:23.369180271Z" level=info msg="CreateContainer within sandbox \"caa6352d83201d3fdceab970cf01694ca47ba893ecadb911f0a18e0b0c42c533\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"0bcbc7692373f50f41181926a1ec419fe2ef9abd1ef415ae00049bf9fa4f3ee3\""
Jul 15 05:14:23.370358 containerd[1622]: time="2025-07-15T05:14:23.369630617Z" level=info msg="StartContainer for \"0bcbc7692373f50f41181926a1ec419fe2ef9abd1ef415ae00049bf9fa4f3ee3\""
Jul 15 05:14:23.370457 containerd[1622]: time="2025-07-15T05:14:23.370426811Z" level=info msg="connecting to shim 0bcbc7692373f50f41181926a1ec419fe2ef9abd1ef415ae00049bf9fa4f3ee3" address="unix:///run/containerd/s/74141b00043189a4a640f59ef7cb1e31e2d39ce9569660251a9b42e3bbf237a9" protocol=ttrpc version=3
Jul 15 05:14:23.389279 systemd[1]: Started cri-containerd-0bcbc7692373f50f41181926a1ec419fe2ef9abd1ef415ae00049bf9fa4f3ee3.scope - libcontainer container 0bcbc7692373f50f41181926a1ec419fe2ef9abd1ef415ae00049bf9fa4f3ee3.
Jul 15 05:14:23.440739 containerd[1622]: time="2025-07-15T05:14:23.440401195Z" level=info msg="StartContainer for \"0bcbc7692373f50f41181926a1ec419fe2ef9abd1ef415ae00049bf9fa4f3ee3\" returns successfully"
Jul 15 05:14:23.774978 containerd[1622]: time="2025-07-15T05:14:23.774692982Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:14:23.775298 containerd[1622]: time="2025-07-15T05:14:23.775286170Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=77"
Jul 15 05:14:23.776923 containerd[1622]: time="2025-07-15T05:14:23.776848442Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 450.659844ms"
Jul 15 05:14:23.777444 containerd[1622]: time="2025-07-15T05:14:23.776882864Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\""
Jul 15 05:14:23.781117 containerd[1622]: time="2025-07-15T05:14:23.780863313Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\""
Jul 15 05:14:23.785737 containerd[1622]: time="2025-07-15T05:14:23.785711944Z" level=info msg="CreateContainer within sandbox \"962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Jul 15 05:14:23.791358 containerd[1622]: time="2025-07-15T05:14:23.789724924Z" level=info msg="Container 5fc3e6c3e2c07577a23fbf7c478d4d2bee6d45cb0603a76e0809a6b1e33cd272: CDI devices from CRI Config.CDIDevices: []"
Jul 15 05:14:23.807792 containerd[1622]: time="2025-07-15T05:14:23.807768030Z" level=info msg="CreateContainer within sandbox \"962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"5fc3e6c3e2c07577a23fbf7c478d4d2bee6d45cb0603a76e0809a6b1e33cd272\""
Jul 15 05:14:23.808924 containerd[1622]: time="2025-07-15T05:14:23.808903545Z" level=info msg="StartContainer for \"5fc3e6c3e2c07577a23fbf7c478d4d2bee6d45cb0603a76e0809a6b1e33cd272\""
Jul 15 05:14:23.809823 containerd[1622]: time="2025-07-15T05:14:23.809804867Z" level=info msg="connecting to shim 5fc3e6c3e2c07577a23fbf7c478d4d2bee6d45cb0603a76e0809a6b1e33cd272" address="unix:///run/containerd/s/aaa30c4364e9ecc10d5163fd939f9325e4a1cced038ae4fda6d98dc407fb37ed" protocol=ttrpc version=3
Jul 15 05:14:23.835002 systemd[1]: Started cri-containerd-5fc3e6c3e2c07577a23fbf7c478d4d2bee6d45cb0603a76e0809a6b1e33cd272.scope - libcontainer container 5fc3e6c3e2c07577a23fbf7c478d4d2bee6d45cb0603a76e0809a6b1e33cd272.
Jul 15 05:14:23.878264 containerd[1622]: time="2025-07-15T05:14:23.878237301Z" level=info msg="StartContainer for \"5fc3e6c3e2c07577a23fbf7c478d4d2bee6d45cb0603a76e0809a6b1e33cd272\" returns successfully"
Jul 15 05:14:23.995986 kubelet[2951]: I0715 05:14:23.991825 2951 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-58fd7646b9-8nvjp" podStartSLOduration=27.111643064 podStartE2EDuration="42.985125884s" podCreationTimestamp="2025-07-15 05:13:41 +0000 UTC" firstStartedPulling="2025-07-15 05:14:05.690650678 +0000 UTC m=+43.469961535" lastFinishedPulling="2025-07-15 05:14:21.564133489 +0000 UTC m=+59.343444355" observedRunningTime="2025-07-15 05:14:23.037404553 +0000 UTC m=+60.816715419" watchObservedRunningTime="2025-07-15 05:14:23.985125884 +0000 UTC m=+61.764436745"
Jul 15 05:14:23.996758 kubelet[2951]: I0715 05:14:23.996657 2951 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-849d984976-ml4gq" podStartSLOduration=28.368917012 podStartE2EDuration="43.996645582s" podCreationTimestamp="2025-07-15 05:13:40 +0000 UTC" firstStartedPulling="2025-07-15 05:14:08.153047356 +0000 UTC m=+45.932358213" lastFinishedPulling="2025-07-15 05:14:23.780775925 +0000 UTC m=+61.560086783" observedRunningTime="2025-07-15 05:14:23.981229729 +0000 UTC m=+61.760540589" watchObservedRunningTime="2025-07-15 05:14:23.996645582 +0000 UTC m=+61.775956443"
Jul 15 05:14:24.289708 containerd[1622]: time="2025-07-15T05:14:24.289592174Z" level=info msg="TaskExit event in podsandbox handler container_id:\"30a59435d2bb70bc8869b4952278fc23db1834a92ae5cda2966072452c6e2311\" id:\"075d726d9cdd4c79aac0f25f07b834e3f4dc765a05301a6f3ee02110e4466865\" pid:5425 exit_status:1 exited_at:{seconds:1752556464 nanos:288381056}"
Jul 15 05:14:24.574560 kubelet[2951]: I0715 05:14:24.571108 2951 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Jul 15 05:14:24.577222 kubelet[2951]: I0715 05:14:24.577058 2951 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Jul 15 05:14:24.929568 kubelet[2951]: I0715 05:14:24.927759 2951 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-5cx6x" podStartSLOduration=24.441930398 podStartE2EDuration="42.924059055s" podCreationTimestamp="2025-07-15 05:13:42 +0000 UTC" firstStartedPulling="2025-07-15 05:14:04.843717971 +0000 UTC m=+42.623028828" lastFinishedPulling="2025-07-15 05:14:23.325846628 +0000 UTC m=+61.105157485" observedRunningTime="2025-07-15 05:14:24.011080952 +0000 UTC m=+61.790391818" watchObservedRunningTime="2025-07-15 05:14:24.924059055 +0000 UTC m=+62.703369921"
Jul 15 05:14:25.041678 containerd[1622]: time="2025-07-15T05:14:25.041641914Z" level=info msg="TaskExit event in podsandbox handler container_id:\"30a59435d2bb70bc8869b4952278fc23db1834a92ae5cda2966072452c6e2311\" id:\"57da84ac320cf6ca5343a6864564a2c4754620c2418d5b74c9cc74fa86f65de6\" pid:5452 exit_status:1 exited_at:{seconds:1752556465 nanos:38842274}"
Jul 15 05:14:26.475253 containerd[1622]: time="2025-07-15T05:14:26.475149209Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:14:26.480657 containerd[1622]: time="2025-07-15T05:14:26.480632442Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.2: active requests=0, bytes read=51276688"
Jul 15 05:14:26.501440 containerd[1622]: time="2025-07-15T05:14:26.500358775Z" level=info msg="ImageCreate event name:\"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:14:26.514474 containerd[1622]: time="2025-07-15T05:14:26.514447815Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:14:26.515965 containerd[1622]: time="2025-07-15T05:14:26.515937212Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" with image id \"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\", size \"52769359\" in 2.734542081s"
Jul 15 05:14:26.516072 containerd[1622]: time="2025-07-15T05:14:26.516060513Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" returns image reference \"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\""
Jul 15 05:14:26.613446 containerd[1622]: time="2025-07-15T05:14:26.613416152Z" level=info msg="CreateContainer within sandbox \"8694c8f4e5a63a2361d3c5c20062f6b20b1da1c672d1ce5f40994899c27d541d\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Jul 15 05:14:26.649256 containerd[1622]: time="2025-07-15T05:14:26.647966985Z" level=info msg="Container f9fed8105608ef656007c2a81c1790b36f8d642ed7e1d96d92ef9a3669b54c64: CDI devices from CRI Config.CDIDevices: []"
Jul 15 05:14:26.652383 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount972960200.mount: Deactivated successfully.
Jul 15 05:14:26.700816 containerd[1622]: time="2025-07-15T05:14:26.700597441Z" level=info msg="CreateContainer within sandbox \"8694c8f4e5a63a2361d3c5c20062f6b20b1da1c672d1ce5f40994899c27d541d\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"f9fed8105608ef656007c2a81c1790b36f8d642ed7e1d96d92ef9a3669b54c64\""
Jul 15 05:14:26.704478 containerd[1622]: time="2025-07-15T05:14:26.703789174Z" level=info msg="StartContainer for \"f9fed8105608ef656007c2a81c1790b36f8d642ed7e1d96d92ef9a3669b54c64\""
Jul 15 05:14:26.720219 containerd[1622]: time="2025-07-15T05:14:26.718411314Z" level=info msg="connecting to shim f9fed8105608ef656007c2a81c1790b36f8d642ed7e1d96d92ef9a3669b54c64" address="unix:///run/containerd/s/f0c17114a6a77e8b68cea97d27a8038c0114ccc2600c918aa407359166d24a1f" protocol=ttrpc version=3
Jul 15 05:14:26.784009 systemd[1]: Started cri-containerd-f9fed8105608ef656007c2a81c1790b36f8d642ed7e1d96d92ef9a3669b54c64.scope - libcontainer container f9fed8105608ef656007c2a81c1790b36f8d642ed7e1d96d92ef9a3669b54c64.
Jul 15 05:14:26.850713 containerd[1622]: time="2025-07-15T05:14:26.850679424Z" level=info msg="StartContainer for \"f9fed8105608ef656007c2a81c1790b36f8d642ed7e1d96d92ef9a3669b54c64\" returns successfully"
Jul 15 05:14:26.964938 kubelet[2951]: I0715 05:14:26.964886 2951 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-64cfc7b9f4-zc956" podStartSLOduration=27.078541655 podStartE2EDuration="44.964871754s" podCreationTimestamp="2025-07-15 05:13:42 +0000 UTC" firstStartedPulling="2025-07-15 05:14:08.63227403 +0000 UTC m=+46.411584887" lastFinishedPulling="2025-07-15 05:14:26.518604125 +0000 UTC m=+64.297914986" observedRunningTime="2025-07-15 05:14:26.964490888 +0000 UTC m=+64.743801750" watchObservedRunningTime="2025-07-15 05:14:26.964871754 +0000 UTC m=+64.744182615"
Jul 15 05:14:27.185108 containerd[1622]: time="2025-07-15T05:14:27.185079682Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f9fed8105608ef656007c2a81c1790b36f8d642ed7e1d96d92ef9a3669b54c64\" id:\"7c49bd07163c2727f47ea6d6aca4d70c8d599aed379713e308bf21ba0d85841e\" pid:5518 exited_at:{seconds:1752556467 nanos:178988137}"
Jul 15 05:14:28.638170 containerd[1622]: time="2025-07-15T05:14:28.638134682Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f31ef7aec4b28a73996fb73611f674898db45a804b4ec558f0aa5880a61ae998\" id:\"985e479f905ef4401eef5c34a82ca9992961193d8664e0339ec8bf31e1a5797c\" pid:5541 exited_at:{seconds:1752556468 nanos:637845152}"
Jul 15 05:14:32.760796 kubelet[2951]: I0715 05:14:32.760748 2951 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jul 15 05:14:46.410762 systemd[1]: Started sshd@7-139.178.70.102:22-147.75.109.163:39276.service - OpenSSH per-connection server daemon (147.75.109.163:39276).
Jul 15 05:14:46.540284 sshd[5587]: Accepted publickey for core from 147.75.109.163 port 39276 ssh2: RSA SHA256:3LzjwJYjOlsZVOOPKAIbtuWjgBjIxwy+KaRX41ju2nA
Jul 15 05:14:46.544466 sshd-session[5587]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:14:46.549944 systemd-logind[1606]: New session 10 of user core.
Jul 15 05:14:46.558203 systemd[1]: Started session-10.scope - Session 10 of User core.
Jul 15 05:14:47.089356 sshd[5590]: Connection closed by 147.75.109.163 port 39276
Jul 15 05:14:47.090493 sshd-session[5587]: pam_unix(sshd:session): session closed for user core
Jul 15 05:14:47.099310 systemd[1]: sshd@7-139.178.70.102:22-147.75.109.163:39276.service: Deactivated successfully.
Jul 15 05:14:47.100491 systemd[1]: session-10.scope: Deactivated successfully.
Jul 15 05:14:47.104442 systemd-logind[1606]: Session 10 logged out. Waiting for processes to exit.
Jul 15 05:14:47.105435 systemd-logind[1606]: Removed session 10.
Jul 15 05:14:50.337062 kubelet[2951]: I0715 05:14:50.337004 2951 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jul 15 05:14:50.694596 systemd[1]: Created slice kubepods-besteffort-podf74f33cb_4f2a_43e5_baac_9cfe205163d3.slice - libcontainer container kubepods-besteffort-podf74f33cb_4f2a_43e5_baac_9cfe205163d3.slice.
Jul 15 05:14:50.808322 containerd[1622]: time="2025-07-15T05:14:50.808287181Z" level=info msg="StopContainer for \"5fc3e6c3e2c07577a23fbf7c478d4d2bee6d45cb0603a76e0809a6b1e33cd272\" with timeout 30 (s)"
Jul 15 05:14:50.931564 kubelet[2951]: I0715 05:14:50.931453 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6m9n\" (UniqueName: \"kubernetes.io/projected/f74f33cb-4f2a-43e5-baac-9cfe205163d3-kube-api-access-g6m9n\") pod \"calico-apiserver-5596dfd6cb-wwjct\" (UID: \"f74f33cb-4f2a-43e5-baac-9cfe205163d3\") " pod="calico-apiserver/calico-apiserver-5596dfd6cb-wwjct"
Jul 15 05:14:50.931564 kubelet[2951]: I0715 05:14:50.931526 2951 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f74f33cb-4f2a-43e5-baac-9cfe205163d3-calico-apiserver-certs\") pod \"calico-apiserver-5596dfd6cb-wwjct\" (UID: \"f74f33cb-4f2a-43e5-baac-9cfe205163d3\") " pod="calico-apiserver/calico-apiserver-5596dfd6cb-wwjct"
Jul 15 05:14:50.936847 containerd[1622]: time="2025-07-15T05:14:50.936723442Z" level=info msg="Stop container \"5fc3e6c3e2c07577a23fbf7c478d4d2bee6d45cb0603a76e0809a6b1e33cd272\" with signal terminated"
Jul 15 05:14:51.163077 systemd[1]: cri-containerd-5fc3e6c3e2c07577a23fbf7c478d4d2bee6d45cb0603a76e0809a6b1e33cd272.scope: Deactivated successfully.
Jul 15 05:14:51.163290 systemd[1]: cri-containerd-5fc3e6c3e2c07577a23fbf7c478d4d2bee6d45cb0603a76e0809a6b1e33cd272.scope: Consumed 763ms CPU time, 66.4M memory peak, 16.7M read from disk.
Jul 15 05:14:51.215769 containerd[1622]: time="2025-07-15T05:14:51.215734673Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5fc3e6c3e2c07577a23fbf7c478d4d2bee6d45cb0603a76e0809a6b1e33cd272\" id:\"5fc3e6c3e2c07577a23fbf7c478d4d2bee6d45cb0603a76e0809a6b1e33cd272\" pid:5391 exit_status:1 exited_at:{seconds:1752556491 nanos:186687191}"
Jul 15 05:14:51.219587 containerd[1622]: time="2025-07-15T05:14:51.219565534Z" level=info msg="received exit event container_id:\"5fc3e6c3e2c07577a23fbf7c478d4d2bee6d45cb0603a76e0809a6b1e33cd272\" id:\"5fc3e6c3e2c07577a23fbf7c478d4d2bee6d45cb0603a76e0809a6b1e33cd272\" pid:5391 exit_status:1 exited_at:{seconds:1752556491 nanos:186687191}"
Jul 15 05:14:51.236425 containerd[1622]: time="2025-07-15T05:14:51.236079824Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f9fed8105608ef656007c2a81c1790b36f8d642ed7e1d96d92ef9a3669b54c64\" id:\"8d66bf70344fa9c2526f2822bf96900d5c3076d497f8ab5f57b3462876cd7e0c\" pid:5631 exited_at:{seconds:1752556491 nanos:235679335}"
Jul 15 05:14:51.271675 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5fc3e6c3e2c07577a23fbf7c478d4d2bee6d45cb0603a76e0809a6b1e33cd272-rootfs.mount: Deactivated successfully.
Jul 15 05:14:51.322068 containerd[1622]: time="2025-07-15T05:14:51.321997339Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5596dfd6cb-wwjct,Uid:f74f33cb-4f2a-43e5-baac-9cfe205163d3,Namespace:calico-apiserver,Attempt:0,}"
Jul 15 05:14:51.340190 containerd[1622]: time="2025-07-15T05:14:51.339492734Z" level=info msg="StopContainer for \"5fc3e6c3e2c07577a23fbf7c478d4d2bee6d45cb0603a76e0809a6b1e33cd272\" returns successfully"
Jul 15 05:14:51.393066 containerd[1622]: time="2025-07-15T05:14:51.393000504Z" level=info msg="StopPodSandbox for \"962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912\""
Jul 15 05:14:51.397376 containerd[1622]: time="2025-07-15T05:14:51.397226303Z" level=info msg="Container to stop \"5fc3e6c3e2c07577a23fbf7c478d4d2bee6d45cb0603a76e0809a6b1e33cd272\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
Jul 15 05:14:51.413415 systemd[1]: cri-containerd-962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912.scope: Deactivated successfully.
Jul 15 05:14:51.419047 containerd[1622]: time="2025-07-15T05:14:51.418969868Z" level=info msg="TaskExit event in podsandbox handler container_id:\"962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912\" id:\"962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912\" pid:5076 exit_status:137 exited_at:{seconds:1752556491 nanos:418131972}"
Jul 15 05:14:51.441961 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912-rootfs.mount: Deactivated successfully.
Jul 15 05:14:51.495391 containerd[1622]: time="2025-07-15T05:14:51.494684083Z" level=info msg="received exit event sandbox_id:\"962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912\" exit_status:137 exited_at:{seconds:1752556491 nanos:418131972}"
Jul 15 05:14:51.497330 containerd[1622]: time="2025-07-15T05:14:51.497247450Z" level=info msg="shim disconnected" id=962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912 namespace=k8s.io
Jul 15 05:14:51.497710 containerd[1622]: time="2025-07-15T05:14:51.497699011Z" level=warning msg="cleaning up after shim disconnected" id=962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912 namespace=k8s.io
Jul 15 05:14:51.525675 containerd[1622]: time="2025-07-15T05:14:51.498429213Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Jul 15 05:14:52.101044 systemd-networkd[1532]: calia8cef0ffee4: Link DOWN
Jul 15 05:14:52.101055 systemd-networkd[1532]: calia8cef0ffee4: Lost carrier
Jul 15 05:14:52.112492 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912-shm.mount: Deactivated successfully.
Jul 15 05:14:52.114695 systemd[1]: Started sshd@8-139.178.70.102:22-147.75.109.163:52764.service - OpenSSH per-connection server daemon (147.75.109.163:52764).
Jul 15 05:14:52.179068 kubelet[2951]: I0715 05:14:52.179043 2951 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912"
Jul 15 05:14:52.300316 sshd[5730]: Accepted publickey for core from 147.75.109.163 port 52764 ssh2: RSA SHA256:3LzjwJYjOlsZVOOPKAIbtuWjgBjIxwy+KaRX41ju2nA
Jul 15 05:14:52.303352 sshd-session[5730]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:14:52.310265 systemd-logind[1606]: New session 11 of user core.
Jul 15 05:14:52.315015 systemd[1]: Started session-11.scope - Session 11 of User core.
Jul 15 05:14:52.658331 systemd-networkd[1532]: cali21ca2de93ff: Link UP
Jul 15 05:14:52.659286 systemd-networkd[1532]: cali21ca2de93ff: Gained carrier
Jul 15 05:14:52.699339 containerd[1622]: 2025-07-15 05:14:52.069 [INFO][5678] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--5596dfd6cb--wwjct-eth0 calico-apiserver-5596dfd6cb- calico-apiserver f74f33cb-4f2a-43e5-baac-9cfe205163d3 1199 0 2025-07-15 05:14:50 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5596dfd6cb projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-5596dfd6cb-wwjct eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali21ca2de93ff [] [] }} ContainerID="1b10f0acffebc78ccda2fb54fbc437ad255c6cd92b3a76406ae450283334fe45" Namespace="calico-apiserver" Pod="calico-apiserver-5596dfd6cb-wwjct" WorkloadEndpoint="localhost-k8s-calico--apiserver--5596dfd6cb--wwjct-"
Jul 15 05:14:52.699339 containerd[1622]: 2025-07-15 05:14:52.074 [INFO][5678] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1b10f0acffebc78ccda2fb54fbc437ad255c6cd92b3a76406ae450283334fe45" Namespace="calico-apiserver" Pod="calico-apiserver-5596dfd6cb-wwjct" WorkloadEndpoint="localhost-k8s-calico--apiserver--5596dfd6cb--wwjct-eth0"
Jul 15 05:14:52.699339 containerd[1622]: 2025-07-15 05:14:52.530 [INFO][5729] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1b10f0acffebc78ccda2fb54fbc437ad255c6cd92b3a76406ae450283334fe45" HandleID="k8s-pod-network.1b10f0acffebc78ccda2fb54fbc437ad255c6cd92b3a76406ae450283334fe45" Workload="localhost-k8s-calico--apiserver--5596dfd6cb--wwjct-eth0"
Jul 15 05:14:52.699339 containerd[1622]: 2025-07-15 05:14:52.532 [INFO][5729] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1b10f0acffebc78ccda2fb54fbc437ad255c6cd92b3a76406ae450283334fe45" HandleID="k8s-pod-network.1b10f0acffebc78ccda2fb54fbc437ad255c6cd92b3a76406ae450283334fe45" Workload="localhost-k8s-calico--apiserver--5596dfd6cb--wwjct-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000342420), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-5596dfd6cb-wwjct", "timestamp":"2025-07-15 05:14:52.530665041 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Jul 15 05:14:52.699339 containerd[1622]: 2025-07-15 05:14:52.533 [INFO][5729] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Jul 15 05:14:52.699339 containerd[1622]: 2025-07-15 05:14:52.533 [INFO][5729] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Jul 15 05:14:52.699339 containerd[1622]: 2025-07-15 05:14:52.534 [INFO][5729] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost'
Jul 15 05:14:52.699339 containerd[1622]: 2025-07-15 05:14:52.568 [INFO][5729] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1b10f0acffebc78ccda2fb54fbc437ad255c6cd92b3a76406ae450283334fe45" host="localhost"
Jul 15 05:14:52.699339 containerd[1622]: 2025-07-15 05:14:52.583 [INFO][5729] ipam/ipam.go 394: Looking up existing affinities for host host="localhost"
Jul 15 05:14:52.699339 containerd[1622]: 2025-07-15 05:14:52.595 [INFO][5729] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost"
Jul 15 05:14:52.699339 containerd[1622]: 2025-07-15 05:14:52.599 [INFO][5729] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost"
Jul 15 05:14:52.699339 containerd[1622]: 2025-07-15 05:14:52.603 [INFO][5729] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost"
Jul 15 05:14:52.699339 containerd[1622]: 2025-07-15 05:14:52.603 [INFO][5729] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.1b10f0acffebc78ccda2fb54fbc437ad255c6cd92b3a76406ae450283334fe45" host="localhost"
Jul 15 05:14:52.699339 containerd[1622]: 2025-07-15 05:14:52.610 [INFO][5729] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1b10f0acffebc78ccda2fb54fbc437ad255c6cd92b3a76406ae450283334fe45
Jul 15 05:14:52.699339 containerd[1622]: 2025-07-15 05:14:52.615 [INFO][5729] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.1b10f0acffebc78ccda2fb54fbc437ad255c6cd92b3a76406ae450283334fe45" host="localhost"
Jul 15 05:14:52.699339 containerd[1622]: 2025-07-15 05:14:52.629 [INFO][5729] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.138/26] block=192.168.88.128/26 handle="k8s-pod-network.1b10f0acffebc78ccda2fb54fbc437ad255c6cd92b3a76406ae450283334fe45" host="localhost"
Jul 15 05:14:52.699339 containerd[1622]: 2025-07-15 05:14:52.629 [INFO][5729] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.138/26] handle="k8s-pod-network.1b10f0acffebc78ccda2fb54fbc437ad255c6cd92b3a76406ae450283334fe45" host="localhost"
Jul 15 05:14:52.699339 containerd[1622]: 2025-07-15 05:14:52.629 [INFO][5729] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Jul 15 05:14:52.699339 containerd[1622]: 2025-07-15 05:14:52.629 [INFO][5729] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.138/26] IPv6=[] ContainerID="1b10f0acffebc78ccda2fb54fbc437ad255c6cd92b3a76406ae450283334fe45" HandleID="k8s-pod-network.1b10f0acffebc78ccda2fb54fbc437ad255c6cd92b3a76406ae450283334fe45" Workload="localhost-k8s-calico--apiserver--5596dfd6cb--wwjct-eth0"
Jul 15 05:14:52.710274 containerd[1622]: 2025-07-15 05:14:52.633 [INFO][5678] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1b10f0acffebc78ccda2fb54fbc437ad255c6cd92b3a76406ae450283334fe45" Namespace="calico-apiserver" Pod="calico-apiserver-5596dfd6cb-wwjct" WorkloadEndpoint="localhost-k8s-calico--apiserver--5596dfd6cb--wwjct-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5596dfd6cb--wwjct-eth0", GenerateName:"calico-apiserver-5596dfd6cb-", Namespace:"calico-apiserver", SelfLink:"", UID:"f74f33cb-4f2a-43e5-baac-9cfe205163d3", ResourceVersion:"1199", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 14, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5596dfd6cb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-5596dfd6cb-wwjct", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.138/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali21ca2de93ff", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Jul 15 05:14:52.710274 containerd[1622]: 2025-07-15 05:14:52.636 [INFO][5678] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.138/32] ContainerID="1b10f0acffebc78ccda2fb54fbc437ad255c6cd92b3a76406ae450283334fe45" Namespace="calico-apiserver" Pod="calico-apiserver-5596dfd6cb-wwjct" WorkloadEndpoint="localhost-k8s-calico--apiserver--5596dfd6cb--wwjct-eth0"
Jul 15 05:14:52.710274 containerd[1622]: 2025-07-15 05:14:52.636 [INFO][5678] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali21ca2de93ff ContainerID="1b10f0acffebc78ccda2fb54fbc437ad255c6cd92b3a76406ae450283334fe45" Namespace="calico-apiserver" Pod="calico-apiserver-5596dfd6cb-wwjct" WorkloadEndpoint="localhost-k8s-calico--apiserver--5596dfd6cb--wwjct-eth0"
Jul 15 05:14:52.710274 containerd[1622]: 2025-07-15 05:14:52.662 [INFO][5678] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1b10f0acffebc78ccda2fb54fbc437ad255c6cd92b3a76406ae450283334fe45" Namespace="calico-apiserver" Pod="calico-apiserver-5596dfd6cb-wwjct" WorkloadEndpoint="localhost-k8s-calico--apiserver--5596dfd6cb--wwjct-eth0"
Jul 15 05:14:52.710274 containerd[1622]: 2025-07-15 05:14:52.666 [INFO][5678] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1b10f0acffebc78ccda2fb54fbc437ad255c6cd92b3a76406ae450283334fe45" Namespace="calico-apiserver" Pod="calico-apiserver-5596dfd6cb-wwjct" WorkloadEndpoint="localhost-k8s-calico--apiserver--5596dfd6cb--wwjct-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5596dfd6cb--wwjct-eth0", GenerateName:"calico-apiserver-5596dfd6cb-", Namespace:"calico-apiserver", SelfLink:"", UID:"f74f33cb-4f2a-43e5-baac-9cfe205163d3", ResourceVersion:"1199", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 14, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5596dfd6cb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"1b10f0acffebc78ccda2fb54fbc437ad255c6cd92b3a76406ae450283334fe45", Pod:"calico-apiserver-5596dfd6cb-wwjct", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.138/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali21ca2de93ff", MAC:"36:87:67:d0:04:8a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Jul 15 05:14:52.710274 containerd[1622]: 2025-07-15 05:14:52.692 [INFO][5678] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1b10f0acffebc78ccda2fb54fbc437ad255c6cd92b3a76406ae450283334fe45" Namespace="calico-apiserver" Pod="calico-apiserver-5596dfd6cb-wwjct" WorkloadEndpoint="localhost-k8s-calico--apiserver--5596dfd6cb--wwjct-eth0"
Jul 15 05:14:52.728621 containerd[1622]: 2025-07-15 05:14:52.078 [INFO][5709] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912"
Jul 15 05:14:52.728621 containerd[1622]: 2025-07-15 05:14:52.079 [INFO][5709] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" iface="eth0" netns="/var/run/netns/cni-e1dc180b-dcf6-2f5f-e55d-3c82430a8a16"
Jul 15 05:14:52.728621 containerd[1622]: 2025-07-15 05:14:52.079 [INFO][5709] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" iface="eth0" netns="/var/run/netns/cni-e1dc180b-dcf6-2f5f-e55d-3c82430a8a16"
Jul 15 05:14:52.728621 containerd[1622]: 2025-07-15 05:14:52.086 [INFO][5709] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" after=7.856564ms iface="eth0" netns="/var/run/netns/cni-e1dc180b-dcf6-2f5f-e55d-3c82430a8a16"
Jul 15 05:14:52.728621 containerd[1622]: 2025-07-15 05:14:52.087 [INFO][5709] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912"
Jul 15 05:14:52.728621 containerd[1622]: 2025-07-15 05:14:52.087 [INFO][5709] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912"
Jul 15 05:14:52.728621 containerd[1622]: 2025-07-15 05:14:52.531 [INFO][5723] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" HandleID="k8s-pod-network.962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" Workload="localhost-k8s-calico--apiserver--849d984976--ml4gq-eth0"
Jul 15 05:14:52.728621 containerd[1622]: 2025-07-15 05:14:52.534 [INFO][5723] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Jul 15 05:14:52.728621 containerd[1622]: 2025-07-15 05:14:52.629 [INFO][5723] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Jul 15 05:14:52.728621 containerd[1622]: 2025-07-15 05:14:52.719 [INFO][5723] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" HandleID="k8s-pod-network.962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" Workload="localhost-k8s-calico--apiserver--849d984976--ml4gq-eth0" Jul 15 05:14:52.728621 containerd[1622]: 2025-07-15 05:14:52.719 [INFO][5723] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" HandleID="k8s-pod-network.962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" Workload="localhost-k8s-calico--apiserver--849d984976--ml4gq-eth0" Jul 15 05:14:52.728621 containerd[1622]: 2025-07-15 05:14:52.720 [INFO][5723] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:14:52.728621 containerd[1622]: 2025-07-15 05:14:52.725 [INFO][5709] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" Jul 15 05:14:52.732147 systemd[1]: run-netns-cni\x2de1dc180b\x2ddcf6\x2d2f5f\x2de55d\x2d3c82430a8a16.mount: Deactivated successfully. 
Jul 15 05:14:52.736961 containerd[1622]: time="2025-07-15T05:14:52.736651192Z" level=info msg="TearDown network for sandbox \"962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912\" successfully" Jul 15 05:14:52.736961 containerd[1622]: time="2025-07-15T05:14:52.736686569Z" level=info msg="StopPodSandbox for \"962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912\" returns successfully" Jul 15 05:14:52.893824 kubelet[2951]: I0715 05:14:52.892637 2951 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fgzb\" (UniqueName: \"kubernetes.io/projected/20e10742-00f4-4932-9581-20e919e16af1-kube-api-access-6fgzb\") pod \"20e10742-00f4-4932-9581-20e919e16af1\" (UID: \"20e10742-00f4-4932-9581-20e919e16af1\") " Jul 15 05:14:52.893824 kubelet[2951]: I0715 05:14:52.892799 2951 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/20e10742-00f4-4932-9581-20e919e16af1-calico-apiserver-certs\") pod \"20e10742-00f4-4932-9581-20e919e16af1\" (UID: \"20e10742-00f4-4932-9581-20e919e16af1\") " Jul 15 05:14:52.916106 systemd[1]: var-lib-kubelet-pods-20e10742\x2d00f4\x2d4932\x2d9581\x2d20e919e16af1-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d6fgzb.mount: Deactivated successfully. Jul 15 05:14:52.923271 systemd[1]: var-lib-kubelet-pods-20e10742\x2d00f4\x2d4932\x2d9581\x2d20e919e16af1-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. Jul 15 05:14:53.229319 kubelet[2951]: I0715 05:14:53.189852 2951 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20e10742-00f4-4932-9581-20e919e16af1-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "20e10742-00f4-4932-9581-20e919e16af1" (UID: "20e10742-00f4-4932-9581-20e919e16af1"). InnerVolumeSpecName "calico-apiserver-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jul 15 05:14:53.244569 containerd[1622]: time="2025-07-15T05:14:53.244420577Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f9fed8105608ef656007c2a81c1790b36f8d642ed7e1d96d92ef9a3669b54c64\" id:\"f33ef6077745cacdb2c20870b57468d231f1438df0f2554ba777187b9616368c\" pid:5803 exited_at:{seconds:1752556493 nanos:243735270}" Jul 15 05:14:53.246299 kubelet[2951]: I0715 05:14:53.246203 2951 reconciler_common.go:293] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/20e10742-00f4-4932-9581-20e919e16af1-calico-apiserver-certs\") on node \"localhost\" DevicePath \"\"" Jul 15 05:14:53.246299 kubelet[2951]: I0715 05:14:53.141989 2951 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20e10742-00f4-4932-9581-20e919e16af1-kube-api-access-6fgzb" (OuterVolumeSpecName: "kube-api-access-6fgzb") pod "20e10742-00f4-4932-9581-20e919e16af1" (UID: "20e10742-00f4-4932-9581-20e919e16af1"). InnerVolumeSpecName "kube-api-access-6fgzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jul 15 05:14:53.295406 systemd[1]: Removed slice kubepods-besteffort-pod20e10742_00f4_4932_9581_20e919e16af1.slice - libcontainer container kubepods-besteffort-pod20e10742_00f4_4932_9581_20e919e16af1.slice. Jul 15 05:14:53.295466 systemd[1]: kubepods-besteffort-pod20e10742_00f4_4932_9581_20e919e16af1.slice: Consumed 784ms CPU time, 67M memory peak, 16.7M read from disk. 
Jul 15 05:14:53.377875 kubelet[2951]: I0715 05:14:53.377051 2951 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fgzb\" (UniqueName: \"kubernetes.io/projected/20e10742-00f4-4932-9581-20e919e16af1-kube-api-access-6fgzb\") on node \"localhost\" DevicePath \"\"" Jul 15 05:14:53.505004 containerd[1622]: time="2025-07-15T05:14:53.504729655Z" level=info msg="connecting to shim 1b10f0acffebc78ccda2fb54fbc437ad255c6cd92b3a76406ae450283334fe45" address="unix:///run/containerd/s/e9fda7583e921d6baa1a89c869ff2a44045f96b2a1e100fc5284386b040f0a3e" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:14:53.979385 systemd[1]: Started cri-containerd-1b10f0acffebc78ccda2fb54fbc437ad255c6cd92b3a76406ae450283334fe45.scope - libcontainer container 1b10f0acffebc78ccda2fb54fbc437ad255c6cd92b3a76406ae450283334fe45. Jul 15 05:14:54.022659 systemd-resolved[1533]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jul 15 05:14:54.137290 systemd-networkd[1532]: cali21ca2de93ff: Gained IPv6LL Jul 15 05:14:54.873853 containerd[1622]: time="2025-07-15T05:14:54.873788170Z" level=info msg="TaskExit event in podsandbox handler container_id:\"30a59435d2bb70bc8869b4952278fc23db1834a92ae5cda2966072452c6e2311\" id:\"52933341e9585888d07c4954682eafdd58308b628358e95f18761c1a57a0fc4b\" pid:5801 exited_at:{seconds:1752556494 nanos:827585865}" Jul 15 05:14:55.400757 containerd[1622]: time="2025-07-15T05:14:55.400725011Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5596dfd6cb-wwjct,Uid:f74f33cb-4f2a-43e5-baac-9cfe205163d3,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"1b10f0acffebc78ccda2fb54fbc437ad255c6cd92b3a76406ae450283334fe45\"" Jul 15 05:14:56.261675 kubelet[2951]: I0715 05:14:56.261643 2951 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20e10742-00f4-4932-9581-20e919e16af1" path="/var/lib/kubelet/pods/20e10742-00f4-4932-9581-20e919e16af1/volumes" Jul 15 
05:14:56.427927 kubelet[2951]: E0715 05:14:56.427835 2951 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf74f33cb_4f2a_43e5_baac_9cfe205163d3.slice/cri-containerd-1b10f0acffebc78ccda2fb54fbc437ad255c6cd92b3a76406ae450283334fe45.scope\": RecentStats: unable to find data in memory cache]" Jul 15 05:14:56.452103 containerd[1622]: time="2025-07-15T05:14:56.452074969Z" level=info msg="CreateContainer within sandbox \"1b10f0acffebc78ccda2fb54fbc437ad255c6cd92b3a76406ae450283334fe45\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jul 15 05:14:56.560495 containerd[1622]: time="2025-07-15T05:14:56.560431186Z" level=info msg="Container 3cbf60641ff63826ca439c65dd616ef98f29e7ce4d8c59605a960eecf8ce6dbb: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:14:56.578486 containerd[1622]: time="2025-07-15T05:14:56.578449937Z" level=info msg="CreateContainer within sandbox \"1b10f0acffebc78ccda2fb54fbc437ad255c6cd92b3a76406ae450283334fe45\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"3cbf60641ff63826ca439c65dd616ef98f29e7ce4d8c59605a960eecf8ce6dbb\"" Jul 15 05:14:56.583391 containerd[1622]: time="2025-07-15T05:14:56.583367710Z" level=info msg="StartContainer for \"3cbf60641ff63826ca439c65dd616ef98f29e7ce4d8c59605a960eecf8ce6dbb\"" Jul 15 05:14:56.594968 sshd[5746]: Connection closed by 147.75.109.163 port 52764 Jul 15 05:14:56.602267 sshd-session[5730]: pam_unix(sshd:session): session closed for user core Jul 15 05:14:56.613599 containerd[1622]: time="2025-07-15T05:14:56.611103942Z" level=info msg="connecting to shim 3cbf60641ff63826ca439c65dd616ef98f29e7ce4d8c59605a960eecf8ce6dbb" address="unix:///run/containerd/s/e9fda7583e921d6baa1a89c869ff2a44045f96b2a1e100fc5284386b040f0a3e" protocol=ttrpc version=3 Jul 15 05:14:56.640410 systemd[1]: sshd@8-139.178.70.102:22-147.75.109.163:52764.service: 
Deactivated successfully. Jul 15 05:14:56.642407 systemd[1]: session-11.scope: Deactivated successfully. Jul 15 05:14:56.643399 systemd-logind[1606]: Session 11 logged out. Waiting for processes to exit. Jul 15 05:14:56.650034 systemd[1]: Started cri-containerd-3cbf60641ff63826ca439c65dd616ef98f29e7ce4d8c59605a960eecf8ce6dbb.scope - libcontainer container 3cbf60641ff63826ca439c65dd616ef98f29e7ce4d8c59605a960eecf8ce6dbb. Jul 15 05:14:56.659582 systemd[1]: Started sshd@9-139.178.70.102:22-147.75.109.163:52768.service - OpenSSH per-connection server daemon (147.75.109.163:52768). Jul 15 05:14:56.660396 systemd-logind[1606]: Removed session 11. Jul 15 05:14:56.703873 containerd[1622]: time="2025-07-15T05:14:56.703848427Z" level=info msg="StartContainer for \"3cbf60641ff63826ca439c65dd616ef98f29e7ce4d8c59605a960eecf8ce6dbb\" returns successfully" Jul 15 05:14:56.783416 sshd[5888]: Accepted publickey for core from 147.75.109.163 port 52768 ssh2: RSA SHA256:3LzjwJYjOlsZVOOPKAIbtuWjgBjIxwy+KaRX41ju2nA Jul 15 05:14:56.785088 sshd-session[5888]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:14:56.792555 systemd-logind[1606]: New session 12 of user core. Jul 15 05:14:56.797977 systemd[1]: Started session-12.scope - Session 12 of User core. Jul 15 05:14:57.143610 sshd[5908]: Connection closed by 147.75.109.163 port 52768 Jul 15 05:14:57.144674 sshd-session[5888]: pam_unix(sshd:session): session closed for user core Jul 15 05:14:57.153253 systemd[1]: sshd@9-139.178.70.102:22-147.75.109.163:52768.service: Deactivated successfully. Jul 15 05:14:57.154375 systemd[1]: session-12.scope: Deactivated successfully. Jul 15 05:14:57.155579 systemd-logind[1606]: Session 12 logged out. Waiting for processes to exit. Jul 15 05:14:57.159350 systemd[1]: Started sshd@10-139.178.70.102:22-147.75.109.163:52774.service - OpenSSH per-connection server daemon (147.75.109.163:52774). Jul 15 05:14:57.160223 systemd-logind[1606]: Removed session 12. 
Jul 15 05:14:57.232917 sshd[5919]: Accepted publickey for core from 147.75.109.163 port 52774 ssh2: RSA SHA256:3LzjwJYjOlsZVOOPKAIbtuWjgBjIxwy+KaRX41ju2nA Jul 15 05:14:57.234004 sshd-session[5919]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:14:57.237850 systemd-logind[1606]: New session 13 of user core. Jul 15 05:14:57.243069 systemd[1]: Started session-13.scope - Session 13 of User core. Jul 15 05:14:57.402880 sshd[5922]: Connection closed by 147.75.109.163 port 52774 Jul 15 05:14:57.402525 sshd-session[5919]: pam_unix(sshd:session): session closed for user core Jul 15 05:14:57.405889 systemd[1]: sshd@10-139.178.70.102:22-147.75.109.163:52774.service: Deactivated successfully. Jul 15 05:14:57.407504 systemd[1]: session-13.scope: Deactivated successfully. Jul 15 05:14:57.409599 systemd-logind[1606]: Session 13 logged out. Waiting for processes to exit. Jul 15 05:14:57.410394 systemd-logind[1606]: Removed session 13. Jul 15 05:14:57.502586 kubelet[2951]: I0715 05:14:57.483407 2951 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5596dfd6cb-wwjct" podStartSLOduration=7.474934199 podStartE2EDuration="7.474934199s" podCreationTimestamp="2025-07-15 05:14:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 05:14:57.472481517 +0000 UTC m=+95.251792382" watchObservedRunningTime="2025-07-15 05:14:57.474934199 +0000 UTC m=+95.254245066" Jul 15 05:14:59.042366 containerd[1622]: time="2025-07-15T05:14:59.042270908Z" level=info msg="StopContainer for \"b1c0842aedc21c9f9a0866a4e9ce74736b6d9161e081f05c710db1c6c1ba4c7c\" with timeout 30 (s)" Jul 15 05:14:59.068077 containerd[1622]: time="2025-07-15T05:14:59.067983682Z" level=info msg="Stop container \"b1c0842aedc21c9f9a0866a4e9ce74736b6d9161e081f05c710db1c6c1ba4c7c\" with signal terminated" Jul 15 05:14:59.093721 systemd[1]: 
cri-containerd-b1c0842aedc21c9f9a0866a4e9ce74736b6d9161e081f05c710db1c6c1ba4c7c.scope: Deactivated successfully. Jul 15 05:14:59.093959 systemd[1]: cri-containerd-b1c0842aedc21c9f9a0866a4e9ce74736b6d9161e081f05c710db1c6c1ba4c7c.scope: Consumed 805ms CPU time, 47.4M memory peak, 2.2M read from disk. Jul 15 05:14:59.105420 containerd[1622]: time="2025-07-15T05:14:59.105396794Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b1c0842aedc21c9f9a0866a4e9ce74736b6d9161e081f05c710db1c6c1ba4c7c\" id:\"b1c0842aedc21c9f9a0866a4e9ce74736b6d9161e081f05c710db1c6c1ba4c7c\" pid:5192 exit_status:1 exited_at:{seconds:1752556499 nanos:103277871}" Jul 15 05:14:59.108516 containerd[1622]: time="2025-07-15T05:14:59.108269421Z" level=info msg="received exit event container_id:\"b1c0842aedc21c9f9a0866a4e9ce74736b6d9161e081f05c710db1c6c1ba4c7c\" id:\"b1c0842aedc21c9f9a0866a4e9ce74736b6d9161e081f05c710db1c6c1ba4c7c\" pid:5192 exit_status:1 exited_at:{seconds:1752556499 nanos:103277871}" Jul 15 05:14:59.140439 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b1c0842aedc21c9f9a0866a4e9ce74736b6d9161e081f05c710db1c6c1ba4c7c-rootfs.mount: Deactivated successfully. Jul 15 05:14:59.165914 containerd[1622]: time="2025-07-15T05:14:59.165844975Z" level=info msg="StopContainer for \"b1c0842aedc21c9f9a0866a4e9ce74736b6d9161e081f05c710db1c6c1ba4c7c\" returns successfully" Jul 15 05:14:59.166551 containerd[1622]: time="2025-07-15T05:14:59.166517640Z" level=info msg="StopPodSandbox for \"c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458\"" Jul 15 05:14:59.170663 containerd[1622]: time="2025-07-15T05:14:59.170627191Z" level=info msg="Container to stop \"b1c0842aedc21c9f9a0866a4e9ce74736b6d9161e081f05c710db1c6c1ba4c7c\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Jul 15 05:14:59.196085 systemd[1]: cri-containerd-c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458.scope: Deactivated successfully. 
Jul 15 05:14:59.201731 containerd[1622]: time="2025-07-15T05:14:59.199024708Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458\" id:\"c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458\" pid:4628 exit_status:137 exited_at:{seconds:1752556499 nanos:197181344}" Jul 15 05:14:59.229105 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458-rootfs.mount: Deactivated successfully. Jul 15 05:14:59.229798 containerd[1622]: time="2025-07-15T05:14:59.229780778Z" level=info msg="shim disconnected" id=c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458 namespace=k8s.io Jul 15 05:14:59.229880 containerd[1622]: time="2025-07-15T05:14:59.229852452Z" level=warning msg="cleaning up after shim disconnected" id=c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458 namespace=k8s.io Jul 15 05:14:59.234982 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458-shm.mount: Deactivated successfully. 
Jul 15 05:14:59.254879 containerd[1622]: time="2025-07-15T05:14:59.229860856Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jul 15 05:14:59.268269 containerd[1622]: time="2025-07-15T05:14:59.268138029Z" level=info msg="received exit event sandbox_id:\"c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458\" exit_status:137 exited_at:{seconds:1752556499 nanos:197181344}" Jul 15 05:14:59.560385 containerd[1622]: time="2025-07-15T05:14:59.559514427Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f31ef7aec4b28a73996fb73611f674898db45a804b4ec558f0aa5880a61ae998\" id:\"05db550890841e62202244b995a612a8c6384c08c9b86c1104b892916d47b199\" pid:5954 exited_at:{seconds:1752556499 nanos:559284601}" Jul 15 05:14:59.573438 kubelet[2951]: I0715 05:14:59.540520 2951 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" Jul 15 05:15:00.902449 systemd-networkd[1532]: calidab1d4645df: Link DOWN Jul 15 05:15:00.902453 systemd-networkd[1532]: calidab1d4645df: Lost carrier Jul 15 05:15:01.298095 containerd[1622]: time="2025-07-15T05:15:01.297977797Z" level=info msg="TaskExit event in podsandbox handler container_id:\"30a59435d2bb70bc8869b4952278fc23db1834a92ae5cda2966072452c6e2311\" id:\"792322506851cb965290af58031e9f95d024b03a6afdce9d97fec83241f9c2c4\" pid:6050 exited_at:{seconds:1752556501 nanos:297651066}" Jul 15 05:15:01.910588 containerd[1622]: 2025-07-15 05:15:00.853 [INFO][6036] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" Jul 15 05:15:01.910588 containerd[1622]: 2025-07-15 05:15:00.862 [INFO][6036] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" iface="eth0" netns="/var/run/netns/cni-95c5ace7-7c52-b40a-af04-8c9a33ed5b39" Jul 15 05:15:01.910588 containerd[1622]: 2025-07-15 05:15:00.866 [INFO][6036] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" iface="eth0" netns="/var/run/netns/cni-95c5ace7-7c52-b40a-af04-8c9a33ed5b39" Jul 15 05:15:01.910588 containerd[1622]: 2025-07-15 05:15:00.881 [INFO][6036] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" after=18.677826ms iface="eth0" netns="/var/run/netns/cni-95c5ace7-7c52-b40a-af04-8c9a33ed5b39" Jul 15 05:15:01.910588 containerd[1622]: 2025-07-15 05:15:00.881 [INFO][6036] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" Jul 15 05:15:01.910588 containerd[1622]: 2025-07-15 05:15:00.881 [INFO][6036] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" Jul 15 05:15:01.910588 containerd[1622]: 2025-07-15 05:15:01.768 [INFO][6073] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" HandleID="k8s-pod-network.c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" Workload="localhost-k8s-calico--apiserver--849d984976--9xsth-eth0" Jul 15 05:15:01.910588 containerd[1622]: 2025-07-15 05:15:01.774 [INFO][6073] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:15:01.910588 containerd[1622]: 2025-07-15 05:15:01.774 [INFO][6073] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 05:15:01.910588 containerd[1622]: 2025-07-15 05:15:01.870 [INFO][6073] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" HandleID="k8s-pod-network.c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" Workload="localhost-k8s-calico--apiserver--849d984976--9xsth-eth0" Jul 15 05:15:01.910588 containerd[1622]: 2025-07-15 05:15:01.870 [INFO][6073] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" HandleID="k8s-pod-network.c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" Workload="localhost-k8s-calico--apiserver--849d984976--9xsth-eth0" Jul 15 05:15:01.910588 containerd[1622]: 2025-07-15 05:15:01.883 [INFO][6073] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:15:01.910588 containerd[1622]: 2025-07-15 05:15:01.887 [INFO][6036] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" Jul 15 05:15:01.963050 containerd[1622]: time="2025-07-15T05:15:01.963016442Z" level=info msg="TearDown network for sandbox \"c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458\" successfully" Jul 15 05:15:01.963050 containerd[1622]: time="2025-07-15T05:15:01.963047191Z" level=info msg="StopPodSandbox for \"c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458\" returns successfully" Jul 15 05:15:01.974345 systemd[1]: run-netns-cni\x2d95c5ace7\x2d7c52\x2db40a\x2daf04\x2d8c9a33ed5b39.mount: Deactivated successfully. Jul 15 05:15:02.444150 systemd[1]: Started sshd@11-139.178.70.102:22-147.75.109.163:50392.service - OpenSSH per-connection server daemon (147.75.109.163:50392). 
Jul 15 05:15:02.699563 sshd[6091]: Accepted publickey for core from 147.75.109.163 port 50392 ssh2: RSA SHA256:3LzjwJYjOlsZVOOPKAIbtuWjgBjIxwy+KaRX41ju2nA Jul 15 05:15:02.706093 sshd-session[6091]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:15:02.723017 systemd-logind[1606]: New session 14 of user core. Jul 15 05:15:02.728266 systemd[1]: Started session-14.scope - Session 14 of User core. Jul 15 05:15:02.848830 kubelet[2951]: I0715 05:15:02.848787 2951 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/21b748ad-b2a9-4a0e-8b89-475f2acae024-calico-apiserver-certs\") pod \"21b748ad-b2a9-4a0e-8b89-475f2acae024\" (UID: \"21b748ad-b2a9-4a0e-8b89-475f2acae024\") " Jul 15 05:15:02.859831 kubelet[2951]: I0715 05:15:02.852041 2951 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4m4n\" (UniqueName: \"kubernetes.io/projected/21b748ad-b2a9-4a0e-8b89-475f2acae024-kube-api-access-s4m4n\") pod \"21b748ad-b2a9-4a0e-8b89-475f2acae024\" (UID: \"21b748ad-b2a9-4a0e-8b89-475f2acae024\") " Jul 15 05:15:03.001382 systemd[1]: var-lib-kubelet-pods-21b748ad\x2db2a9\x2d4a0e\x2d8b89\x2d475f2acae024-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2ds4m4n.mount: Deactivated successfully. Jul 15 05:15:03.001455 systemd[1]: var-lib-kubelet-pods-21b748ad\x2db2a9\x2d4a0e\x2d8b89\x2d475f2acae024-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. Jul 15 05:15:03.019603 kubelet[2951]: I0715 05:15:03.010640 2951 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21b748ad-b2a9-4a0e-8b89-475f2acae024-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "21b748ad-b2a9-4a0e-8b89-475f2acae024" (UID: "21b748ad-b2a9-4a0e-8b89-475f2acae024"). InnerVolumeSpecName "calico-apiserver-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jul 15 05:15:03.024245 kubelet[2951]: I0715 05:15:03.024228 2951 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21b748ad-b2a9-4a0e-8b89-475f2acae024-kube-api-access-s4m4n" (OuterVolumeSpecName: "kube-api-access-s4m4n") pod "21b748ad-b2a9-4a0e-8b89-475f2acae024" (UID: "21b748ad-b2a9-4a0e-8b89-475f2acae024"). InnerVolumeSpecName "kube-api-access-s4m4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jul 15 05:15:03.085233 kubelet[2951]: I0715 05:15:03.085104 2951 reconciler_common.go:293] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/21b748ad-b2a9-4a0e-8b89-475f2acae024-calico-apiserver-certs\") on node \"localhost\" DevicePath \"\"" Jul 15 05:15:03.085233 kubelet[2951]: I0715 05:15:03.085124 2951 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4m4n\" (UniqueName: \"kubernetes.io/projected/21b748ad-b2a9-4a0e-8b89-475f2acae024-kube-api-access-s4m4n\") on node \"localhost\" DevicePath \"\"" Jul 15 05:15:03.303364 systemd[1]: Removed slice kubepods-besteffort-pod21b748ad_b2a9_4a0e_8b89_475f2acae024.slice - libcontainer container kubepods-besteffort-pod21b748ad_b2a9_4a0e_8b89_475f2acae024.slice. Jul 15 05:15:03.303633 systemd[1]: kubepods-besteffort-pod21b748ad_b2a9_4a0e_8b89_475f2acae024.slice: Consumed 828ms CPU time, 48M memory peak, 2.2M read from disk. Jul 15 05:15:03.951084 sshd[6094]: Connection closed by 147.75.109.163 port 50392 Jul 15 05:15:03.955865 sshd-session[6091]: pam_unix(sshd:session): session closed for user core Jul 15 05:15:03.973442 systemd[1]: sshd@11-139.178.70.102:22-147.75.109.163:50392.service: Deactivated successfully. Jul 15 05:15:03.978635 systemd[1]: session-14.scope: Deactivated successfully. Jul 15 05:15:03.979750 systemd-logind[1606]: Session 14 logged out. Waiting for processes to exit. Jul 15 05:15:03.980821 systemd-logind[1606]: Removed session 14. 
Jul 15 05:15:04.353090 kubelet[2951]: I0715 05:15:04.343577 2951 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21b748ad-b2a9-4a0e-8b89-475f2acae024" path="/var/lib/kubelet/pods/21b748ad-b2a9-4a0e-8b89-475f2acae024/volumes" Jul 15 05:15:08.973768 systemd[1]: Started sshd@12-139.178.70.102:22-147.75.109.163:39442.service - OpenSSH per-connection server daemon (147.75.109.163:39442). Jul 15 05:15:09.264157 sshd[6122]: Accepted publickey for core from 147.75.109.163 port 39442 ssh2: RSA SHA256:3LzjwJYjOlsZVOOPKAIbtuWjgBjIxwy+KaRX41ju2nA Jul 15 05:15:09.284125 sshd-session[6122]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:15:09.289500 systemd-logind[1606]: New session 15 of user core. Jul 15 05:15:09.297039 systemd[1]: Started session-15.scope - Session 15 of User core. Jul 15 05:15:09.928923 sshd[6125]: Connection closed by 147.75.109.163 port 39442 Jul 15 05:15:09.929503 sshd-session[6122]: pam_unix(sshd:session): session closed for user core Jul 15 05:15:09.932198 systemd[1]: sshd@12-139.178.70.102:22-147.75.109.163:39442.service: Deactivated successfully. Jul 15 05:15:09.935566 systemd[1]: session-15.scope: Deactivated successfully. Jul 15 05:15:09.938113 systemd-logind[1606]: Session 15 logged out. Waiting for processes to exit. Jul 15 05:15:09.941046 systemd-logind[1606]: Removed session 15. Jul 15 05:15:14.949325 systemd[1]: Started sshd@13-139.178.70.102:22-147.75.109.163:39452.service - OpenSSH per-connection server daemon (147.75.109.163:39452). Jul 15 05:15:15.763627 sshd[6139]: Accepted publickey for core from 147.75.109.163 port 39452 ssh2: RSA SHA256:3LzjwJYjOlsZVOOPKAIbtuWjgBjIxwy+KaRX41ju2nA Jul 15 05:15:15.764609 sshd-session[6139]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:15:15.777823 systemd-logind[1606]: New session 16 of user core. Jul 15 05:15:15.783992 systemd[1]: Started session-16.scope - Session 16 of User core. 
Jul 15 05:15:16.413611 sshd[6142]: Connection closed by 147.75.109.163 port 39452 Jul 15 05:15:16.415714 sshd-session[6139]: pam_unix(sshd:session): session closed for user core Jul 15 05:15:16.419559 systemd[1]: sshd@13-139.178.70.102:22-147.75.109.163:39452.service: Deactivated successfully. Jul 15 05:15:16.422460 systemd[1]: session-16.scope: Deactivated successfully. Jul 15 05:15:16.425613 systemd-logind[1606]: Session 16 logged out. Waiting for processes to exit. Jul 15 05:15:16.427214 systemd-logind[1606]: Removed session 16. Jul 15 05:15:21.424684 systemd[1]: Started sshd@14-139.178.70.102:22-147.75.109.163:43730.service - OpenSSH per-connection server daemon (147.75.109.163:43730). Jul 15 05:15:21.491169 sshd[6162]: Accepted publickey for core from 147.75.109.163 port 43730 ssh2: RSA SHA256:3LzjwJYjOlsZVOOPKAIbtuWjgBjIxwy+KaRX41ju2nA Jul 15 05:15:21.492686 sshd-session[6162]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:15:21.495353 systemd-logind[1606]: New session 17 of user core. Jul 15 05:15:21.501043 systemd[1]: Started session-17.scope - Session 17 of User core. Jul 15 05:15:21.761625 sshd[6165]: Connection closed by 147.75.109.163 port 43730 Jul 15 05:15:21.770852 sshd-session[6162]: pam_unix(sshd:session): session closed for user core Jul 15 05:15:21.772180 systemd[1]: Started sshd@15-139.178.70.102:22-147.75.109.163:43734.service - OpenSSH per-connection server daemon (147.75.109.163:43734). Jul 15 05:15:21.781459 systemd[1]: sshd@14-139.178.70.102:22-147.75.109.163:43730.service: Deactivated successfully. Jul 15 05:15:21.782701 systemd[1]: session-17.scope: Deactivated successfully. Jul 15 05:15:21.791417 systemd-logind[1606]: Session 17 logged out. Waiting for processes to exit. Jul 15 05:15:21.794030 systemd-logind[1606]: Removed session 17. 
Jul 15 05:15:21.870765 sshd[6176]: Accepted publickey for core from 147.75.109.163 port 43734 ssh2: RSA SHA256:3LzjwJYjOlsZVOOPKAIbtuWjgBjIxwy+KaRX41ju2nA Jul 15 05:15:21.871569 sshd-session[6176]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:15:21.874594 systemd-logind[1606]: New session 18 of user core. Jul 15 05:15:21.882057 systemd[1]: Started session-18.scope - Session 18 of User core. Jul 15 05:15:22.274617 sshd[6182]: Connection closed by 147.75.109.163 port 43734 Jul 15 05:15:22.275527 sshd-session[6176]: pam_unix(sshd:session): session closed for user core Jul 15 05:15:22.289746 systemd[1]: sshd@15-139.178.70.102:22-147.75.109.163:43734.service: Deactivated successfully. Jul 15 05:15:22.292909 systemd[1]: session-18.scope: Deactivated successfully. Jul 15 05:15:22.296068 systemd-logind[1606]: Session 18 logged out. Waiting for processes to exit. Jul 15 05:15:22.298162 systemd[1]: Started sshd@16-139.178.70.102:22-147.75.109.163:43736.service - OpenSSH per-connection server daemon (147.75.109.163:43736). Jul 15 05:15:22.299433 systemd-logind[1606]: Removed session 18. Jul 15 05:15:22.669985 sshd[6192]: Accepted publickey for core from 147.75.109.163 port 43736 ssh2: RSA SHA256:3LzjwJYjOlsZVOOPKAIbtuWjgBjIxwy+KaRX41ju2nA Jul 15 05:15:22.671788 sshd-session[6192]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:15:22.682063 systemd-logind[1606]: New session 19 of user core. Jul 15 05:15:22.687686 systemd[1]: Started session-19.scope - Session 19 of User core. 
Jul 15 05:15:22.741134 kubelet[2951]: I0715 05:15:22.731780 2951 scope.go:117] "RemoveContainer" containerID="b1c0842aedc21c9f9a0866a4e9ce74736b6d9161e081f05c710db1c6c1ba4c7c" Jul 15 05:15:22.945881 containerd[1622]: time="2025-07-15T05:15:22.934658949Z" level=info msg="RemoveContainer for \"b1c0842aedc21c9f9a0866a4e9ce74736b6d9161e081f05c710db1c6c1ba4c7c\"" Jul 15 05:15:23.156084 containerd[1622]: time="2025-07-15T05:15:23.155881459Z" level=info msg="RemoveContainer for \"b1c0842aedc21c9f9a0866a4e9ce74736b6d9161e081f05c710db1c6c1ba4c7c\" returns successfully" Jul 15 05:15:23.187244 kubelet[2951]: I0715 05:15:23.187165 2951 scope.go:117] "RemoveContainer" containerID="5fc3e6c3e2c07577a23fbf7c478d4d2bee6d45cb0603a76e0809a6b1e33cd272" Jul 15 05:15:23.205753 containerd[1622]: time="2025-07-15T05:15:23.205681039Z" level=info msg="RemoveContainer for \"5fc3e6c3e2c07577a23fbf7c478d4d2bee6d45cb0603a76e0809a6b1e33cd272\"" Jul 15 05:15:23.635597 containerd[1622]: time="2025-07-15T05:15:23.635572179Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f9fed8105608ef656007c2a81c1790b36f8d642ed7e1d96d92ef9a3669b54c64\" id:\"e452227773aa19b974464dee2d873969108670f1904dfb042d214737c10952e7\" pid:6231 exited_at:{seconds:1752556523 nanos:521206324}" Jul 15 05:15:24.821985 containerd[1622]: time="2025-07-15T05:15:24.821936232Z" level=info msg="RemoveContainer for \"5fc3e6c3e2c07577a23fbf7c478d4d2bee6d45cb0603a76e0809a6b1e33cd272\" returns successfully" Jul 15 05:15:25.011828 containerd[1622]: time="2025-07-15T05:15:25.011799029Z" level=info msg="StopPodSandbox for \"c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458\"" Jul 15 05:15:25.983941 containerd[1622]: time="2025-07-15T05:15:25.979083557Z" level=info msg="TaskExit event in podsandbox handler container_id:\"30a59435d2bb70bc8869b4952278fc23db1834a92ae5cda2966072452c6e2311\" id:\"40700f47af762386cb0186251fad030c35d7f5aee5ea800d3c05972dc76629cb\" pid:6229 exited_at:{seconds:1752556525 
nanos:708724243}" Jul 15 05:15:28.803278 sshd[6197]: Connection closed by 147.75.109.163 port 43736 Jul 15 05:15:28.880993 sshd-session[6192]: pam_unix(sshd:session): session closed for user core Jul 15 05:15:28.946887 systemd[1]: sshd@16-139.178.70.102:22-147.75.109.163:43736.service: Deactivated successfully. Jul 15 05:15:28.952234 systemd[1]: session-19.scope: Deactivated successfully. Jul 15 05:15:28.952479 systemd[1]: session-19.scope: Consumed 448ms CPU time, 87.6M memory peak. Jul 15 05:15:28.955284 systemd-logind[1606]: Session 19 logged out. Waiting for processes to exit. Jul 15 05:15:28.988690 systemd[1]: Started sshd@17-139.178.70.102:22-147.75.109.163:55710.service - OpenSSH per-connection server daemon (147.75.109.163:55710). Jul 15 05:15:28.989392 systemd-logind[1606]: Removed session 19. Jul 15 05:15:29.502972 sshd[6318]: Accepted publickey for core from 147.75.109.163 port 55710 ssh2: RSA SHA256:3LzjwJYjOlsZVOOPKAIbtuWjgBjIxwy+KaRX41ju2nA Jul 15 05:15:29.513555 sshd-session[6318]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:15:29.530982 systemd-logind[1606]: New session 20 of user core. Jul 15 05:15:29.536107 systemd[1]: Started session-20.scope - Session 20 of User core. Jul 15 05:15:30.898562 containerd[1622]: 2025-07-15 05:15:28.504 [WARNING][6284] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" WorkloadEndpoint="localhost-k8s-calico--apiserver--849d984976--9xsth-eth0" Jul 15 05:15:30.898562 containerd[1622]: 2025-07-15 05:15:28.632 [INFO][6284] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" Jul 15 05:15:30.898562 containerd[1622]: 2025-07-15 05:15:28.638 [INFO][6284] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" iface="eth0" netns="" Jul 15 05:15:30.898562 containerd[1622]: 2025-07-15 05:15:28.640 [INFO][6284] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" Jul 15 05:15:30.898562 containerd[1622]: 2025-07-15 05:15:28.642 [INFO][6284] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" Jul 15 05:15:30.898562 containerd[1622]: 2025-07-15 05:15:30.453 [INFO][6294] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" HandleID="k8s-pod-network.c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" Workload="localhost-k8s-calico--apiserver--849d984976--9xsth-eth0" Jul 15 05:15:30.898562 containerd[1622]: 2025-07-15 05:15:30.480 [INFO][6294] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:15:30.898562 containerd[1622]: 2025-07-15 05:15:30.481 [INFO][6294] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 15 05:15:30.898562 containerd[1622]: 2025-07-15 05:15:30.711 [WARNING][6294] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" HandleID="k8s-pod-network.c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" Workload="localhost-k8s-calico--apiserver--849d984976--9xsth-eth0" Jul 15 05:15:30.898562 containerd[1622]: 2025-07-15 05:15:30.711 [INFO][6294] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" HandleID="k8s-pod-network.c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" Workload="localhost-k8s-calico--apiserver--849d984976--9xsth-eth0" Jul 15 05:15:30.898562 containerd[1622]: 2025-07-15 05:15:30.740 [INFO][6294] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:15:30.898562 containerd[1622]: 2025-07-15 05:15:30.760 [INFO][6284] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" Jul 15 05:15:31.030852 containerd[1622]: time="2025-07-15T05:15:31.027512591Z" level=info msg="TearDown network for sandbox \"c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458\" successfully" Jul 15 05:15:31.032781 containerd[1622]: time="2025-07-15T05:15:31.032198746Z" level=info msg="StopPodSandbox for \"c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458\" returns successfully" Jul 15 05:15:32.069121 containerd[1622]: time="2025-07-15T05:15:32.069092112Z" level=info msg="RemovePodSandbox for \"c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458\"" Jul 15 05:15:32.184497 containerd[1622]: time="2025-07-15T05:15:32.110255727Z" level=info msg="Forcibly stopping sandbox \"c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458\"" Jul 15 05:15:36.337203 containerd[1622]: time="2025-07-15T05:15:36.237996013Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f31ef7aec4b28a73996fb73611f674898db45a804b4ec558f0aa5880a61ae998\" 
id:\"de0450762f0709e4fa42d2c3d3a919cb5b9558c069e486d06d5ab93a09e07bc2\" pid:6348 exited_at:{seconds:1752556535 nanos:903036407}" Jul 15 05:15:36.674230 kubelet[2951]: E0715 05:15:36.674175 2951 kubelet.go:2512] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="7.213s" Jul 15 05:15:36.768274 containerd[1622]: 2025-07-15 05:15:35.299 [WARNING][6358] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" WorkloadEndpoint="localhost-k8s-calico--apiserver--849d984976--9xsth-eth0" Jul 15 05:15:36.768274 containerd[1622]: 2025-07-15 05:15:35.347 [INFO][6358] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" Jul 15 05:15:36.768274 containerd[1622]: 2025-07-15 05:15:35.347 [INFO][6358] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" iface="eth0" netns="" Jul 15 05:15:36.768274 containerd[1622]: 2025-07-15 05:15:35.350 [INFO][6358] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" Jul 15 05:15:36.768274 containerd[1622]: 2025-07-15 05:15:35.350 [INFO][6358] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" Jul 15 05:15:36.768274 containerd[1622]: 2025-07-15 05:15:36.625 [INFO][6383] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" HandleID="k8s-pod-network.c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" Workload="localhost-k8s-calico--apiserver--849d984976--9xsth-eth0" Jul 15 05:15:36.768274 containerd[1622]: 2025-07-15 05:15:36.659 [INFO][6383] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:15:36.768274 containerd[1622]: 2025-07-15 05:15:36.664 [INFO][6383] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 15 05:15:36.768274 containerd[1622]: 2025-07-15 05:15:36.733 [WARNING][6383] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" HandleID="k8s-pod-network.c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" Workload="localhost-k8s-calico--apiserver--849d984976--9xsth-eth0" Jul 15 05:15:36.768274 containerd[1622]: 2025-07-15 05:15:36.733 [INFO][6383] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" HandleID="k8s-pod-network.c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" Workload="localhost-k8s-calico--apiserver--849d984976--9xsth-eth0" Jul 15 05:15:36.768274 containerd[1622]: 2025-07-15 05:15:36.738 [INFO][6383] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:15:36.768274 containerd[1622]: 2025-07-15 05:15:36.744 [INFO][6358] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458" Jul 15 05:15:36.777764 containerd[1622]: time="2025-07-15T05:15:36.775737985Z" level=info msg="TearDown network for sandbox \"c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458\" successfully" Jul 15 05:15:36.791055 containerd[1622]: time="2025-07-15T05:15:36.791023451Z" level=info msg="Ensure that sandbox c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458 in task-service has been cleanup successfully" Jul 15 05:15:36.841410 containerd[1622]: time="2025-07-15T05:15:36.841379198Z" level=info msg="RemovePodSandbox \"c2213fd498bddf1f17d8dee538f7eddbc46caf43cd3d2cdecdd2a168a0409458\" returns successfully" Jul 15 05:15:36.894752 containerd[1622]: time="2025-07-15T05:15:36.893754509Z" level=info msg="StopPodSandbox for \"962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912\"" Jul 15 05:15:37.500947 containerd[1622]: 2025-07-15 05:15:37.267 [WARNING][6410] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" WorkloadEndpoint="localhost-k8s-calico--apiserver--849d984976--ml4gq-eth0" Jul 15 05:15:37.500947 containerd[1622]: 2025-07-15 05:15:37.268 [INFO][6410] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" Jul 15 05:15:37.500947 containerd[1622]: 2025-07-15 05:15:37.268 [INFO][6410] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" iface="eth0" netns="" Jul 15 05:15:37.500947 containerd[1622]: 2025-07-15 05:15:37.268 [INFO][6410] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" Jul 15 05:15:37.500947 containerd[1622]: 2025-07-15 05:15:37.268 [INFO][6410] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" Jul 15 05:15:37.500947 containerd[1622]: 2025-07-15 05:15:37.464 [INFO][6418] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" HandleID="k8s-pod-network.962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" Workload="localhost-k8s-calico--apiserver--849d984976--ml4gq-eth0" Jul 15 05:15:37.500947 containerd[1622]: 2025-07-15 05:15:37.469 [INFO][6418] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:15:37.500947 containerd[1622]: 2025-07-15 05:15:37.470 [INFO][6418] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 15 05:15:37.500947 containerd[1622]: 2025-07-15 05:15:37.487 [WARNING][6418] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" HandleID="k8s-pod-network.962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" Workload="localhost-k8s-calico--apiserver--849d984976--ml4gq-eth0" Jul 15 05:15:37.500947 containerd[1622]: 2025-07-15 05:15:37.487 [INFO][6418] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" HandleID="k8s-pod-network.962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" Workload="localhost-k8s-calico--apiserver--849d984976--ml4gq-eth0" Jul 15 05:15:37.500947 containerd[1622]: 2025-07-15 05:15:37.490 [INFO][6418] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:15:37.500947 containerd[1622]: 2025-07-15 05:15:37.497 [INFO][6410] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" Jul 15 05:15:37.500947 containerd[1622]: time="2025-07-15T05:15:37.499398229Z" level=info msg="TearDown network for sandbox \"962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912\" successfully" Jul 15 05:15:37.500947 containerd[1622]: time="2025-07-15T05:15:37.499416891Z" level=info msg="StopPodSandbox for \"962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912\" returns successfully" Jul 15 05:15:37.514189 containerd[1622]: time="2025-07-15T05:15:37.514159410Z" level=info msg="RemovePodSandbox for \"962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912\"" Jul 15 05:15:37.514231 containerd[1622]: time="2025-07-15T05:15:37.514191053Z" level=info msg="Forcibly stopping sandbox \"962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912\"" Jul 15 05:15:37.650125 containerd[1622]: 2025-07-15 05:15:37.603 [WARNING][6433] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" WorkloadEndpoint="localhost-k8s-calico--apiserver--849d984976--ml4gq-eth0" Jul 15 05:15:37.650125 containerd[1622]: 2025-07-15 05:15:37.603 [INFO][6433] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" Jul 15 05:15:37.650125 containerd[1622]: 2025-07-15 05:15:37.603 [INFO][6433] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" iface="eth0" netns="" Jul 15 05:15:37.650125 containerd[1622]: 2025-07-15 05:15:37.603 [INFO][6433] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" Jul 15 05:15:37.650125 containerd[1622]: 2025-07-15 05:15:37.603 [INFO][6433] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" Jul 15 05:15:37.650125 containerd[1622]: 2025-07-15 05:15:37.635 [INFO][6440] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" HandleID="k8s-pod-network.962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" Workload="localhost-k8s-calico--apiserver--849d984976--ml4gq-eth0" Jul 15 05:15:37.650125 containerd[1622]: 2025-07-15 05:15:37.635 [INFO][6440] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:15:37.650125 containerd[1622]: 2025-07-15 05:15:37.635 [INFO][6440] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 15 05:15:37.650125 containerd[1622]: 2025-07-15 05:15:37.640 [WARNING][6440] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" HandleID="k8s-pod-network.962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" Workload="localhost-k8s-calico--apiserver--849d984976--ml4gq-eth0" Jul 15 05:15:37.650125 containerd[1622]: 2025-07-15 05:15:37.640 [INFO][6440] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" HandleID="k8s-pod-network.962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" Workload="localhost-k8s-calico--apiserver--849d984976--ml4gq-eth0" Jul 15 05:15:37.650125 containerd[1622]: 2025-07-15 05:15:37.642 [INFO][6440] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:15:37.650125 containerd[1622]: 2025-07-15 05:15:37.645 [INFO][6433] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912" Jul 15 05:15:37.650125 containerd[1622]: time="2025-07-15T05:15:37.649725697Z" level=info msg="TearDown network for sandbox \"962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912\" successfully" Jul 15 05:15:37.671892 containerd[1622]: time="2025-07-15T05:15:37.671793797Z" level=info msg="Ensure that sandbox 962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912 in task-service has been cleanup successfully" Jul 15 05:15:37.695624 containerd[1622]: time="2025-07-15T05:15:37.695580375Z" level=info msg="RemovePodSandbox \"962ca178abb4c460a9eb67ffbc5ee7c9d4deede42c16cf5902bfcdecc54d2912\" returns successfully" Jul 15 05:15:38.018843 sshd[6321]: Connection closed by 147.75.109.163 port 55710 Jul 15 05:15:38.037522 sshd-session[6318]: pam_unix(sshd:session): session closed for user core Jul 15 05:15:38.124892 systemd[1]: sshd@17-139.178.70.102:22-147.75.109.163:55710.service: Deactivated successfully. Jul 15 05:15:38.127523 systemd[1]: session-20.scope: Deactivated successfully. 
Jul 15 05:15:38.128240 systemd[1]: session-20.scope: Consumed 1.381s CPU time, 60M memory peak. Jul 15 05:15:38.131666 systemd-logind[1606]: Session 20 logged out. Waiting for processes to exit. Jul 15 05:15:38.137859 systemd[1]: Started sshd@18-139.178.70.102:22-147.75.109.163:51634.service - OpenSSH per-connection server daemon (147.75.109.163:51634). Jul 15 05:15:38.140815 systemd-logind[1606]: Removed session 20. Jul 15 05:15:38.286007 sshd[6474]: Accepted publickey for core from 147.75.109.163 port 51634 ssh2: RSA SHA256:3LzjwJYjOlsZVOOPKAIbtuWjgBjIxwy+KaRX41ju2nA Jul 15 05:15:38.288410 sshd-session[6474]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:15:38.295989 systemd-logind[1606]: New session 21 of user core. Jul 15 05:15:38.301083 systemd[1]: Started session-21.scope - Session 21 of User core. Jul 15 05:15:39.669616 sshd[6477]: Connection closed by 147.75.109.163 port 51634 Jul 15 05:15:39.672222 sshd-session[6474]: pam_unix(sshd:session): session closed for user core Jul 15 05:15:39.676221 systemd-logind[1606]: Session 21 logged out. Waiting for processes to exit. Jul 15 05:15:39.677082 systemd[1]: sshd@18-139.178.70.102:22-147.75.109.163:51634.service: Deactivated successfully. Jul 15 05:15:39.680878 systemd[1]: session-21.scope: Deactivated successfully. Jul 15 05:15:39.683627 systemd-logind[1606]: Removed session 21. Jul 15 05:15:44.690444 systemd[1]: Started sshd@19-139.178.70.102:22-147.75.109.163:51646.service - OpenSSH per-connection server daemon (147.75.109.163:51646). Jul 15 05:15:44.838368 sshd[6492]: Accepted publickey for core from 147.75.109.163 port 51646 ssh2: RSA SHA256:3LzjwJYjOlsZVOOPKAIbtuWjgBjIxwy+KaRX41ju2nA Jul 15 05:15:44.860574 sshd-session[6492]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:15:44.875013 systemd-logind[1606]: New session 22 of user core. Jul 15 05:15:44.879004 systemd[1]: Started session-22.scope - Session 22 of User core. 
Jul 15 05:15:45.794258 sshd[6495]: Connection closed by 147.75.109.163 port 51646 Jul 15 05:15:45.795076 sshd-session[6492]: pam_unix(sshd:session): session closed for user core Jul 15 05:15:45.798382 systemd-logind[1606]: Session 22 logged out. Waiting for processes to exit. Jul 15 05:15:45.798750 systemd[1]: sshd@19-139.178.70.102:22-147.75.109.163:51646.service: Deactivated successfully. Jul 15 05:15:45.800527 systemd[1]: session-22.scope: Deactivated successfully. Jul 15 05:15:45.802562 systemd-logind[1606]: Removed session 22.