Sep 12 00:34:04.705179 kernel: Linux version 6.12.46-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Thu Sep 11 22:19:36 -00 2025
Sep 12 00:34:04.705196 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=04dc7b04a37dcb996cc4d6074b142179401d5685abf61ddcbaff7d77d0988990
Sep 12 00:34:04.705203 kernel: Disabled fast string operations
Sep 12 00:34:04.705207 kernel: BIOS-provided physical RAM map:
Sep 12 00:34:04.705211 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable
Sep 12 00:34:04.705216 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved
Sep 12 00:34:04.705221 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved
Sep 12 00:34:04.705225 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable
Sep 12 00:34:04.705230 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data
Sep 12 00:34:04.705234 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS
Sep 12 00:34:04.705238 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable
Sep 12 00:34:04.705242 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved
Sep 12 00:34:04.705247 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved
Sep 12 00:34:04.705251 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Sep 12 00:34:04.705257 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved
Sep 12 00:34:04.705262 kernel: NX (Execute Disable) protection: active
Sep 12 00:34:04.705267 kernel: APIC: Static calls initialized
Sep 12 00:34:04.705272 kernel: SMBIOS 2.7 present.
Sep 12 00:34:04.705277 kernel: DMI: VMware, Inc. VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020
Sep 12 00:34:04.705282 kernel: DMI: Memory slots populated: 1/128
Sep 12 00:34:04.705288 kernel: vmware: hypercall mode: 0x00
Sep 12 00:34:04.705293 kernel: Hypervisor detected: VMware
Sep 12 00:34:04.705297 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz
Sep 12 00:34:04.705302 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz
Sep 12 00:34:04.705307 kernel: vmware: using clock offset of 4480070537 ns
Sep 12 00:34:04.705312 kernel: tsc: Detected 3408.000 MHz processor
Sep 12 00:34:04.705317 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Sep 12 00:34:04.705323 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Sep 12 00:34:04.705327 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000
Sep 12 00:34:04.705332 kernel: total RAM covered: 3072M
Sep 12 00:34:04.705338 kernel: Found optimal setting for mtrr clean up
Sep 12 00:34:04.705346 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G
Sep 12 00:34:04.705351 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs
Sep 12 00:34:04.705356 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Sep 12 00:34:04.705361 kernel: Using GB pages for direct mapping
Sep 12 00:34:04.705366 kernel: ACPI: Early table checksum verification disabled
Sep 12 00:34:04.705371 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD )
Sep 12 00:34:04.705376 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272)
Sep 12 00:34:04.705381 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240)
Sep 12 00:34:04.705387 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001)
Sep 12 00:34:04.705394 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Sep 12 00:34:04.705399 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Sep 12 00:34:04.705404 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001)
Sep 12 00:34:04.705409 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? APIC 06040000 LTP 00000000)
Sep 12 00:34:04.705415 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001)
Sep 12 00:34:04.705421 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001)
Sep 12 00:34:04.705426 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001)
Sep 12 00:34:04.705431 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001)
Sep 12 00:34:04.705436 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66]
Sep 12 00:34:04.705442 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72]
Sep 12 00:34:04.705447 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Sep 12 00:34:04.705452 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Sep 12 00:34:04.705457 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54]
Sep 12 00:34:04.705463 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c]
Sep 12 00:34:04.705469 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea]
Sep 12 00:34:04.705474 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe]
Sep 12 00:34:04.705479 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756]
Sep 12 00:34:04.705484 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e]
Sep 12 00:34:04.705489 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Sep 12 00:34:04.705494 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Sep 12 00:34:04.705499 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug
Sep 12 00:34:04.705505 kernel: NUMA: Node 0 [mem 0x00001000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00001000-0x7fffffff]
Sep 12 00:34:04.705510 kernel: NODE_DATA(0) allocated [mem 0x7fff8dc0-0x7fffffff]
Sep 12 00:34:04.705516 kernel: Zone ranges:
Sep 12 00:34:04.705521 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Sep 12 00:34:04.705526 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff]
Sep 12 00:34:04.705532 kernel: Normal empty
Sep 12 00:34:04.705537 kernel: Device empty
Sep 12 00:34:04.705542 kernel: Movable zone start for each node
Sep 12 00:34:04.705547 kernel: Early memory node ranges
Sep 12 00:34:04.705552 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff]
Sep 12 00:34:04.705557 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff]
Sep 12 00:34:04.705563 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff]
Sep 12 00:34:04.705569 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff]
Sep 12 00:34:04.705574 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 12 00:34:04.705580 kernel: On node 0, zone DMA: 98 pages in unavailable ranges
Sep 12 00:34:04.705585 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges
Sep 12 00:34:04.705590 kernel: ACPI: PM-Timer IO Port: 0x1008
Sep 12 00:34:04.705595 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1])
Sep 12 00:34:04.705600 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1])
Sep 12 00:34:04.705605 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1])
Sep 12 00:34:04.705610 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1])
Sep 12 00:34:04.705617 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1])
Sep 12 00:34:04.705622 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1])
Sep 12 00:34:04.705627 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1])
Sep 12 00:34:04.705632 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1])
Sep 12 00:34:04.705637 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1])
Sep 12 00:34:04.705642 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1])
Sep 12 00:34:04.705647 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1])
Sep 12 00:34:04.705653 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1])
Sep 12 00:34:04.705658 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1])
Sep 12 00:34:04.705664 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1])
Sep 12 00:34:04.705669 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1])
Sep 12 00:34:04.705674 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1])
Sep 12 00:34:04.705680 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1])
Sep 12 00:34:04.705685 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1])
Sep 12 00:34:04.705690 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1])
Sep 12 00:34:04.705695 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1])
Sep 12 00:34:04.705700 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1])
Sep 12 00:34:04.705705 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1])
Sep 12 00:34:04.705710 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1])
Sep 12 00:34:04.705717 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1])
Sep 12 00:34:04.705722 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1])
Sep 12 00:34:04.705727 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1])
Sep 12 00:34:04.705732 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1])
Sep 12 00:34:04.705737 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1])
Sep 12 00:34:04.705742 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1])
Sep 12 00:34:04.705747 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1])
Sep 12 00:34:04.705752 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1])
Sep 12 00:34:04.705758 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1])
Sep 12 00:34:04.705763 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1])
Sep 12 00:34:04.705769 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1])
Sep 12 00:34:04.705775 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1])
Sep 12 00:34:04.705779 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1])
Sep 12 00:34:04.705784 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1])
Sep 12 00:34:04.705789 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1])
Sep 12 00:34:04.705794 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1])
Sep 12 00:34:04.705800 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1])
Sep 12 00:34:04.705809 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1])
Sep 12 00:34:04.705815 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1])
Sep 12 00:34:04.705820 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1])
Sep 12 00:34:04.705826 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1])
Sep 12 00:34:04.705832 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1])
Sep 12 00:34:04.705837 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1])
Sep 12 00:34:04.705843 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1])
Sep 12 00:34:04.705849 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1])
Sep 12 00:34:04.705854 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1])
Sep 12 00:34:04.705859 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1])
Sep 12 00:34:04.705865 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1])
Sep 12 00:34:04.705871 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1])
Sep 12 00:34:04.705877 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1])
Sep 12 00:34:04.705882 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1])
Sep 12 00:34:04.705887 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1])
Sep 12 00:34:04.705893 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1])
Sep 12 00:34:04.705898 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1])
Sep 12 00:34:04.705903 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1])
Sep 12 00:34:04.705909 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1])
Sep 12 00:34:04.705915 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1])
Sep 12 00:34:04.705920 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1])
Sep 12 00:34:04.705927 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1])
Sep 12 00:34:04.705932 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1])
Sep 12 00:34:04.705938 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1])
Sep 12 00:34:04.705943 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1])
Sep 12 00:34:04.705948 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1])
Sep 12 00:34:04.705954 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1])
Sep 12 00:34:04.705959 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1])
Sep 12 00:34:04.705965 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1])
Sep 12 00:34:04.705970 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1])
Sep 12 00:34:04.705975 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1])
Sep 12 00:34:04.705982 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1])
Sep 12 00:34:04.705988 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1])
Sep 12 00:34:04.705993 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1])
Sep 12 00:34:04.705999 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1])
Sep 12 00:34:04.706004 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1])
Sep 12 00:34:04.706009 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1])
Sep 12 00:34:04.706015 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1])
Sep 12 00:34:04.706020 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1])
Sep 12 00:34:04.706025 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1])
Sep 12 00:34:04.706032 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1])
Sep 12 00:34:04.706038 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1])
Sep 12 00:34:04.706043 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1])
Sep 12 00:34:04.706049 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1])
Sep 12 00:34:04.706054 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1])
Sep 12 00:34:04.706059 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1])
Sep 12 00:34:04.706065 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1])
Sep 12 00:34:04.706070 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1])
Sep 12 00:34:04.706075 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1])
Sep 12 00:34:04.706124 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1])
Sep 12 00:34:04.706132 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1])
Sep 12 00:34:04.706138 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1])
Sep 12 00:34:04.706144 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1])
Sep 12 00:34:04.706149 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1])
Sep 12 00:34:04.706154 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1])
Sep 12 00:34:04.706160 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1])
Sep 12 00:34:04.706165 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1])
Sep 12 00:34:04.706170 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1])
Sep 12 00:34:04.706176 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1])
Sep 12 00:34:04.706181 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1])
Sep 12 00:34:04.706188 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1])
Sep 12 00:34:04.706194 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1])
Sep 12 00:34:04.706199 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1])
Sep 12 00:34:04.706204 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1])
Sep 12 00:34:04.706210 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1])
Sep 12 00:34:04.706215 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1])
Sep 12 00:34:04.706221 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1])
Sep 12 00:34:04.706226 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1])
Sep 12 00:34:04.706232 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1])
Sep 12 00:34:04.706237 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1])
Sep 12 00:34:04.706244 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1])
Sep 12 00:34:04.706249 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1])
Sep 12 00:34:04.706255 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1])
Sep 12 00:34:04.706260 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1])
Sep 12 00:34:04.706265 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1])
Sep 12 00:34:04.706271 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1])
Sep 12 00:34:04.706276 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1])
Sep 12 00:34:04.706281 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1])
Sep 12 00:34:04.706287 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1])
Sep 12 00:34:04.706292 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1])
Sep 12 00:34:04.706299 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1])
Sep 12 00:34:04.706305 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1])
Sep 12 00:34:04.706311 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1])
Sep 12 00:34:04.706316 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1])
Sep 12 00:34:04.706322 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1])
Sep 12 00:34:04.706327 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1])
Sep 12 00:34:04.706333 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1])
Sep 12 00:34:04.706338 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1])
Sep 12 00:34:04.706344 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23
Sep 12 00:34:04.706350 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge)
Sep 12 00:34:04.706356 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Sep 12 00:34:04.706362 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000
Sep 12 00:34:04.706368 kernel: TSC deadline timer available
Sep 12 00:34:04.706373 kernel: CPU topo: Max. logical packages: 128
Sep 12 00:34:04.706379 kernel: CPU topo: Max. logical dies: 128
Sep 12 00:34:04.706384 kernel: CPU topo: Max. dies per package: 1
Sep 12 00:34:04.706390 kernel: CPU topo: Max. threads per core: 1
Sep 12 00:34:04.706395 kernel: CPU topo: Num. cores per package: 1
Sep 12 00:34:04.706401 kernel: CPU topo: Num. threads per package: 1
Sep 12 00:34:04.706407 kernel: CPU topo: Allowing 2 present CPUs plus 126 hotplug CPUs
Sep 12 00:34:04.706413 kernel: [mem 0x80000000-0xefffffff] available for PCI devices
Sep 12 00:34:04.706418 kernel: Booting paravirtualized kernel on VMware hypervisor
Sep 12 00:34:04.706424 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Sep 12 00:34:04.706430 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1
Sep 12 00:34:04.706435 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144
Sep 12 00:34:04.706441 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152
Sep 12 00:34:04.706447 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007
Sep 12 00:34:04.706452 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015
Sep 12 00:34:04.706459 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023
Sep 12 00:34:04.706464 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031
Sep 12 00:34:04.706470 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039
Sep 12 00:34:04.706475 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047
Sep 12 00:34:04.706480 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055
Sep 12 00:34:04.706486 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063
Sep 12 00:34:04.706491 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071
Sep 12 00:34:04.706497 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079
Sep 12 00:34:04.706502 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087
Sep 12 00:34:04.706509 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095
Sep 12 00:34:04.706515 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103
Sep 12 00:34:04.706520 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111
Sep 12 00:34:04.706525 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119
Sep 12 00:34:04.706531 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127
Sep 12 00:34:04.706537 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=04dc7b04a37dcb996cc4d6074b142179401d5685abf61ddcbaff7d77d0988990
Sep 12 00:34:04.706543 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 12 00:34:04.706550 kernel: random: crng init done
Sep 12 00:34:04.706555 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes
Sep 12 00:34:04.706561 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes
Sep 12 00:34:04.706566 kernel: printk: log_buf_len min size: 262144 bytes
Sep 12 00:34:04.706572 kernel: printk: log_buf_len: 1048576 bytes
Sep 12 00:34:04.706583 kernel: printk: early log buf free: 245576(93%)
Sep 12 00:34:04.706589 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 12 00:34:04.706594 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Sep 12 00:34:04.706600 kernel: Fallback order for Node 0: 0
Sep 12 00:34:04.706606 kernel: Built 1 zonelists, mobility grouping on. Total pages: 524157
Sep 12 00:34:04.706613 kernel: Policy zone: DMA32
Sep 12 00:34:04.706618 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 12 00:34:04.706624 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1
Sep 12 00:34:04.706629 kernel: ftrace: allocating 40120 entries in 157 pages
Sep 12 00:34:04.706635 kernel: ftrace: allocated 157 pages with 5 groups
Sep 12 00:34:04.706640 kernel: Dynamic Preempt: voluntary
Sep 12 00:34:04.706646 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 12 00:34:04.706651 kernel: rcu: RCU event tracing is enabled.
Sep 12 00:34:04.706657 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128.
Sep 12 00:34:04.706664 kernel: Trampoline variant of Tasks RCU enabled.
Sep 12 00:34:04.706670 kernel: Rude variant of Tasks RCU enabled.
Sep 12 00:34:04.706676 kernel: Tracing variant of Tasks RCU enabled.
Sep 12 00:34:04.706681 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 12 00:34:04.706687 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128
Sep 12 00:34:04.706692 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Sep 12 00:34:04.706698 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Sep 12 00:34:04.706704 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Sep 12 00:34:04.706709 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16
Sep 12 00:34:04.706716 kernel: rcu: srcu_init: Setting srcu_struct sizes to big.
Sep 12 00:34:04.706721 kernel: Console: colour VGA+ 80x25
Sep 12 00:34:04.706727 kernel: printk: legacy console [tty0] enabled
Sep 12 00:34:04.706732 kernel: printk: legacy console [ttyS0] enabled
Sep 12 00:34:04.706738 kernel: ACPI: Core revision 20240827
Sep 12 00:34:04.706744 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns
Sep 12 00:34:04.706750 kernel: APIC: Switch to symmetric I/O mode setup
Sep 12 00:34:04.706755 kernel: x2apic enabled
Sep 12 00:34:04.706761 kernel: APIC: Switched APIC routing to: physical x2apic
Sep 12 00:34:04.706767 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Sep 12 00:34:04.706773 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns
Sep 12 00:34:04.706779 kernel: Calibrating delay loop (skipped) preset value.. 6816.00 BogoMIPS (lpj=3408000)
Sep 12 00:34:04.706785 kernel: Disabled fast string operations
Sep 12 00:34:04.706791 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Sep 12 00:34:04.706796 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4
Sep 12 00:34:04.706802 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Sep 12 00:34:04.706807 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit
Sep 12 00:34:04.706813 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Sep 12 00:34:04.706819 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Sep 12 00:34:04.706825 kernel: RETBleed: Mitigation: Enhanced IBRS
Sep 12 00:34:04.706831 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Sep 12 00:34:04.706836 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Sep 12 00:34:04.706842 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Sep 12 00:34:04.706847 kernel: SRBDS: Unknown: Dependent on hypervisor status
Sep 12 00:34:04.706853 kernel: GDS: Unknown: Dependent on hypervisor status
Sep 12 00:34:04.706858 kernel: active return thunk: its_return_thunk
Sep 12 00:34:04.706864 kernel: ITS: Mitigation: Aligned branch/return thunks
Sep 12 00:34:04.706871 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Sep 12 00:34:04.706876 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Sep 12 00:34:04.706882 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Sep 12 00:34:04.706888 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Sep 12 00:34:04.706893 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Sep 12 00:34:04.706899 kernel: Freeing SMP alternatives memory: 32K
Sep 12 00:34:04.706904 kernel: pid_max: default: 131072 minimum: 1024
Sep 12 00:34:04.706910 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 12 00:34:04.706916 kernel: landlock: Up and running.
Sep 12 00:34:04.706922 kernel: SELinux: Initializing.
Sep 12 00:34:04.706928 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Sep 12 00:34:04.706933 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Sep 12 00:34:04.706939 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd)
Sep 12 00:34:04.706944 kernel: Performance Events: Skylake events, core PMU driver.
Sep 12 00:34:04.706950 kernel: core: CPUID marked event: 'cpu cycles' unavailable
Sep 12 00:34:04.706956 kernel: core: CPUID marked event: 'instructions' unavailable
Sep 12 00:34:04.706961 kernel: core: CPUID marked event: 'bus cycles' unavailable
Sep 12 00:34:04.706967 kernel: core: CPUID marked event: 'cache references' unavailable
Sep 12 00:34:04.706973 kernel: core: CPUID marked event: 'cache misses' unavailable
Sep 12 00:34:04.706978 kernel: core: CPUID marked event: 'branch instructions' unavailable
Sep 12 00:34:04.706984 kernel: core: CPUID marked event: 'branch misses' unavailable
Sep 12 00:34:04.706989 kernel: ... version: 1
Sep 12 00:34:04.706995 kernel: ... bit width: 48
Sep 12 00:34:04.707000 kernel: ... generic registers: 4
Sep 12 00:34:04.707006 kernel: ... value mask: 0000ffffffffffff
Sep 12 00:34:04.707012 kernel: ... max period: 000000007fffffff
Sep 12 00:34:04.707017 kernel: ... fixed-purpose events: 0
Sep 12 00:34:04.707024 kernel: ... event mask: 000000000000000f
Sep 12 00:34:04.707030 kernel: signal: max sigframe size: 1776
Sep 12 00:34:04.707036 kernel: rcu: Hierarchical SRCU implementation.
Sep 12 00:34:04.707041 kernel: rcu: Max phase no-delay instances is 400.
Sep 12 00:34:04.707047 kernel: Timer migration: 3 hierarchy levels; 8 children per group; 3 crossnode level
Sep 12 00:34:04.707053 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Sep 12 00:34:04.707059 kernel: smp: Bringing up secondary CPUs ...
Sep 12 00:34:04.707064 kernel: smpboot: x86: Booting SMP configuration:
Sep 12 00:34:04.707070 kernel: .... node #0, CPUs: #1
Sep 12 00:34:04.707077 kernel: Disabled fast string operations
Sep 12 00:34:04.707099 kernel: smp: Brought up 1 node, 2 CPUs
Sep 12 00:34:04.707105 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS)
Sep 12 00:34:04.707111 kernel: Memory: 1926292K/2096628K available (14336K kernel code, 2432K rwdata, 9960K rodata, 53836K init, 1080K bss, 158960K reserved, 0K cma-reserved)
Sep 12 00:34:04.707117 kernel: devtmpfs: initialized
Sep 12 00:34:04.707122 kernel: x86/mm: Memory block size: 128MB
Sep 12 00:34:04.707128 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes)
Sep 12 00:34:04.707134 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 12 00:34:04.707139 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear)
Sep 12 00:34:04.707147 kernel: pinctrl core: initialized pinctrl subsystem
Sep 12 00:34:04.707152 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 12 00:34:04.707158 kernel: audit: initializing netlink subsys (disabled)
Sep 12 00:34:04.707164 kernel: audit: type=2000 audit(1757637241.280:1): state=initialized audit_enabled=0 res=1
Sep 12 00:34:04.707169 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 12 00:34:04.707175 kernel: thermal_sys: Registered thermal governor 'user_space'
Sep 12 00:34:04.707180 kernel: cpuidle: using governor menu
Sep 12 00:34:04.707186 kernel: Simple Boot Flag at 0x36 set to 0x80
Sep 12 00:34:04.707191 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 12 00:34:04.707198 kernel: dca service started, version 1.12.1
Sep 12 00:34:04.707211 kernel: PCI: ECAM [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) for domain 0000 [bus 00-7f]
Sep 12 00:34:04.707218 kernel: PCI: Using configuration type 1 for base access
Sep 12 00:34:04.707224 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Sep 12 00:34:04.707230 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 12 00:34:04.707236 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Sep 12 00:34:04.707242 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 12 00:34:04.707248 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Sep 12 00:34:04.707254 kernel: ACPI: Added _OSI(Module Device)
Sep 12 00:34:04.707261 kernel: ACPI: Added _OSI(Processor Device)
Sep 12 00:34:04.707268 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 12 00:34:04.707274 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 12 00:34:04.707279 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored
Sep 12 00:34:04.707285 kernel: ACPI: Interpreter enabled
Sep 12 00:34:04.707291 kernel: ACPI: PM: (supports S0 S1 S5)
Sep 12 00:34:04.707298 kernel: ACPI: Using IOAPIC for interrupt routing
Sep 12 00:34:04.707304 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Sep 12 00:34:04.707310 kernel: PCI: Using E820 reservations for host bridge windows
Sep 12 00:34:04.707317 kernel: ACPI: Enabled 4 GPEs in block 00 to 0F
Sep 12 00:34:04.707323 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f])
Sep 12 00:34:04.707419 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 12 00:34:04.707477 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR]
Sep 12 00:34:04.707526 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability]
Sep 12 00:34:04.707535 kernel: PCI host bridge to bus 0000:00
Sep 12 00:34:04.707589 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Sep 12 00:34:04.707638 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window]
Sep 12 00:34:04.707681 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Sep 12 00:34:04.707724 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Sep 12 00:34:04.707767 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window]
Sep 12 00:34:04.707810 kernel: pci_bus 0000:00: root bus resource [bus 00-7f]
Sep 12 00:34:04.707869 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 conventional PCI endpoint
Sep 12 00:34:04.707930 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 conventional PCI bridge
Sep 12 00:34:04.707981 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Sep 12 00:34:04.708037 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 conventional PCI endpoint
Sep 12 00:34:04.708266 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a conventional PCI endpoint
Sep 12 00:34:04.708324 kernel: pci 0000:00:07.1: BAR 4 [io 0x1060-0x106f]
Sep 12 00:34:04.708374 kernel: pci 0000:00:07.1: BAR 0 [io 0x01f0-0x01f7]: legacy IDE quirk
Sep 12 00:34:04.708424 kernel: pci 0000:00:07.1: BAR 1 [io 0x03f6]: legacy IDE quirk
Sep 12 00:34:04.708473 kernel: pci 0000:00:07.1: BAR 2 [io 0x0170-0x0177]: legacy IDE quirk
Sep 12 00:34:04.708522 kernel: pci 0000:00:07.1: BAR 3 [io 0x0376]: legacy IDE quirk
Sep 12 00:34:04.708576 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Sep 12 00:34:04.708634 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed by PIIX4 ACPI
Sep 12 00:34:04.708683 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB
Sep 12 00:34:04.708738 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000 conventional PCI endpoint
Sep 12 00:34:04.708789 kernel: pci 0000:00:07.7: BAR 0 [io 0x1080-0x10bf]
Sep 12 00:34:04.708838 kernel: pci 0000:00:07.7: BAR 1 [mem 0xfebfe000-0xfebfffff 64bit]
Sep 12 00:34:04.708892 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 conventional PCI endpoint
Sep 12 00:34:04.708941 kernel: pci 0000:00:0f.0: BAR 0 [io 0x1070-0x107f]
Sep 12 00:34:04.708993 kernel: pci 0000:00:0f.0: BAR 1 [mem 0xe8000000-0xefffffff pref]
Sep 12 00:34:04.709041 kernel: pci 0000:00:0f.0: BAR 2 [mem 0xfe000000-0xfe7fffff]
Sep 12 00:34:04.709098 kernel: pci 0000:00:0f.0: ROM [mem 0x00000000-0x00007fff pref]
Sep 12 00:34:04.709148 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Sep 12 00:34:04.709203 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 conventional PCI bridge
Sep 12 00:34:04.709509 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode)
Sep 12 00:34:04.709562 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff]
Sep 12 00:34:04.709623 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff]
Sep 12 00:34:04.709673 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref]
Sep 12 00:34:04.709731 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Sep 12 00:34:04.709783 kernel: pci 0000:00:15.0: PCI bridge to [bus 03]
Sep 12 00:34:04.709833 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff]
Sep 12 00:34:04.709885 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff]
Sep 12 00:34:04.709936 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold
Sep 12 00:34:04.709992 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Sep 12 00:34:04.710043 kernel: pci 0000:00:15.1: PCI bridge to [bus 04]
Sep 12 00:34:04.710104 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff]
Sep 12 00:34:04.710155 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff]
Sep 12 00:34:04.710206 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref]
Sep 12 00:34:04.710255 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold
Sep 12 00:34:04.710310 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Sep 12 00:34:04.710364 kernel: pci 0000:00:15.2: PCI bridge to [bus 05]
Sep 12 00:34:04.710414 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff]
Sep 12 00:34:04.710464 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Sep 12 00:34:04.710513 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Sep 12 00:34:04.710564 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold Sep 12 00:34:04.710620 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 00:34:04.710675 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Sep 12 00:34:04.710725 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Sep 12 00:34:04.710777 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Sep 12 00:34:04.710828 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold Sep 12 00:34:04.710883 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 00:34:04.710935 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Sep 12 00:34:04.710986 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Sep 12 00:34:04.711038 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Sep 12 00:34:04.711103 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold Sep 12 00:34:04.711160 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 00:34:04.711210 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Sep 12 00:34:04.711259 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Sep 12 00:34:04.711308 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Sep 12 00:34:04.711357 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot D3cold Sep 12 00:34:04.711416 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 00:34:04.711467 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Sep 12 00:34:04.711516 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Sep 12 00:34:04.711566 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Sep 12 
00:34:04.711615 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold Sep 12 00:34:04.711669 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 00:34:04.711719 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Sep 12 00:34:04.711771 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Sep 12 00:34:04.711821 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Sep 12 00:34:04.711871 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold Sep 12 00:34:04.711927 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 00:34:04.711978 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Sep 12 00:34:04.712027 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Sep 12 00:34:04.712089 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Sep 12 00:34:04.712155 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold Sep 12 00:34:04.712215 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 00:34:04.712267 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Sep 12 00:34:04.712316 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Sep 12 00:34:04.712366 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Sep 12 00:34:04.712415 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Sep 12 00:34:04.712464 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold Sep 12 00:34:04.712518 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 00:34:04.712572 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Sep 12 00:34:04.713121 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Sep 12 00:34:04.713185 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Sep 12 00:34:04.713239 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Sep 12 00:34:04.713291 kernel: pci 0000:00:16.2: PME# supported 
from D0 D3hot D3cold Sep 12 00:34:04.713348 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 00:34:04.713408 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Sep 12 00:34:04.713470 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Sep 12 00:34:04.713521 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Sep 12 00:34:04.713571 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold Sep 12 00:34:04.713629 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 00:34:04.713680 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Sep 12 00:34:04.713730 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Sep 12 00:34:04.713779 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Sep 12 00:34:04.713831 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold Sep 12 00:34:04.713885 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 00:34:04.713937 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Sep 12 00:34:04.713986 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Sep 12 00:34:04.714035 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Sep 12 00:34:04.715107 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold Sep 12 00:34:04.715176 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 00:34:04.715236 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Sep 12 00:34:04.715288 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Sep 12 00:34:04.715339 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Sep 12 00:34:04.715390 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold Sep 12 00:34:04.715445 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 00:34:04.715496 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Sep 12 
00:34:04.715547 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Sep 12 00:34:04.715604 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Sep 12 00:34:04.715654 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold Sep 12 00:34:04.715711 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 00:34:04.715762 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Sep 12 00:34:04.715812 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Sep 12 00:34:04.715863 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Sep 12 00:34:04.715914 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Sep 12 00:34:04.715966 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold Sep 12 00:34:04.716022 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 00:34:04.716074 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Sep 12 00:34:04.716144 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Sep 12 00:34:04.716198 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Sep 12 00:34:04.716248 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Sep 12 00:34:04.716299 kernel: pci 0000:00:17.1: PME# supported from D0 D3hot D3cold Sep 12 00:34:04.716354 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 00:34:04.716406 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Sep 12 00:34:04.716456 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Sep 12 00:34:04.716507 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Sep 12 00:34:04.716560 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Sep 12 00:34:04.716616 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold Sep 12 00:34:04.716672 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 00:34:04.716723 kernel: pci 
0000:00:17.3: PCI bridge to [bus 16] Sep 12 00:34:04.716773 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Sep 12 00:34:04.716823 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Sep 12 00:34:04.716873 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold Sep 12 00:34:04.716933 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 00:34:04.716990 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Sep 12 00:34:04.717044 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Sep 12 00:34:04.717746 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Sep 12 00:34:04.717809 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold Sep 12 00:34:04.717874 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 00:34:04.717928 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Sep 12 00:34:04.717982 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Sep 12 00:34:04.718033 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Sep 12 00:34:04.718108 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold Sep 12 00:34:04.718169 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 00:34:04.718222 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Sep 12 00:34:04.718272 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Sep 12 00:34:04.718324 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Sep 12 00:34:04.718378 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold Sep 12 00:34:04.718436 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 00:34:04.718488 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Sep 12 00:34:04.718539 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Sep 12 00:34:04.718597 kernel: pci 0000:00:17.7: bridge window [mem 
0xe5e00000-0xe5efffff 64bit pref] Sep 12 00:34:04.718648 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold Sep 12 00:34:04.718703 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 00:34:04.718757 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Sep 12 00:34:04.718809 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Sep 12 00:34:04.718860 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Sep 12 00:34:04.718911 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Sep 12 00:34:04.718961 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold Sep 12 00:34:04.719020 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 00:34:04.719073 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Sep 12 00:34:04.719517 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Sep 12 00:34:04.719571 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Sep 12 00:34:04.719623 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Sep 12 00:34:04.719720 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold Sep 12 00:34:04.719779 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 00:34:04.719831 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Sep 12 00:34:04.719882 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Sep 12 00:34:04.719935 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Sep 12 00:34:04.719985 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold Sep 12 00:34:04.720040 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 00:34:04.720114 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Sep 12 00:34:04.720169 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Sep 12 00:34:04.720221 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Sep 
12 00:34:04.720271 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold Sep 12 00:34:04.721069 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 00:34:04.721154 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Sep 12 00:34:04.721219 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Sep 12 00:34:04.721286 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Sep 12 00:34:04.721354 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold Sep 12 00:34:04.721448 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 00:34:04.721510 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Sep 12 00:34:04.721562 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Sep 12 00:34:04.721623 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Sep 12 00:34:04.721673 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold Sep 12 00:34:04.721729 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 00:34:04.721781 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Sep 12 00:34:04.721832 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Sep 12 00:34:04.721881 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Sep 12 00:34:04.721949 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold Sep 12 00:34:04.722011 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Sep 12 00:34:04.722758 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Sep 12 00:34:04.723395 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Sep 12 00:34:04.723456 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Sep 12 00:34:04.723542 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold Sep 12 00:34:04.723602 kernel: pci_bus 0000:01: extended config space not accessible Sep 12 00:34:04.723668 kernel: pci 
0000:00:01.0: PCI bridge to [bus 01] Sep 12 00:34:04.723732 kernel: pci_bus 0000:02: extended config space not accessible Sep 12 00:34:04.723746 kernel: acpiphp: Slot [32] registered Sep 12 00:34:04.723752 kernel: acpiphp: Slot [33] registered Sep 12 00:34:04.723758 kernel: acpiphp: Slot [34] registered Sep 12 00:34:04.723764 kernel: acpiphp: Slot [35] registered Sep 12 00:34:04.723770 kernel: acpiphp: Slot [36] registered Sep 12 00:34:04.723775 kernel: acpiphp: Slot [37] registered Sep 12 00:34:04.723782 kernel: acpiphp: Slot [38] registered Sep 12 00:34:04.723787 kernel: acpiphp: Slot [39] registered Sep 12 00:34:04.723795 kernel: acpiphp: Slot [40] registered Sep 12 00:34:04.723801 kernel: acpiphp: Slot [41] registered Sep 12 00:34:04.723807 kernel: acpiphp: Slot [42] registered Sep 12 00:34:04.723813 kernel: acpiphp: Slot [43] registered Sep 12 00:34:04.723819 kernel: acpiphp: Slot [44] registered Sep 12 00:34:04.723825 kernel: acpiphp: Slot [45] registered Sep 12 00:34:04.723831 kernel: acpiphp: Slot [46] registered Sep 12 00:34:04.723837 kernel: acpiphp: Slot [47] registered Sep 12 00:34:04.723843 kernel: acpiphp: Slot [48] registered Sep 12 00:34:04.723849 kernel: acpiphp: Slot [49] registered Sep 12 00:34:04.723856 kernel: acpiphp: Slot [50] registered Sep 12 00:34:04.723862 kernel: acpiphp: Slot [51] registered Sep 12 00:34:04.723868 kernel: acpiphp: Slot [52] registered Sep 12 00:34:04.723873 kernel: acpiphp: Slot [53] registered Sep 12 00:34:04.723879 kernel: acpiphp: Slot [54] registered Sep 12 00:34:04.723885 kernel: acpiphp: Slot [55] registered Sep 12 00:34:04.723891 kernel: acpiphp: Slot [56] registered Sep 12 00:34:04.723897 kernel: acpiphp: Slot [57] registered Sep 12 00:34:04.723903 kernel: acpiphp: Slot [58] registered Sep 12 00:34:04.723910 kernel: acpiphp: Slot [59] registered Sep 12 00:34:04.723916 kernel: acpiphp: Slot [60] registered Sep 12 00:34:04.723922 kernel: acpiphp: Slot [61] registered Sep 12 00:34:04.723928 kernel: acpiphp: Slot 
[62] registered Sep 12 00:34:04.723933 kernel: acpiphp: Slot [63] registered Sep 12 00:34:04.723986 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Sep 12 00:34:04.724037 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode) Sep 12 00:34:04.725123 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode) Sep 12 00:34:04.725187 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode) Sep 12 00:34:04.725247 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode) Sep 12 00:34:04.725303 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode) Sep 12 00:34:04.725369 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 PCIe Endpoint Sep 12 00:34:04.725436 kernel: pci 0000:03:00.0: BAR 0 [io 0x4000-0x4007] Sep 12 00:34:04.725489 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfd5f8000-0xfd5fffff 64bit] Sep 12 00:34:04.725542 kernel: pci 0000:03:00.0: ROM [mem 0x00000000-0x0000ffff pref] Sep 12 00:34:04.725599 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Sep 12 00:34:04.725654 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Sep 12 00:34:04.725707 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Sep 12 00:34:04.725760 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Sep 12 00:34:04.725820 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Sep 12 00:34:04.725874 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Sep 12 00:34:04.725937 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Sep 12 00:34:04.726003 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Sep 12 00:34:04.726060 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Sep 12 00:34:04.726167 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Sep 12 00:34:04.726227 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 PCIe Endpoint Sep 12 00:34:04.726280 kernel: pci 0000:0b:00.0: BAR 0 [mem 0xfd4fc000-0xfd4fcfff] Sep 12 00:34:04.726331 kernel: pci 0000:0b:00.0: BAR 1 [mem 0xfd4fd000-0xfd4fdfff] Sep 12 00:34:04.726383 kernel: pci 0000:0b:00.0: BAR 2 [mem 0xfd4fe000-0xfd4fffff] Sep 12 00:34:04.726433 kernel: pci 0000:0b:00.0: BAR 3 [io 0x5000-0x500f] Sep 12 00:34:04.726487 kernel: pci 0000:0b:00.0: ROM [mem 0x00000000-0x0000ffff pref] Sep 12 00:34:04.726538 kernel: pci 0000:0b:00.0: supports D1 D2 Sep 12 00:34:04.726589 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold Sep 12 00:34:04.726640 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Sep 12 00:34:04.726691 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Sep 12 00:34:04.726744 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Sep 12 00:34:04.726797 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Sep 12 00:34:04.726849 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Sep 12 00:34:04.726904 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Sep 12 00:34:04.726956 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Sep 12 00:34:04.727009 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Sep 12 00:34:04.727061 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Sep 12 00:34:04.727123 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Sep 12 00:34:04.727176 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Sep 12 00:34:04.727229 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Sep 12 00:34:04.727282 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Sep 12 00:34:04.727337 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Sep 12 00:34:04.727390 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Sep 12 00:34:04.727444 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Sep 12 00:34:04.727498 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Sep 12 00:34:04.727551 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Sep 12 00:34:04.727604 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Sep 12 00:34:04.727657 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Sep 12 00:34:04.727712 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Sep 12 00:34:04.727763 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Sep 12 00:34:04.727817 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Sep 12 00:34:04.727869 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Sep 12 00:34:04.727921 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Sep 12 00:34:04.727930 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9 Sep 12 00:34:04.727936 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0 Sep 12 00:34:04.727944 kernel: ACPI: PCI: Interrupt link LNKB 
disabled Sep 12 00:34:04.727951 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Sep 12 00:34:04.727957 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10 Sep 12 00:34:04.727963 kernel: iommu: Default domain type: Translated Sep 12 00:34:04.727968 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Sep 12 00:34:04.727974 kernel: PCI: Using ACPI for IRQ routing Sep 12 00:34:04.727980 kernel: PCI: pci_cache_line_size set to 64 bytes Sep 12 00:34:04.727986 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff] Sep 12 00:34:04.727992 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff] Sep 12 00:34:04.728043 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device Sep 12 00:34:04.731144 kernel: pci 0000:00:0f.0: vgaarb: bridge control possible Sep 12 00:34:04.731227 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Sep 12 00:34:04.731237 kernel: vgaarb: loaded Sep 12 00:34:04.731244 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 Sep 12 00:34:04.731250 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter Sep 12 00:34:04.731256 kernel: clocksource: Switched to clocksource tsc-early Sep 12 00:34:04.731263 kernel: VFS: Disk quotas dquot_6.6.0 Sep 12 00:34:04.731269 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Sep 12 00:34:04.731278 kernel: pnp: PnP ACPI init Sep 12 00:34:04.731336 kernel: system 00:00: [io 0x1000-0x103f] has been reserved Sep 12 00:34:04.731387 kernel: system 00:00: [io 0x1040-0x104f] has been reserved Sep 12 00:34:04.731432 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved Sep 12 00:34:04.731480 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved Sep 12 00:34:04.731530 kernel: pnp 00:06: [dma 2] Sep 12 00:34:04.731587 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved Sep 12 00:34:04.731636 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved Sep 12 
00:34:04.731681 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved Sep 12 00:34:04.731689 kernel: pnp: PnP ACPI: found 8 devices Sep 12 00:34:04.731696 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Sep 12 00:34:04.731702 kernel: NET: Registered PF_INET protocol family Sep 12 00:34:04.731708 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Sep 12 00:34:04.731714 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Sep 12 00:34:04.731722 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Sep 12 00:34:04.731728 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Sep 12 00:34:04.731734 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Sep 12 00:34:04.731741 kernel: TCP: Hash tables configured (established 16384 bind 16384) Sep 12 00:34:04.731747 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Sep 12 00:34:04.731753 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Sep 12 00:34:04.731759 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Sep 12 00:34:04.731765 kernel: NET: Registered PF_XDP protocol family Sep 12 00:34:04.731819 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 Sep 12 00:34:04.731876 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Sep 12 00:34:04.731929 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Sep 12 00:34:04.731983 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Sep 12 00:34:04.732035 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Sep 12 00:34:04.732097 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Sep 12 00:34:04.732153 
kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Sep 12 00:34:04.732205 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Sep 12 00:34:04.732258 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Sep 12 00:34:04.732313 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Sep 12 00:34:04.732366 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Sep 12 00:34:04.732419 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Sep 12 00:34:04.732472 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Sep 12 00:34:04.732525 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Sep 12 00:34:04.732586 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Sep 12 00:34:04.732644 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Sep 12 00:34:04.732696 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Sep 12 00:34:04.732751 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Sep 12 00:34:04.732804 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Sep 12 00:34:04.732856 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Sep 12 00:34:04.732909 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Sep 12 00:34:04.732961 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Sep 12 00:34:04.733012 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000 Sep 12 00:34:04.733064 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref]: assigned Sep 12 00:34:04.733636 kernel: pci 
0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref]: assigned Sep 12 00:34:04.733695 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.733748 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.733802 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.733852 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.733903 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.733953 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.734004 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.734058 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.734130 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.734182 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.734233 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.734284 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.734335 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.734385 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.734437 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.734490 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.734542 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.734592 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.734644 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space 
Sep 12 00:34:04.734695 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.734747 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.734797 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.734849 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.734902 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.734953 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.735002 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.735055 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.735662 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.735722 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.735775 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.735836 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.735904 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.735969 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.736023 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.736075 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.736186 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.736240 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.736292 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.736347 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: 
can't assign; no space Sep 12 00:34:04.736398 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.736449 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.736501 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.736550 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.736606 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.736657 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.736708 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.736759 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.736813 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.736864 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.736914 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.736963 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.737013 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.737064 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.737122 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.737174 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.737225 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.737279 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.737330 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.737382 kernel: pci 0000:00:17.5: bridge 
window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.737448 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.737502 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.737552 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.737606 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.737656 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.737708 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.737759 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.737815 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.737865 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.737916 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.737967 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.738019 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.738069 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.738144 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.738196 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.738252 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.738304 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.738357 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.738419 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.738491 kernel: pci 
0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.738545 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.738603 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.738665 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.738719 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space Sep 12 00:34:04.738770 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign Sep 12 00:34:04.738823 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Sep 12 00:34:04.738875 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] Sep 12 00:34:04.738925 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Sep 12 00:34:04.738974 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Sep 12 00:34:04.739024 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Sep 12 00:34:04.739092 kernel: pci 0000:03:00.0: ROM [mem 0xfd500000-0xfd50ffff pref]: assigned Sep 12 00:34:04.739155 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Sep 12 00:34:04.739205 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Sep 12 00:34:04.739256 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Sep 12 00:34:04.739306 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref] Sep 12 00:34:04.739358 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Sep 12 00:34:04.739408 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Sep 12 00:34:04.739459 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Sep 12 00:34:04.739509 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Sep 12 00:34:04.739560 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Sep 12 00:34:04.739615 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Sep 12 00:34:04.739665 kernel: pci 0000:00:15.2: bridge window 
[mem 0xfcd00000-0xfcdfffff] Sep 12 00:34:04.739714 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Sep 12 00:34:04.739765 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Sep 12 00:34:04.739815 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Sep 12 00:34:04.739866 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Sep 12 00:34:04.739917 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Sep 12 00:34:04.739967 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Sep 12 00:34:04.740018 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Sep 12 00:34:04.740072 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Sep 12 00:34:04.740144 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Sep 12 00:34:04.740195 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Sep 12 00:34:04.740248 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Sep 12 00:34:04.740300 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Sep 12 00:34:04.740350 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Sep 12 00:34:04.740405 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Sep 12 00:34:04.740455 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Sep 12 00:34:04.740505 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Sep 12 00:34:04.740564 kernel: pci 0000:0b:00.0: ROM [mem 0xfd400000-0xfd40ffff pref]: assigned Sep 12 00:34:04.740622 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Sep 12 00:34:04.740673 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Sep 12 00:34:04.740724 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Sep 12 00:34:04.740774 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref] Sep 12 00:34:04.740826 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Sep 12 00:34:04.740881 
kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Sep 12 00:34:04.740931 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Sep 12 00:34:04.740982 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Sep 12 00:34:04.741036 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Sep 12 00:34:04.741104 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Sep 12 00:34:04.741158 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Sep 12 00:34:04.741222 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Sep 12 00:34:04.741279 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Sep 12 00:34:04.741338 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Sep 12 00:34:04.741389 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Sep 12 00:34:04.741445 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Sep 12 00:34:04.741495 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Sep 12 00:34:04.741545 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Sep 12 00:34:04.741602 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Sep 12 00:34:04.741652 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Sep 12 00:34:04.741702 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Sep 12 00:34:04.741757 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Sep 12 00:34:04.741807 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Sep 12 00:34:04.741857 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Sep 12 00:34:04.741909 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Sep 12 00:34:04.741958 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Sep 12 00:34:04.742008 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Sep 12 00:34:04.742059 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Sep 
12 00:34:04.742134 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Sep 12 00:34:04.742189 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Sep 12 00:34:04.742240 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Sep 12 00:34:04.742291 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Sep 12 00:34:04.742342 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Sep 12 00:34:04.742396 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Sep 12 00:34:04.742456 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Sep 12 00:34:04.742508 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Sep 12 00:34:04.742558 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Sep 12 00:34:04.742621 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Sep 12 00:34:04.742671 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Sep 12 00:34:04.742726 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Sep 12 00:34:04.742776 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Sep 12 00:34:04.742827 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Sep 12 00:34:04.742879 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Sep 12 00:34:04.742930 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Sep 12 00:34:04.742979 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Sep 12 00:34:04.743030 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Sep 12 00:34:04.743091 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Sep 12 00:34:04.743142 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Sep 12 00:34:04.743203 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Sep 12 00:34:04.743269 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Sep 12 00:34:04.743329 kernel: pci 0000:00:17.6: bridge window [mem 
0xe6200000-0xe62fffff 64bit pref] Sep 12 00:34:04.743391 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Sep 12 00:34:04.743460 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Sep 12 00:34:04.743530 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Sep 12 00:34:04.743602 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Sep 12 00:34:04.743671 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Sep 12 00:34:04.743731 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Sep 12 00:34:04.743782 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Sep 12 00:34:04.743834 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Sep 12 00:34:04.743885 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Sep 12 00:34:04.743935 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Sep 12 00:34:04.743985 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Sep 12 00:34:04.744037 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Sep 12 00:34:04.744107 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Sep 12 00:34:04.744160 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Sep 12 00:34:04.744213 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Sep 12 00:34:04.744263 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Sep 12 00:34:04.744313 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Sep 12 00:34:04.744366 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Sep 12 00:34:04.744417 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Sep 12 00:34:04.744469 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Sep 12 00:34:04.744523 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Sep 12 00:34:04.744572 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Sep 12 00:34:04.744633 kernel: pci 0000:00:18.5: 
bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Sep 12 00:34:04.744689 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Sep 12 00:34:04.744739 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Sep 12 00:34:04.744799 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Sep 12 00:34:04.744855 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Sep 12 00:34:04.744906 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Sep 12 00:34:04.744956 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Sep 12 00:34:04.745007 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] Sep 12 00:34:04.745052 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] Sep 12 00:34:04.745106 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] Sep 12 00:34:04.745153 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] Sep 12 00:34:04.745197 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] Sep 12 00:34:04.745247 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] Sep 12 00:34:04.745293 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] Sep 12 00:34:04.745339 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] Sep 12 00:34:04.745384 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] Sep 12 00:34:04.745430 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] Sep 12 00:34:04.745479 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] Sep 12 00:34:04.745526 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] Sep 12 00:34:04.745579 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] Sep 12 00:34:04.745633 kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff] Sep 12 00:34:04.745681 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] Sep 12 00:34:04.745726 kernel: pci_bus 0000:03: resource 2 [mem 
0xc0000000-0xc01fffff 64bit pref] Sep 12 00:34:04.745777 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] Sep 12 00:34:04.745825 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] Sep 12 00:34:04.745870 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] Sep 12 00:34:04.745923 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] Sep 12 00:34:04.745969 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] Sep 12 00:34:04.746014 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] Sep 12 00:34:04.746065 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] Sep 12 00:34:04.746128 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] Sep 12 00:34:04.746182 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] Sep 12 00:34:04.746232 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] Sep 12 00:34:04.746282 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] Sep 12 00:34:04.746328 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] Sep 12 00:34:04.746377 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] Sep 12 00:34:04.746436 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] Sep 12 00:34:04.746487 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] Sep 12 00:34:04.746536 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] Sep 12 00:34:04.746585 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] Sep 12 00:34:04.746630 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] Sep 12 00:34:04.746675 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] Sep 12 00:34:04.746727 kernel: pci_bus 0000:0c: resource 0 [io 0x9000-0x9fff] Sep 12 00:34:04.746772 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] Sep 12 00:34:04.746819 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 
64bit pref] Sep 12 00:34:04.746868 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] Sep 12 00:34:04.746914 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] Sep 12 00:34:04.746959 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] Sep 12 00:34:04.747009 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] Sep 12 00:34:04.747055 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] Sep 12 00:34:04.747131 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] Sep 12 00:34:04.747181 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] Sep 12 00:34:04.747234 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] Sep 12 00:34:04.747281 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] Sep 12 00:34:04.747330 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] Sep 12 00:34:04.747376 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] Sep 12 00:34:04.747440 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] Sep 12 00:34:04.747490 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] Sep 12 00:34:04.747540 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] Sep 12 00:34:04.747586 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] Sep 12 00:34:04.747631 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] Sep 12 00:34:04.747681 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] Sep 12 00:34:04.747726 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] Sep 12 00:34:04.747774 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] Sep 12 00:34:04.747823 kernel: pci_bus 0000:15: resource 0 [io 0xe000-0xefff] Sep 12 00:34:04.747868 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] Sep 12 00:34:04.747913 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] Sep 12 
00:34:04.747962 kernel: pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] Sep 12 00:34:04.748008 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] Sep 12 00:34:04.748058 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] Sep 12 00:34:04.748116 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] Sep 12 00:34:04.748166 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] Sep 12 00:34:04.748211 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] Sep 12 00:34:04.748262 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] Sep 12 00:34:04.748308 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] Sep 12 00:34:04.748357 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] Sep 12 00:34:04.748413 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] Sep 12 00:34:04.748463 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Sep 12 00:34:04.748509 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] Sep 12 00:34:04.748554 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] Sep 12 00:34:04.748608 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] Sep 12 00:34:04.748653 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] Sep 12 00:34:04.748698 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] Sep 12 00:34:04.748752 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] Sep 12 00:34:04.748798 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] Sep 12 00:34:04.748847 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] Sep 12 00:34:04.748893 kernel: pci_bus 0000:1e: resource 2 [mem 0xe6d00000-0xe6dfffff 64bit pref] Sep 12 00:34:04.748942 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] Sep 12 00:34:04.748987 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] 
Sep 12 00:34:04.749039 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] Sep 12 00:34:04.751326 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] Sep 12 00:34:04.751398 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] Sep 12 00:34:04.751454 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] Sep 12 00:34:04.751507 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] Sep 12 00:34:04.751553 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] Sep 12 00:34:04.751624 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Sep 12 00:34:04.751634 kernel: PCI: CLS 32 bytes, default 64 Sep 12 00:34:04.751641 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Sep 12 00:34:04.751648 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Sep 12 00:34:04.751654 kernel: clocksource: Switched to clocksource tsc Sep 12 00:34:04.751660 kernel: Initialise system trusted keyrings Sep 12 00:34:04.751666 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Sep 12 00:34:04.751673 kernel: Key type asymmetric registered Sep 12 00:34:04.751679 kernel: Asymmetric key parser 'x509' registered Sep 12 00:34:04.751687 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Sep 12 00:34:04.751693 kernel: io scheduler mq-deadline registered Sep 12 00:34:04.751699 kernel: io scheduler kyber registered Sep 12 00:34:04.751706 kernel: io scheduler bfq registered Sep 12 00:34:04.751758 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 Sep 12 00:34:04.751810 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 00:34:04.751861 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 Sep 12 00:34:04.751914 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 
AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 00:34:04.751965 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 Sep 12 00:34:04.752016 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 00:34:04.752068 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 Sep 12 00:34:04.752127 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 00:34:04.752179 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 Sep 12 00:34:04.752230 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 00:34:04.752282 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 Sep 12 00:34:04.752335 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 00:34:04.752391 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 Sep 12 00:34:04.752444 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 00:34:04.752495 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 Sep 12 00:34:04.752546 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 00:34:04.752614 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 Sep 12 00:34:04.752666 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 00:34:04.752721 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 Sep 12 00:34:04.752771 kernel: pcieport 
0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 00:34:04.752823 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 Sep 12 00:34:04.752873 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 00:34:04.752926 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 Sep 12 00:34:04.752985 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 00:34:04.753039 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 Sep 12 00:34:04.754147 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 00:34:04.754210 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 Sep 12 00:34:04.754264 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 00:34:04.754319 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 Sep 12 00:34:04.754373 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 00:34:04.754436 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 Sep 12 00:34:04.754487 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 00:34:04.754540 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 Sep 12 00:34:04.754594 kernel: pcieport 0000:00:17.0: pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 00:34:04.754646 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 Sep 12 
00:34:04.754697 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 00:34:04.754749 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42 Sep 12 00:34:04.754800 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 00:34:04.754852 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43 Sep 12 00:34:04.754903 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 00:34:04.754957 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44 Sep 12 00:34:04.755008 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 00:34:04.755061 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45 Sep 12 00:34:04.756596 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 00:34:04.756656 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46 Sep 12 00:34:04.756711 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 00:34:04.756764 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47 Sep 12 00:34:04.756816 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 00:34:04.756871 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48 Sep 12 00:34:04.756922 kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 00:34:04.756974 kernel: pcieport 0000:00:18.1: PME: 
Signaling with IRQ 49 Sep 12 00:34:04.757024 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 00:34:04.757077 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50 Sep 12 00:34:04.757161 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 00:34:04.757220 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51 Sep 12 00:34:04.757276 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 00:34:04.757328 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52 Sep 12 00:34:04.757379 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 00:34:04.757430 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53 Sep 12 00:34:04.757483 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 00:34:04.757534 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54 Sep 12 00:34:04.757585 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 00:34:04.757636 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55 Sep 12 00:34:04.757690 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Sep 12 00:34:04.757701 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Sep 12 00:34:04.757708 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 12 00:34:04.757714 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Sep 12 
00:34:04.757721 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 Sep 12 00:34:04.757727 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Sep 12 00:34:04.757734 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Sep 12 00:34:04.757786 kernel: rtc_cmos 00:01: registered as rtc0 Sep 12 00:34:04.757834 kernel: rtc_cmos 00:01: setting system clock to 2025-09-12T00:34:04 UTC (1757637244) Sep 12 00:34:04.757878 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram Sep 12 00:34:04.757887 kernel: intel_pstate: CPU model not supported Sep 12 00:34:04.757894 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1 Sep 12 00:34:04.757900 kernel: NET: Registered PF_INET6 protocol family Sep 12 00:34:04.757907 kernel: Segment Routing with IPv6 Sep 12 00:34:04.757913 kernel: In-situ OAM (IOAM) with IPv6 Sep 12 00:34:04.757922 kernel: NET: Registered PF_PACKET protocol family Sep 12 00:34:04.757929 kernel: Key type dns_resolver registered Sep 12 00:34:04.757935 kernel: IPI shorthand broadcast: enabled Sep 12 00:34:04.757941 kernel: sched_clock: Marking stable (2760003402, 169767403)->(2942954814, -13184009) Sep 12 00:34:04.757948 kernel: registered taskstats version 1 Sep 12 00:34:04.757954 kernel: Loading compiled-in X.509 certificates Sep 12 00:34:04.757961 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.46-flatcar: 652e453facea91af3a07ba1d2bcc346a615f1cf9' Sep 12 00:34:04.757967 kernel: Demotion targets for Node 0: null Sep 12 00:34:04.757973 kernel: Key type .fscrypt registered Sep 12 00:34:04.757981 kernel: Key type fscrypt-provisioning registered Sep 12 00:34:04.757987 kernel: ima: No TPM chip found, activating TPM-bypass! 
Sep 12 00:34:04.757994 kernel: ima: Allocated hash algorithm: sha1 Sep 12 00:34:04.758000 kernel: ima: No architecture policies found Sep 12 00:34:04.758006 kernel: clk: Disabling unused clocks Sep 12 00:34:04.758013 kernel: Warning: unable to open an initial console. Sep 12 00:34:04.758019 kernel: Freeing unused kernel image (initmem) memory: 53836K Sep 12 00:34:04.758025 kernel: Write protecting the kernel read-only data: 24576k Sep 12 00:34:04.758032 kernel: Freeing unused kernel image (rodata/data gap) memory: 280K Sep 12 00:34:04.758039 kernel: Run /init as init process Sep 12 00:34:04.758046 kernel: with arguments: Sep 12 00:34:04.758052 kernel: /init Sep 12 00:34:04.758058 kernel: with environment: Sep 12 00:34:04.758064 kernel: HOME=/ Sep 12 00:34:04.758070 kernel: TERM=linux Sep 12 00:34:04.758076 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 12 00:34:04.758100 systemd[1]: Successfully made /usr/ read-only. Sep 12 00:34:04.758110 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 12 00:34:04.758119 systemd[1]: Detected virtualization vmware. Sep 12 00:34:04.758125 systemd[1]: Detected architecture x86-64. Sep 12 00:34:04.758131 systemd[1]: Running in initrd. Sep 12 00:34:04.758137 systemd[1]: No hostname configured, using default hostname. Sep 12 00:34:04.758144 systemd[1]: Hostname set to . Sep 12 00:34:04.758151 systemd[1]: Initializing machine ID from random generator. Sep 12 00:34:04.758157 systemd[1]: Queued start job for default target initrd.target. Sep 12 00:34:04.758165 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. 
Sep 12 00:34:04.758171 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 00:34:04.758178 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Sep 12 00:34:04.758185 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 12 00:34:04.758192 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 12 00:34:04.758199 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 12 00:34:04.758206 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 12 00:34:04.758214 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 12 00:34:04.758221 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 00:34:04.758228 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 12 00:34:04.758234 systemd[1]: Reached target paths.target - Path Units. Sep 12 00:34:04.758241 systemd[1]: Reached target slices.target - Slice Units. Sep 12 00:34:04.758248 systemd[1]: Reached target swap.target - Swaps. Sep 12 00:34:04.758254 systemd[1]: Reached target timers.target - Timer Units. Sep 12 00:34:04.758261 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 12 00:34:04.758268 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 12 00:34:04.758275 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 12 00:34:04.758282 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Sep 12 00:34:04.758288 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. 
Sep 12 00:34:04.758294 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 12 00:34:04.758301 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 00:34:04.758308 systemd[1]: Reached target sockets.target - Socket Units. Sep 12 00:34:04.758314 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 12 00:34:04.758320 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 12 00:34:04.758328 systemd[1]: Finished network-cleanup.service - Network Cleanup. Sep 12 00:34:04.758334 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 12 00:34:04.758341 systemd[1]: Starting systemd-fsck-usr.service... Sep 12 00:34:04.758347 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 12 00:34:04.758354 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 12 00:34:04.758360 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 00:34:04.758366 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 12 00:34:04.758390 systemd-journald[244]: Collecting audit messages is disabled. Sep 12 00:34:04.758415 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 00:34:04.758422 systemd[1]: Finished systemd-fsck-usr.service. Sep 12 00:34:04.758429 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 12 00:34:04.758436 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 12 00:34:04.758442 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. 
Sep 12 00:34:04.758449 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 12 00:34:04.758455 kernel: Bridge firewalling registered Sep 12 00:34:04.758462 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 00:34:04.758470 systemd-journald[244]: Journal started Sep 12 00:34:04.758485 systemd-journald[244]: Runtime Journal (/run/log/journal/6749a43a74f742d997106dde511b5993) is 4.8M, max 38.9M, 34M free. Sep 12 00:34:04.723719 systemd-modules-load[245]: Inserted module 'overlay' Sep 12 00:34:04.757192 systemd-modules-load[245]: Inserted module 'br_netfilter' Sep 12 00:34:04.762307 systemd[1]: Started systemd-journald.service - Journal Service. Sep 12 00:34:04.762244 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 12 00:34:04.762505 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 00:34:04.764696 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 12 00:34:04.766156 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 12 00:34:04.767253 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 12 00:34:04.776238 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 12 00:34:04.776785 systemd-tmpfiles[267]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 12 00:34:04.780021 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 00:34:04.781391 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 12 00:34:04.786922 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 12 00:34:04.791731 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... 
Sep 12 00:34:04.800897 dracut-cmdline[286]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=04dc7b04a37dcb996cc4d6074b142179401d5685abf61ddcbaff7d77d0988990 Sep 12 00:34:04.819250 systemd-resolved[277]: Positive Trust Anchors: Sep 12 00:34:04.819260 systemd-resolved[277]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 12 00:34:04.819283 systemd-resolved[277]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 12 00:34:04.822467 systemd-resolved[277]: Defaulting to hostname 'linux'. Sep 12 00:34:04.823347 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 12 00:34:04.823503 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 12 00:34:04.856104 kernel: SCSI subsystem initialized Sep 12 00:34:04.873094 kernel: Loading iSCSI transport class v2.0-870. 
Sep 12 00:34:04.881094 kernel: iscsi: registered transport (tcp) Sep 12 00:34:04.903224 kernel: iscsi: registered transport (qla4xxx) Sep 12 00:34:04.903257 kernel: QLogic iSCSI HBA Driver Sep 12 00:34:04.913382 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 12 00:34:04.926062 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 00:34:04.927250 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 12 00:34:04.949184 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 12 00:34:04.951154 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 12 00:34:04.990104 kernel: raid6: avx2x4 gen() 47198 MB/s Sep 12 00:34:05.007109 kernel: raid6: avx2x2 gen() 52359 MB/s Sep 12 00:34:05.024326 kernel: raid6: avx2x1 gen() 43501 MB/s Sep 12 00:34:05.024370 kernel: raid6: using algorithm avx2x2 gen() 52359 MB/s Sep 12 00:34:05.042333 kernel: raid6: .... xor() 31359 MB/s, rmw enabled Sep 12 00:34:05.042381 kernel: raid6: using avx2x2 recovery algorithm Sep 12 00:34:05.059098 kernel: xor: automatically using best checksumming function avx Sep 12 00:34:05.165100 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 12 00:34:05.168895 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 12 00:34:05.169889 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 00:34:05.192637 systemd-udevd[494]: Using default interface naming scheme 'v255'. Sep 12 00:34:05.196364 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 00:34:05.197429 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 12 00:34:05.209718 dracut-pre-trigger[498]: rd.md=0: removing MD RAID activation Sep 12 00:34:05.223189 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. 
Sep 12 00:34:05.223964 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 12 00:34:05.299386 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 00:34:05.301390 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 12 00:34:05.382098 kernel: VMware PVSCSI driver - version 1.0.7.0-k Sep 12 00:34:05.387560 kernel: vmw_pvscsi: using 64bit dma Sep 12 00:34:05.387631 kernel: vmw_pvscsi: max_id: 16 Sep 12 00:34:05.387647 kernel: vmw_pvscsi: setting ring_pages to 8 Sep 12 00:34:05.393463 kernel: vmw_pvscsi: enabling reqCallThreshold Sep 12 00:34:05.393517 kernel: vmw_pvscsi: driver-based request coalescing enabled Sep 12 00:34:05.393541 kernel: vmw_pvscsi: using MSI-X Sep 12 00:34:05.396092 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254 Sep 12 00:34:05.406101 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0 Sep 12 00:34:05.408106 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6 Sep 12 00:34:05.425171 kernel: VMware vmxnet3 virtual NIC driver - version 1.9.0.0-k-NAPI Sep 12 00:34:05.426480 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 00:34:05.427146 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2 Sep 12 00:34:05.426572 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 00:34:05.427302 (udev-worker)[545]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte. Sep 12 00:34:05.427905 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 00:34:05.430370 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 00:34:05.435097 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps Sep 12 00:34:05.439096 kernel: cryptd: max_cpu_qlen set to 1000 Sep 12 00:34:05.440092 kernel: libata version 3.00 loaded. 
Sep 12 00:34:05.444135 kernel: ata_piix 0000:00:07.1: version 2.13 Sep 12 00:34:05.448099 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input3 Sep 12 00:34:05.450154 kernel: scsi host1: ata_piix Sep 12 00:34:05.450258 kernel: scsi host2: ata_piix Sep 12 00:34:05.453587 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14 lpm-pol 0 Sep 12 00:34:05.453632 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0 Sep 12 00:34:05.453769 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15 lpm-pol 0 Sep 12 00:34:05.462100 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB) Sep 12 00:34:05.464094 kernel: sd 0:0:0:0: [sda] Write Protect is off Sep 12 00:34:05.464206 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00 Sep 12 00:34:05.465102 kernel: sd 0:0:0:0: [sda] Cache data unavailable Sep 12 00:34:05.465185 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through Sep 12 00:34:05.472110 kernel: AES CTR mode by8 optimization enabled Sep 12 00:34:05.471521 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 00:34:05.541109 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 12 00:34:05.541144 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Sep 12 00:34:05.620121 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33 Sep 12 00:34:05.625094 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5 Sep 12 00:34:05.655141 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray Sep 12 00:34:05.655310 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Sep 12 00:34:05.673100 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Sep 12 00:34:05.774043 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM. Sep 12 00:34:05.780248 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT. 
Sep 12 00:34:05.787002 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Sep 12 00:34:05.793177 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A. Sep 12 00:34:05.793395 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A. Sep 12 00:34:05.794321 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 12 00:34:05.843121 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 12 00:34:05.971169 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 12 00:34:05.971759 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 12 00:34:05.972169 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 00:34:05.972300 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 12 00:34:05.973043 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 12 00:34:05.986803 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 12 00:34:06.855106 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 12 00:34:06.855722 disk-uuid[647]: The operation has completed successfully. Sep 12 00:34:06.895062 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 12 00:34:06.895390 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 12 00:34:06.911384 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 12 00:34:06.923893 sh[677]: Success Sep 12 00:34:06.937389 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Sep 12 00:34:06.937427 kernel: device-mapper: uevent: version 1.0.3 Sep 12 00:34:06.938575 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 12 00:34:06.946108 kernel: device-mapper: verity: sha256 using shash "sha256-avx2" Sep 12 00:34:07.007526 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 12 00:34:07.008821 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 12 00:34:07.027536 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 12 00:34:07.040099 kernel: BTRFS: device fsid e375903e-484e-4702-81f7-5fa3109f1a1c devid 1 transid 36 /dev/mapper/usr (254:0) scanned by mount (689) Sep 12 00:34:07.042610 kernel: BTRFS info (device dm-0): first mount of filesystem e375903e-484e-4702-81f7-5fa3109f1a1c Sep 12 00:34:07.042638 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 12 00:34:07.051796 kernel: BTRFS info (device dm-0): enabling ssd optimizations Sep 12 00:34:07.051837 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 12 00:34:07.051845 kernel: BTRFS info (device dm-0): enabling free space tree Sep 12 00:34:07.053470 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 12 00:34:07.053843 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 12 00:34:07.054634 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments... Sep 12 00:34:07.056165 systemd[1]: Starting ignition-setup.service - Ignition (setup)... 
Sep 12 00:34:07.090097 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (712) Sep 12 00:34:07.093677 kernel: BTRFS info (device sda6): first mount of filesystem 8d27ac3b-8168-4900-a00e-1d8b6b830700 Sep 12 00:34:07.093713 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 12 00:34:07.098162 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 12 00:34:07.098200 kernel: BTRFS info (device sda6): enabling free space tree Sep 12 00:34:07.102097 kernel: BTRFS info (device sda6): last unmount of filesystem 8d27ac3b-8168-4900-a00e-1d8b6b830700 Sep 12 00:34:07.105595 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 12 00:34:07.108049 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 12 00:34:07.140484 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Sep 12 00:34:07.141502 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 12 00:34:07.220838 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 12 00:34:07.223516 systemd[1]: Starting systemd-networkd.service - Network Configuration... 
Sep 12 00:34:07.238630 ignition[731]: Ignition 2.21.0 Sep 12 00:34:07.238878 ignition[731]: Stage: fetch-offline Sep 12 00:34:07.238980 ignition[731]: no configs at "/usr/lib/ignition/base.d" Sep 12 00:34:07.238985 ignition[731]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Sep 12 00:34:07.239034 ignition[731]: parsed url from cmdline: "" Sep 12 00:34:07.239036 ignition[731]: no config URL provided Sep 12 00:34:07.239040 ignition[731]: reading system config file "/usr/lib/ignition/user.ign" Sep 12 00:34:07.239043 ignition[731]: no config at "/usr/lib/ignition/user.ign" Sep 12 00:34:07.239422 ignition[731]: config successfully fetched Sep 12 00:34:07.239440 ignition[731]: parsing config with SHA512: 7bdf42b8b42e325f939bb2c1d41cf3ecc84c0f9e51bd5de16c4b29c875a72afe7ab3bc49b748da4b3314d6a43c95d5a4e963760c2c55be4ac737732164d8f2da Sep 12 00:34:07.244621 unknown[731]: fetched base config from "system" Sep 12 00:34:07.244788 unknown[731]: fetched user config from "vmware" Sep 12 00:34:07.245226 ignition[731]: fetch-offline: fetch-offline passed Sep 12 00:34:07.245383 ignition[731]: Ignition finished successfully Sep 12 00:34:07.246941 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 12 00:34:07.247701 systemd-networkd[867]: lo: Link UP Sep 12 00:34:07.247708 systemd-networkd[867]: lo: Gained carrier Sep 12 00:34:07.248680 systemd-networkd[867]: Enumeration completed Sep 12 00:34:07.249040 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 12 00:34:07.249235 systemd[1]: Reached target network.target - Network. Sep 12 00:34:07.249298 systemd-networkd[867]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network. Sep 12 00:34:07.250212 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Sep 12 00:34:07.251218 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... 
Sep 12 00:34:07.253121 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Sep 12 00:34:07.253232 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Sep 12 00:34:07.254540 systemd-networkd[867]: ens192: Link UP Sep 12 00:34:07.254546 systemd-networkd[867]: ens192: Gained carrier Sep 12 00:34:07.268695 ignition[871]: Ignition 2.21.0 Sep 12 00:34:07.268966 ignition[871]: Stage: kargs Sep 12 00:34:07.269155 ignition[871]: no configs at "/usr/lib/ignition/base.d" Sep 12 00:34:07.269271 ignition[871]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Sep 12 00:34:07.270275 ignition[871]: kargs: kargs passed Sep 12 00:34:07.270408 ignition[871]: Ignition finished successfully Sep 12 00:34:07.271726 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 12 00:34:07.272600 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Sep 12 00:34:07.288467 ignition[878]: Ignition 2.21.0 Sep 12 00:34:07.288475 ignition[878]: Stage: disks Sep 12 00:34:07.288565 ignition[878]: no configs at "/usr/lib/ignition/base.d" Sep 12 00:34:07.288571 ignition[878]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Sep 12 00:34:07.289192 ignition[878]: disks: disks passed Sep 12 00:34:07.289225 ignition[878]: Ignition finished successfully Sep 12 00:34:07.290076 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 12 00:34:07.290584 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 12 00:34:07.290853 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 12 00:34:07.291136 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 12 00:34:07.291370 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 00:34:07.291579 systemd[1]: Reached target basic.target - Basic System. Sep 12 00:34:07.292338 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... 
Sep 12 00:34:07.467582 systemd-resolved[277]: Detected conflict on linux IN A 139.178.70.108 Sep 12 00:34:07.467595 systemd-resolved[277]: Hostname conflict, changing published hostname from 'linux' to 'linux4'. Sep 12 00:34:07.536899 systemd-fsck[887]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Sep 12 00:34:07.543589 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 12 00:34:07.544519 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 12 00:34:07.911102 kernel: EXT4-fs (sda9): mounted filesystem c7fbf20f-7fc7-47c1-8781-0f8569841f1e r/w with ordered data mode. Quota mode: none. Sep 12 00:34:07.911197 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 12 00:34:07.911524 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 12 00:34:07.912871 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 12 00:34:07.915121 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 12 00:34:07.915508 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Sep 12 00:34:07.915535 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 12 00:34:07.915550 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 12 00:34:07.918725 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 12 00:34:07.919479 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Sep 12 00:34:07.926485 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (895) Sep 12 00:34:07.928665 kernel: BTRFS info (device sda6): first mount of filesystem 8d27ac3b-8168-4900-a00e-1d8b6b830700 Sep 12 00:34:07.928688 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 12 00:34:07.934096 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 12 00:34:07.934135 kernel: BTRFS info (device sda6): enabling free space tree Sep 12 00:34:07.935409 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 12 00:34:07.952678 initrd-setup-root[919]: cut: /sysroot/etc/passwd: No such file or directory Sep 12 00:34:07.955651 initrd-setup-root[926]: cut: /sysroot/etc/group: No such file or directory Sep 12 00:34:07.957885 initrd-setup-root[933]: cut: /sysroot/etc/shadow: No such file or directory Sep 12 00:34:07.960270 initrd-setup-root[940]: cut: /sysroot/etc/gshadow: No such file or directory Sep 12 00:34:08.022141 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 12 00:34:08.023145 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 12 00:34:08.024190 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 12 00:34:08.034092 kernel: BTRFS info (device sda6): last unmount of filesystem 8d27ac3b-8168-4900-a00e-1d8b6b830700 Sep 12 00:34:08.039183 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 12 00:34:08.049192 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Sep 12 00:34:08.056967 ignition[1007]: INFO : Ignition 2.21.0 Sep 12 00:34:08.056967 ignition[1007]: INFO : Stage: mount Sep 12 00:34:08.057459 ignition[1007]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 00:34:08.057459 ignition[1007]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Sep 12 00:34:08.058208 ignition[1007]: INFO : mount: mount passed Sep 12 00:34:08.058324 ignition[1007]: INFO : Ignition finished successfully Sep 12 00:34:08.059455 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 12 00:34:08.060184 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 12 00:34:08.072997 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 12 00:34:08.094114 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1019) Sep 12 00:34:08.094151 kernel: BTRFS info (device sda6): first mount of filesystem 8d27ac3b-8168-4900-a00e-1d8b6b830700 Sep 12 00:34:08.096095 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 12 00:34:08.100104 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 12 00:34:08.100160 kernel: BTRFS info (device sda6): enabling free space tree Sep 12 00:34:08.101306 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Sep 12 00:34:08.117476 ignition[1036]: INFO : Ignition 2.21.0
Sep 12 00:34:08.117476 ignition[1036]: INFO : Stage: files
Sep 12 00:34:08.117868 ignition[1036]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 12 00:34:08.117868 ignition[1036]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Sep 12 00:34:08.118514 ignition[1036]: DEBUG : files: compiled without relabeling support, skipping
Sep 12 00:34:08.119337 ignition[1036]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Sep 12 00:34:08.119337 ignition[1036]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Sep 12 00:34:08.120995 ignition[1036]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Sep 12 00:34:08.121209 ignition[1036]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Sep 12 00:34:08.121435 unknown[1036]: wrote ssh authorized keys file for user: core
Sep 12 00:34:08.121642 ignition[1036]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Sep 12 00:34:08.123978 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Sep 12 00:34:08.124243 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1
Sep 12 00:34:08.159515 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Sep 12 00:34:08.257105 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Sep 12 00:34:08.257105 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Sep 12 00:34:08.257663 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Sep 12 00:34:08.257663 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Sep 12 00:34:08.257663 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Sep 12 00:34:08.257663 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 12 00:34:08.257663 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 12 00:34:08.257663 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 12 00:34:08.257663 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 12 00:34:08.260246 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Sep 12 00:34:08.260467 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 12 00:34:08.260467 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw"
Sep 12 00:34:08.262849 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw"
Sep 12 00:34:08.263157 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw"
Sep 12 00:34:08.263157 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-x86-64.raw: attempt #1
Sep 12 00:34:08.737673 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Sep 12 00:34:08.912152 systemd-networkd[867]: ens192: Gained IPv6LL
Sep 12 00:34:09.093599 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw"
Sep 12 00:34:09.093599 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Sep 12 00:34:09.095246 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Sep 12 00:34:09.095488 ignition[1036]: INFO : files: op(c): [started] processing unit "prepare-helm.service"
Sep 12 00:34:09.095793 ignition[1036]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 12 00:34:09.096207 ignition[1036]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 12 00:34:09.096207 ignition[1036]: INFO : files: op(c): [finished] processing unit "prepare-helm.service"
Sep 12 00:34:09.096207 ignition[1036]: INFO : files: op(e): [started] processing unit "coreos-metadata.service"
Sep 12 00:34:09.096733 ignition[1036]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Sep 12 00:34:09.096733 ignition[1036]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Sep 12 00:34:09.096733 ignition[1036]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service"
Sep 12 00:34:09.096733 ignition[1036]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service"
Sep 12 00:34:09.128019 ignition[1036]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service"
Sep 12 00:34:09.131230 ignition[1036]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Sep 12 00:34:09.131529 ignition[1036]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service"
Sep 12 00:34:09.131529 ignition[1036]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service"
Sep 12 00:34:09.131529 ignition[1036]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service"
Sep 12 00:34:09.131529 ignition[1036]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 12 00:34:09.133458 ignition[1036]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json"
Sep 12 00:34:09.133458 ignition[1036]: INFO : files: files passed
Sep 12 00:34:09.133458 ignition[1036]: INFO : Ignition finished successfully
Sep 12 00:34:09.134141 systemd[1]: Finished ignition-files.service - Ignition (files).
Sep 12 00:34:09.135248 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Sep 12 00:34:09.136173 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 12 00:34:09.155679 initrd-setup-root-after-ignition[1067]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 12 00:34:09.155679 initrd-setup-root-after-ignition[1067]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Sep 12 00:34:09.156836 initrd-setup-root-after-ignition[1071]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 12 00:34:09.158815 systemd[1]: ignition-quench.service: Deactivated successfully.
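The journal entries above all share one shape: a timestamp, a source with a PID in brackets, an optional Ignition log level, and the message. A minimal sketch of splitting such a line into fields (the regex and field names are my own, not an official journal parser):

```python
import re

# One journal line as it appears above: "Sep 12 00:34:08.119337 ignition[1036]: INFO : <msg>".
# systemd entries omit the "INFO :"/"DEBUG :" level prefix, so it is optional here.
LINE_RE = re.compile(
    r"^(?P<ts>\w{3} \d{2} \d{2}:\d{2}:\d{2}\.\d+) "   # e.g. "Sep 12 00:34:08.119337"
    r"(?P<source>[\w.\\-]+)\[(?P<pid>\d+)\]: "        # e.g. "ignition[1036]: "
    r"(?:(?P<level>INFO|DEBUG|WARN|CRITICAL) : )?"    # Ignition-style level, if present
    r"(?P<msg>.*)$"
)

def parse(line: str) -> dict:
    """Return the named fields of a journal line, or {} if it doesn't match."""
    m = LINE_RE.match(line)
    return m.groupdict() if m else {}

entry = parse('Sep 12 00:34:08.119337 ignition[1036]: INFO : files: ensureUsers: '
              'op(1): [started] creating or modifying user "core"')
print(entry["source"], entry["pid"], entry["level"])  # ignition 1036 INFO
```

This is only a sketch for the line shapes visible in this log; sources with other punctuation (e.g. `(sd-merge)`) would need a wider character class.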
Sep 12 00:34:09.158998 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Sep 12 00:34:09.159450 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 12 00:34:09.160289 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Sep 12 00:34:09.161030 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Sep 12 00:34:09.186649 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 12 00:34:09.186745 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Sep 12 00:34:09.187040 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Sep 12 00:34:09.187179 systemd[1]: Reached target initrd.target - Initrd Default Target.
Sep 12 00:34:09.187385 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Sep 12 00:34:09.187912 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Sep 12 00:34:09.209501 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 12 00:34:09.210678 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Sep 12 00:34:09.223827 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Sep 12 00:34:09.224038 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 12 00:34:09.224272 systemd[1]: Stopped target timers.target - Timer Units.
Sep 12 00:34:09.224469 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 12 00:34:09.224552 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 12 00:34:09.224940 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Sep 12 00:34:09.225156 systemd[1]: Stopped target basic.target - Basic System.
Sep 12 00:34:09.225313 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Sep 12 00:34:09.225516 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 12 00:34:09.225737 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Sep 12 00:34:09.225958 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Sep 12 00:34:09.226179 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Sep 12 00:34:09.226394 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 12 00:34:09.226611 systemd[1]: Stopped target sysinit.target - System Initialization.
Sep 12 00:34:09.226828 systemd[1]: Stopped target local-fs.target - Local File Systems.
Sep 12 00:34:09.227020 systemd[1]: Stopped target swap.target - Swaps.
Sep 12 00:34:09.227240 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 12 00:34:09.227323 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Sep 12 00:34:09.227593 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Sep 12 00:34:09.227830 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 12 00:34:09.228027 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Sep 12 00:34:09.228077 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 12 00:34:09.228241 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 12 00:34:09.228306 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Sep 12 00:34:09.228618 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Sep 12 00:34:09.228684 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 12 00:34:09.228899 systemd[1]: Stopped target paths.target - Path Units.
Sep 12 00:34:09.229043 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 12 00:34:09.229113 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 12 00:34:09.229285 systemd[1]: Stopped target slices.target - Slice Units.
Sep 12 00:34:09.229494 systemd[1]: Stopped target sockets.target - Socket Units.
Sep 12 00:34:09.229666 systemd[1]: iscsid.socket: Deactivated successfully.
Sep 12 00:34:09.229719 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Sep 12 00:34:09.229880 systemd[1]: iscsiuio.socket: Deactivated successfully.
Sep 12 00:34:09.229926 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 12 00:34:09.230150 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Sep 12 00:34:09.230223 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 12 00:34:09.230469 systemd[1]: ignition-files.service: Deactivated successfully.
Sep 12 00:34:09.230532 systemd[1]: Stopped ignition-files.service - Ignition (files).
Sep 12 00:34:09.232263 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Sep 12 00:34:09.232372 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 12 00:34:09.232448 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 12 00:34:09.234253 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Sep 12 00:34:09.234424 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 12 00:34:09.234542 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 12 00:34:09.236326 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 12 00:34:09.236402 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 12 00:34:09.240450 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 12 00:34:09.240721 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Sep 12 00:34:09.248486 ignition[1092]: INFO : Ignition 2.21.0
Sep 12 00:34:09.248486 ignition[1092]: INFO : Stage: umount
Sep 12 00:34:09.250044 ignition[1092]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 12 00:34:09.250044 ignition[1092]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Sep 12 00:34:09.250044 ignition[1092]: INFO : umount: umount passed
Sep 12 00:34:09.250044 ignition[1092]: INFO : Ignition finished successfully
Sep 12 00:34:09.251435 systemd[1]: ignition-mount.service: Deactivated successfully.
Sep 12 00:34:09.251663 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Sep 12 00:34:09.251984 systemd[1]: Stopped target network.target - Network.
Sep 12 00:34:09.252193 systemd[1]: ignition-disks.service: Deactivated successfully.
Sep 12 00:34:09.252256 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Sep 12 00:34:09.252636 systemd[1]: ignition-kargs.service: Deactivated successfully.
Sep 12 00:34:09.252755 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Sep 12 00:34:09.252990 systemd[1]: ignition-setup.service: Deactivated successfully.
Sep 12 00:34:09.253116 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Sep 12 00:34:09.253346 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Sep 12 00:34:09.253466 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Sep 12 00:34:09.254232 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Sep 12 00:34:09.254538 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Sep 12 00:34:09.259176 systemd[1]: systemd-resolved.service: Deactivated successfully.
Sep 12 00:34:09.259408 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Sep 12 00:34:09.260983 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Sep 12 00:34:09.261571 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 12 00:34:09.261641 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 12 00:34:09.263518 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Sep 12 00:34:09.263678 systemd[1]: systemd-networkd.service: Deactivated successfully.
Sep 12 00:34:09.264017 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Sep 12 00:34:09.265204 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Sep 12 00:34:09.265484 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Sep 12 00:34:09.265939 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Sep 12 00:34:09.265965 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Sep 12 00:34:09.267104 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Sep 12 00:34:09.267325 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Sep 12 00:34:09.267477 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 12 00:34:09.267746 systemd[1]: afterburn-network-kargs.service: Deactivated successfully.
Sep 12 00:34:09.267874 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments.
Sep 12 00:34:09.268136 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 12 00:34:09.268160 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Sep 12 00:34:09.268576 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 12 00:34:09.268703 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Sep 12 00:34:09.268959 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 12 00:34:09.271281 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 12 00:34:09.279050 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Sep 12 00:34:09.280836 systemd[1]: network-cleanup.service: Deactivated successfully.
Sep 12 00:34:09.280904 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Sep 12 00:34:09.286496 systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 12 00:34:09.286627 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 12 00:34:09.287252 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 12 00:34:09.287295 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Sep 12 00:34:09.287549 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 12 00:34:09.287572 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 12 00:34:09.287733 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 12 00:34:09.287772 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Sep 12 00:34:09.288060 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 12 00:34:09.288127 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Sep 12 00:34:09.288430 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 12 00:34:09.288465 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 12 00:34:09.289359 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Sep 12 00:34:09.289475 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Sep 12 00:34:09.289508 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Sep 12 00:34:09.289695 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 12 00:34:09.289719 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 12 00:34:09.289966 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 12 00:34:09.289989 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 00:34:09.297642 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 12 00:34:09.297999 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Sep 12 00:34:09.519839 systemd[1]: sysroot-boot.service: Deactivated successfully.
Sep 12 00:34:09.519910 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Sep 12 00:34:09.520210 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Sep 12 00:34:09.520337 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Sep 12 00:34:09.520364 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Sep 12 00:34:09.520983 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Sep 12 00:34:09.538734 systemd[1]: Switching root.
Sep 12 00:34:09.586557 systemd-journald[244]: Journal stopped
Sep 12 00:34:12.052779 systemd-journald[244]: Received SIGTERM from PID 1 (systemd).
Sep 12 00:34:12.052817 kernel: SELinux: policy capability network_peer_controls=1
Sep 12 00:34:12.052826 kernel: SELinux: policy capability open_perms=1
Sep 12 00:34:12.052832 kernel: SELinux: policy capability extended_socket_class=1
Sep 12 00:34:12.052838 kernel: SELinux: policy capability always_check_network=0
Sep 12 00:34:12.052845 kernel: SELinux: policy capability cgroup_seclabel=1
Sep 12 00:34:12.052851 kernel: SELinux: policy capability nnp_nosuid_transition=1
Sep 12 00:34:12.052857 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Sep 12 00:34:12.052862 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Sep 12 00:34:12.052868 kernel: SELinux: policy capability userspace_initial_context=0
Sep 12 00:34:12.052874 kernel: audit: type=1403 audit(1757637250.270:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 12 00:34:12.052881 systemd[1]: Successfully loaded SELinux policy in 98.699ms.
Sep 12 00:34:12.052890 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 7.083ms.
Sep 12 00:34:12.052898 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 12 00:34:12.052905 systemd[1]: Detected virtualization vmware.
Sep 12 00:34:12.052912 systemd[1]: Detected architecture x86-64.
Sep 12 00:34:12.052920 systemd[1]: Detected first boot.
Sep 12 00:34:12.052927 systemd[1]: Initializing machine ID from random generator.
Sep 12 00:34:12.052934 zram_generator::config[1135]: No configuration found.
Sep 12 00:34:12.053035 kernel: vmw_vmci 0000:00:07.7: Using capabilities 0xc
Sep 12 00:34:12.053047 kernel: Guest personality initialized and is active
Sep 12 00:34:12.053054 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
Sep 12 00:34:12.053060 kernel: Initialized host personality
Sep 12 00:34:12.053069 kernel: NET: Registered PF_VSOCK protocol family
Sep 12 00:34:12.053076 systemd[1]: Populated /etc with preset unit settings.
Sep 12 00:34:12.053094 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Sep 12 00:34:12.053102 systemd[1]: COREOS_CUSTOM_PUBLIC_IPV4=$(ip addr show ens192 | grep -v "inet 10." | grep -Po "inet \K[\d.]+")" > ${OUTPUT}"
Sep 12 00:34:12.053109 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Sep 12 00:34:12.053116 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 12 00:34:12.053123 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
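The "systemd 256.8 running in system mode (...)" line above encodes the compile-time feature set as a list of `+`/`-` tokens, `+` for built-in and `-` for compiled out. A small sketch of splitting that list into enabled and disabled sets (the variable names are my own):

```python
# Feature string copied from the "systemd 256.8 running in system mode" log line above.
features = ("+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT "
            "-GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN "
            "+IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK "
            "+PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB "
            "+ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE")

# "+FOO" means the feature was compiled in, "-FOO" means it was not.
enabled = {tok[1:] for tok in features.split() if tok.startswith("+")}
disabled = {tok[1:] for tok in features.split() if tok.startswith("-")}

print("SELINUX" in enabled, "APPARMOR" in disabled)  # True True
```

On this build, for example, SELinux support is compiled in while AppArmor is not, which matches the SELinux policy-load messages in the log.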
Sep 12 00:34:12.053132 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 12 00:34:12.053139 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Sep 12 00:34:12.053146 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Sep 12 00:34:12.053153 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Sep 12 00:34:12.053160 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Sep 12 00:34:12.053167 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Sep 12 00:34:12.053174 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Sep 12 00:34:12.053183 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Sep 12 00:34:12.053190 systemd[1]: Created slice user.slice - User and Session Slice.
Sep 12 00:34:12.053197 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 12 00:34:12.053206 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 12 00:34:12.053213 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 12 00:34:12.053220 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 12 00:34:12.053227 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 12 00:34:12.053235 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 12 00:34:12.053243 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Sep 12 00:34:12.053250 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 12 00:34:12.053257 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 12 00:34:12.053264 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Sep 12 00:34:12.053271 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Sep 12 00:34:12.053278 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Sep 12 00:34:12.053286 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 12 00:34:12.053293 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 12 00:34:12.053301 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 12 00:34:12.053309 systemd[1]: Reached target slices.target - Slice Units.
Sep 12 00:34:12.053316 systemd[1]: Reached target swap.target - Swaps.
Sep 12 00:34:12.053323 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 12 00:34:12.053330 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 12 00:34:12.053339 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Sep 12 00:34:12.053347 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 12 00:34:12.053355 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 12 00:34:12.053362 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 12 00:34:12.053370 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 12 00:34:12.053377 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 12 00:34:12.053384 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 12 00:34:12.053391 systemd[1]: Mounting media.mount - External Media Directory...
Sep 12 00:34:12.053400 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 00:34:12.053407 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 12 00:34:12.053415 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 12 00:34:12.053422 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 12 00:34:12.053429 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 12 00:34:12.053436 systemd[1]: Reached target machines.target - Containers.
Sep 12 00:34:12.053444 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 12 00:34:12.053451 systemd[1]: Starting ignition-delete-config.service - Ignition (delete config)...
Sep 12 00:34:12.053459 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 12 00:34:12.053466 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 12 00:34:12.053473 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 12 00:34:12.053480 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 12 00:34:12.053487 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 12 00:34:12.053495 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 12 00:34:12.053502 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 12 00:34:12.053510 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 12 00:34:12.053518 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 12 00:34:12.053526 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Sep 12 00:34:12.053533 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Sep 12 00:34:12.053540 systemd[1]: Stopped systemd-fsck-usr.service.
Sep 12 00:34:12.053548 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 12 00:34:12.053556 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 12 00:34:12.053563 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 12 00:34:12.053570 kernel: fuse: init (API version 7.41)
Sep 12 00:34:12.053577 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 12 00:34:12.053586 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 12 00:34:12.053593 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Sep 12 00:34:12.053600 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 12 00:34:12.053607 systemd[1]: verity-setup.service: Deactivated successfully.
Sep 12 00:34:12.053614 systemd[1]: Stopped verity-setup.service.
Sep 12 00:34:12.053622 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 00:34:12.053629 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 12 00:34:12.053636 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 12 00:34:12.053644 systemd[1]: Mounted media.mount - External Media Directory.
Sep 12 00:34:12.053651 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 12 00:34:12.053658 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 12 00:34:12.053666 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 12 00:34:12.053673 kernel: loop: module loaded
Sep 12 00:34:12.053680 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 12 00:34:12.053687 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 12 00:34:12.053694 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 12 00:34:12.053701 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 12 00:34:12.053710 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 12 00:34:12.053717 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 12 00:34:12.053725 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 12 00:34:12.053732 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 12 00:34:12.053739 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 12 00:34:12.053747 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 12 00:34:12.053754 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 12 00:34:12.053761 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 12 00:34:12.053769 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 12 00:34:12.053777 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 12 00:34:12.053787 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 12 00:34:12.053796 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 12 00:34:12.053803 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Sep 12 00:34:12.053811 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 12 00:34:12.053818 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 12 00:34:12.053830 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 12 00:34:12.053837 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 12 00:34:12.053845 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 12 00:34:12.053869 systemd-journald[1218]: Collecting audit messages is disabled.
Sep 12 00:34:12.053888 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 12 00:34:12.053897 systemd-journald[1218]: Journal started
Sep 12 00:34:12.053912 systemd-journald[1218]: Runtime Journal (/run/log/journal/8bcfecc7a1034effb17f4e38c8725ad1) is 4.8M, max 38.9M, 34M free.
Sep 12 00:34:11.795206 systemd[1]: Queued start job for default target multi-user.target.
Sep 12 00:34:12.057184 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 12 00:34:12.057215 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 12 00:34:11.821929 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Sep 12 00:34:11.822204 systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 12 00:34:12.057471 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 12 00:34:12.057593 jq[1205]: true
Sep 12 00:34:12.057875 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 12 00:34:12.058181 jq[1230]: true
Sep 12 00:34:12.058519 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Sep 12 00:34:12.060554 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 12 00:34:12.060961 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 12 00:34:12.078007 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 12 00:34:12.081270 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 12 00:34:12.086642 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 12 00:34:12.091205 kernel: loop0: detected capacity change from 0 to 113872
Sep 12 00:34:12.115126 systemd-journald[1218]: Time spent on flushing to /var/log/journal/8bcfecc7a1034effb17f4e38c8725ad1 is 104.997ms for 1757 entries.
Sep 12 00:34:12.115126 systemd-journald[1218]: System Journal (/var/log/journal/8bcfecc7a1034effb17f4e38c8725ad1) is 8M, max 584.8M, 576.8M free.
Sep 12 00:34:12.234181 systemd-journald[1218]: Received client request to flush runtime journal.
Sep 12 00:34:12.234213 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 12 00:34:12.234228 kernel: ACPI: bus type drm_connector registered
Sep 12 00:34:12.234238 kernel: loop1: detected capacity change from 0 to 229808
Sep 12 00:34:12.136917 ignition[1231]: Ignition 2.21.0
Sep 12 00:34:12.115894 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 12 00:34:12.142256 ignition[1231]: deleting config from guestinfo properties
Sep 12 00:34:12.116120 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 12 00:34:12.160527 ignition[1231]: Successfully deleted config
Sep 12 00:34:12.118366 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Sep 12 00:34:12.164546 systemd[1]: Finished ignition-delete-config.service - Ignition (delete config).
Sep 12 00:34:12.169746 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 12 00:34:12.170500 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 12 00:34:12.193643 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 12 00:34:12.198434 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 12 00:34:12.212909 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Sep 12 00:34:12.214020 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 12 00:34:12.236809 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 12 00:34:12.260690 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 12 00:34:12.264130 kernel: loop2: detected capacity change from 0 to 146240
Sep 12 00:34:12.264010 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 12 00:34:12.308387 kernel: loop3: detected capacity change from 0 to 2960
Sep 12 00:34:12.311695 systemd-tmpfiles[1304]: ACLs are not supported, ignoring.
Sep 12 00:34:12.312572 systemd-tmpfiles[1304]: ACLs are not supported, ignoring.
Sep 12 00:34:12.321064 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 12 00:34:12.342121 kernel: loop4: detected capacity change from 0 to 113872
Sep 12 00:34:12.345037 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 12 00:34:12.493097 kernel: loop5: detected capacity change from 0 to 229808
Sep 12 00:34:12.537117 kernel: loop6: detected capacity change from 0 to 146240
Sep 12 00:34:12.597131 kernel: loop7: detected capacity change from 0 to 2960
Sep 12 00:34:12.637888 (sd-merge)[1309]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-vmware'.
Sep 12 00:34:12.638321 (sd-merge)[1309]: Merged extensions into '/usr'.
Sep 12 00:34:12.641783 systemd[1]: Reload requested from client PID 1239 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 12 00:34:12.641793 systemd[1]: Reloading...
Sep 12 00:34:12.702272 zram_generator::config[1335]: No configuration found.
Sep 12 00:34:12.820342 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 12 00:34:12.831111 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Sep 12 00:34:12.881067 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 12 00:34:12.881350 systemd[1]: Reloading finished in 239 ms.
Sep 12 00:34:12.898143 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 12 00:34:12.904174 systemd[1]: Starting ensure-sysext.service...
Sep 12 00:34:12.905244 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 12 00:34:12.930404 systemd[1]: Reload requested from client PID 1391 ('systemctl') (unit ensure-sysext.service)...
Sep 12 00:34:12.930419 systemd[1]: Reloading...
Sep 12 00:34:12.942780 systemd-tmpfiles[1392]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Sep 12 00:34:12.951439 systemd-tmpfiles[1392]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Sep 12 00:34:12.951645 systemd-tmpfiles[1392]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 12 00:34:12.951807 systemd-tmpfiles[1392]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 12 00:34:12.952321 systemd-tmpfiles[1392]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 12 00:34:12.952489 systemd-tmpfiles[1392]: ACLs are not supported, ignoring.
Sep 12 00:34:12.952522 systemd-tmpfiles[1392]: ACLs are not supported, ignoring.
Sep 12 00:34:12.971096 zram_generator::config[1420]: No configuration found.
Sep 12 00:34:12.986063 systemd-tmpfiles[1392]: Detected autofs mount point /boot during canonicalization of boot.
Sep 12 00:34:12.986071 systemd-tmpfiles[1392]: Skipping /boot
Sep 12 00:34:13.010958 systemd-tmpfiles[1392]: Detected autofs mount point /boot during canonicalization of boot.
Sep 12 00:34:13.013219 systemd-tmpfiles[1392]: Skipping /boot
Sep 12 00:34:13.081257 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 12 00:34:13.090282 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Sep 12 00:34:13.140697 systemd[1]: Reloading finished in 210 ms.
Sep 12 00:34:13.147983 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 12 00:34:13.148456 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 12 00:34:13.157612 ldconfig[1234]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 12 00:34:13.163282 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 12 00:34:13.176339 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 12 00:34:13.179590 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 12 00:34:13.185942 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 12 00:34:13.187643 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 12 00:34:13.190426 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 12 00:34:13.192125 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Sep 12 00:34:13.197274 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 00:34:13.200855 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 12 00:34:13.203841 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 12 00:34:13.205566 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 12 00:34:13.205885 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 12 00:34:13.206007 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 12 00:34:13.206076 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 00:34:13.209181 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 00:34:13.209289 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 12 00:34:13.209345 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 12 00:34:13.214161 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 12 00:34:13.214377 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 00:34:13.215254 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 12 00:34:13.216303 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 12 00:34:13.224805 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 00:34:13.228395 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 12 00:34:13.229475 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 12 00:34:13.230113 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 12 00:34:13.230200 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 12 00:34:13.230308 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 00:34:13.231478 systemd[1]: Finished ensure-sysext.service.
Sep 12 00:34:13.239345 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Sep 12 00:34:13.239807 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Sep 12 00:34:13.243656 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 12 00:34:13.244526 systemd-udevd[1488]: Using default interface naming scheme 'v255'.
Sep 12 00:34:13.245169 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 12 00:34:13.249496 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 12 00:34:13.250574 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 12 00:34:13.251063 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 12 00:34:13.252409 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Sep 12 00:34:13.258329 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Sep 12 00:34:13.271849 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 12 00:34:13.272791 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 12 00:34:13.273899 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 12 00:34:13.274946 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 12 00:34:13.275478 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 12 00:34:13.276788 augenrules[1521]: No rules
Sep 12 00:34:13.277554 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 12 00:34:13.277840 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 12 00:34:13.289401 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Sep 12 00:34:13.302091 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Sep 12 00:34:13.358739 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Sep 12 00:34:13.358947 systemd[1]: Reached target time-set.target - System Time Set.
Sep 12 00:34:13.364738 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 12 00:34:13.366868 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 12 00:34:13.367518 systemd-resolved[1482]: Positive Trust Anchors:
Sep 12 00:34:13.367524 systemd-resolved[1482]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 12 00:34:13.367548 systemd-resolved[1482]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 12 00:34:13.375833 systemd-resolved[1482]: Defaulting to hostname 'linux'.
Sep 12 00:34:13.377830 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 12 00:34:13.378162 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 12 00:34:13.460228 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Sep 12 00:34:13.466148 systemd-networkd[1536]: lo: Link UP
Sep 12 00:34:13.466153 systemd-networkd[1536]: lo: Gained carrier
Sep 12 00:34:13.467329 systemd-networkd[1536]: Enumeration completed
Sep 12 00:34:13.467411 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 12 00:34:13.467601 systemd[1]: Reached target network.target - Network.
Sep 12 00:34:13.467992 systemd-networkd[1536]: ens192: Configuring with /etc/systemd/network/00-vmware.network.
Sep 12 00:34:13.470479 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated
Sep 12 00:34:13.470645 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps
Sep 12 00:34:13.468828 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Sep 12 00:34:13.471182 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Sep 12 00:34:13.472290 systemd-networkd[1536]: ens192: Link UP
Sep 12 00:34:13.472693 systemd-networkd[1536]: ens192: Gained carrier
Sep 12 00:34:13.477213 systemd-timesyncd[1509]: Network configuration changed, trying to establish connection.
Sep 12 00:34:13.495536 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Sep 12 00:34:13.496241 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Sep 12 00:34:13.496268 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 12 00:34:13.496697 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Sep 12 00:34:13.496974 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Sep 12 00:34:13.497755 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
Sep 12 00:34:13.498377 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Sep 12 00:34:13.498674 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Sep 12 00:34:13.499135 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Sep 12 00:34:13.499271 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Sep 12 00:34:13.499291 systemd[1]: Reached target paths.target - Path Units.
Sep 12 00:34:13.499490 systemd[1]: Reached target timers.target - Timer Units.
Sep 12 00:34:13.501129 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Sep 12 00:34:13.504272 systemd[1]: Starting docker.socket - Docker Socket for the API...
Sep 12 00:34:13.507845 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Sep 12 00:34:13.508340 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Sep 12 00:34:13.509272 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Sep 12 00:34:13.516983 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Sep 12 00:34:13.517402 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Sep 12 00:34:13.518140 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Sep 12 00:34:13.518485 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Sep 12 00:34:13.522649 systemd[1]: Reached target sockets.target - Socket Units.
Sep 12 00:34:13.523282 systemd[1]: Reached target basic.target - Basic System.
Sep 12 00:34:13.523486 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Sep 12 00:34:13.523501 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Sep 12 00:34:13.525510 systemd[1]: Starting containerd.service - containerd container runtime...
Sep 12 00:34:13.528213 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Sep 12 00:34:13.532209 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Sep 12 00:34:13.534052 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Sep 12 00:34:13.538153 kernel: mousedev: PS/2 mouse device common for all mice
Sep 12 00:34:13.540915 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Sep 12 00:34:13.541062 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Sep 12 00:34:13.544794 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
Sep 12 00:34:13.545132 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4
Sep 12 00:34:13.549570 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Sep 12 00:34:13.552038 jq[1578]: false
Sep 12 00:34:13.553111 kernel: ACPI: button: Power Button [PWRF]
Sep 12 00:34:13.552817 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Sep 12 00:34:13.555189 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Sep 12 00:34:13.557376 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Sep 12 00:34:13.559774 systemd[1]: Starting systemd-logind.service - User Login Management...
Sep 12 00:34:13.561668 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Sep 12 00:34:13.562282 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Sep 12 00:34:13.564269 systemd[1]: Starting update-engine.service - Update Engine...
Sep 12 00:34:13.568036 oslogin_cache_refresh[1581]: Refreshing passwd entry cache
Sep 12 00:34:13.568935 google_oslogin_nss_cache[1581]: oslogin_cache_refresh[1581]: Refreshing passwd entry cache
Sep 12 00:34:13.569172 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Sep 12 00:34:13.579304 jq[1589]: true
Sep 12 00:34:13.579484 systemd[1]: Starting vgauthd.service - VGAuth Service for open-vm-tools...
Sep 12 00:34:13.583767 google_oslogin_nss_cache[1581]: oslogin_cache_refresh[1581]: Failure getting users, quitting
Sep 12 00:34:13.583767 google_oslogin_nss_cache[1581]: oslogin_cache_refresh[1581]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Sep 12 00:34:13.583767 google_oslogin_nss_cache[1581]: oslogin_cache_refresh[1581]: Refreshing group entry cache
Sep 12 00:34:13.583455 oslogin_cache_refresh[1581]: Failure getting users, quitting
Sep 12 00:34:13.583466 oslogin_cache_refresh[1581]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Sep 12 00:34:13.583496 oslogin_cache_refresh[1581]: Refreshing group entry cache
Sep 12 00:34:13.585261 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Sep 12 00:34:13.585569 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Sep 12 00:34:13.585706 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Sep 12 00:34:13.587269 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Sep 12 00:34:13.587407 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Sep 12 00:34:13.591953 google_oslogin_nss_cache[1581]: oslogin_cache_refresh[1581]: Failure getting groups, quitting
Sep 12 00:34:13.591953 google_oslogin_nss_cache[1581]: oslogin_cache_refresh[1581]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Sep 12 00:34:13.591766 oslogin_cache_refresh[1581]: Failure getting groups, quitting
Sep 12 00:34:13.591775 oslogin_cache_refresh[1581]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Sep 12 00:34:13.593754 systemd[1]: google-oslogin-cache.service: Deactivated successfully.
Sep 12 00:34:13.597693 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh.
Sep 12 00:34:13.599282 update_engine[1588]: I20250912 00:34:13.598893 1588 main.cc:92] Flatcar Update Engine starting
Sep 12 00:34:13.619091 jq[1599]: true
Sep 12 00:34:13.620742 extend-filesystems[1580]: Found /dev/sda6
Sep 12 00:34:13.627886 systemd[1]: motdgen.service: Deactivated successfully.
Sep 12 00:34:13.628347 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Sep 12 00:34:13.632255 extend-filesystems[1580]: Found /dev/sda9
Sep 12 00:34:13.636230 extend-filesystems[1580]: Checking size of /dev/sda9
Sep 12 00:34:13.637851 (ntainerd)[1615]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Sep 12 00:34:13.638179 systemd[1]: Started vgauthd.service - VGAuth Service for open-vm-tools.
Sep 12 00:34:13.641032 systemd[1]: Starting vmtoolsd.service - Service for virtual machines hosted on VMware...
Sep 12 00:34:13.656402 tar[1598]: linux-amd64/LICENSE
Sep 12 00:34:13.656402 tar[1598]: linux-amd64/helm
Sep 12 00:34:13.672387 dbus-daemon[1576]: [system] SELinux support is enabled
Sep 12 00:34:13.673941 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Sep 12 00:34:13.675882 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Sep 12 00:34:13.675902 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Sep 12 00:34:13.679199 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Sep 12 00:34:13.679217 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Sep 12 00:34:13.711536 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM.
Sep 12 00:34:13.712174 update_engine[1588]: I20250912 00:34:13.712134 1588 update_check_scheduler.cc:74] Next update check in 5m0s
Sep 12 00:34:13.713325 systemd[1]: Started update-engine.service - Update Engine.
Sep 12 00:34:13.714709 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Sep 12 00:34:13.726060 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Sep 12 00:34:13.734855 systemd[1]: Started vmtoolsd.service - Service for virtual machines hosted on VMware.
Sep 12 00:34:13.744765 extend-filesystems[1580]: Old size kept for /dev/sda9
Sep 12 00:34:13.748997 systemd[1]: extend-filesystems.service: Deactivated successfully.
Sep 12 00:34:13.749179 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Sep 12 00:34:13.749693 systemd-logind[1587]: New seat seat0.
Sep 12 00:34:13.750719 systemd[1]: Started systemd-logind.service - User Login Management.
Sep 12 00:34:13.761860 unknown[1621]: Pref_Init: Using '/etc/vmware-tools/vgauth.conf' as preferences filepath
Sep 12 00:34:13.764360 unknown[1621]: Core dump limit set to -1
Sep 12 00:34:13.797335 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Sep 12 00:34:13.992532 kernel: piix4_smbus 0000:00:07.3: SMBus Host Controller not enabled!
Sep 12 00:34:13.885775 locksmithd[1644]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Sep 12 00:34:13.975513 (udev-worker)[1543]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte.
Sep 12 00:34:13.986474 systemd-logind[1587]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Sep 12 00:34:13.989714 systemd-logind[1587]: Watching system buttons on /dev/input/event2 (Power Button)
Sep 12 00:34:13.993037 bash[1642]: Updated "/home/core/.ssh/authorized_keys"
Sep 12 00:34:13.994696 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Sep 12 00:34:13.995772 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Sep 12 00:34:13.998254 sshd_keygen[1623]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Sep 12 00:34:14.001101 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 00:34:14.084570 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Sep 12 00:34:14.086883 systemd[1]: Starting issuegen.service - Generate /run/issue...
Sep 12 00:34:14.115780 systemd[1]: issuegen.service: Deactivated successfully.
Sep 12 00:34:14.115998 systemd[1]: Finished issuegen.service - Generate /run/issue.
Sep 12 00:34:14.137343 containerd[1615]: time="2025-09-12T00:34:14Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Sep 12 00:34:14.138541 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Sep 12 00:34:14.149036 containerd[1615]: time="2025-09-12T00:34:14.145392899Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4
Sep 12 00:34:14.165852 containerd[1615]: time="2025-09-12T00:34:14.165729604Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="7.558µs"
Sep 12 00:34:14.168908 containerd[1615]: time="2025-09-12T00:34:14.168114447Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Sep 12 00:34:14.168908 containerd[1615]: time="2025-09-12T00:34:14.168160874Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Sep 12 00:34:14.168908 containerd[1615]: time="2025-09-12T00:34:14.168277368Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Sep 12 00:34:14.168908 containerd[1615]: time="2025-09-12T00:34:14.168288703Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Sep 12 00:34:14.168908 containerd[1615]: time="2025-09-12T00:34:14.168307248Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 12 00:34:14.168908 containerd[1615]: time="2025-09-12T00:34:14.168348480Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 12 00:34:14.168908 containerd[1615]: time="2025-09-12T00:34:14.168356671Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 12 00:34:14.168908 containerd[1615]: time="2025-09-12T00:34:14.168511229Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 12 00:34:14.168908 containerd[1615]: time="2025-09-12T00:34:14.168519672Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 12 00:34:14.168908 containerd[1615]: time="2025-09-12T00:34:14.168527073Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 12 00:34:14.168908 containerd[1615]: time="2025-09-12T00:34:14.168532126Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Sep 12 00:34:14.168908 containerd[1615]: time="2025-09-12T00:34:14.168582125Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Sep 12 00:34:14.169182 containerd[1615]: time="2025-09-12T00:34:14.168712581Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 12 00:34:14.169182 containerd[1615]: time="2025-09-12T00:34:14.168730255Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 12 00:34:14.169182 containerd[1615]: time="2025-09-12T00:34:14.168736619Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Sep 12 00:34:14.169182 containerd[1615]: time="2025-09-12T00:34:14.168757480Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Sep 12 00:34:14.169182 containerd[1615]: time="2025-09-12T00:34:14.168928265Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Sep 12 00:34:14.169182 containerd[1615]: time="2025-09-12T00:34:14.168980875Z" level=info msg="metadata content store policy set" policy=shared
Sep 12 00:34:14.191168 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Sep 12 00:34:14.210323 systemd[1]: Started getty@tty1.service - Getty on tty1.
Sep 12 00:34:14.216953 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Sep 12 00:34:14.220497 containerd[1615]: time="2025-09-12T00:34:14.219334411Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Sep 12 00:34:14.220497 containerd[1615]: time="2025-09-12T00:34:14.219436930Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Sep 12 00:34:14.220497 containerd[1615]: time="2025-09-12T00:34:14.219466218Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Sep 12 00:34:14.220497 containerd[1615]: time="2025-09-12T00:34:14.219481113Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Sep 12 00:34:14.220497 containerd[1615]: time="2025-09-12T00:34:14.219491194Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Sep 12 00:34:14.220497 containerd[1615]: time="2025-09-12T00:34:14.219498238Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Sep 12 00:34:14.220497 containerd[1615]: time="2025-09-12T00:34:14.219508452Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Sep 12 00:34:14.220497 containerd[1615]: time="2025-09-12T00:34:14.219516725Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Sep 12 00:34:14.220497 containerd[1615]: time="2025-09-12T00:34:14.219524428Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Sep 12 00:34:14.220497 containerd[1615]: time="2025-09-12T00:34:14.219530340Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Sep 12 00:34:14.220497 containerd[1615]: time="2025-09-12T00:34:14.219545377Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Sep 12 00:34:14.220497 containerd[1615]: time="2025-09-12T00:34:14.219555267Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Sep 12 00:34:14.220497 containerd[1615]: time="2025-09-12T00:34:14.219686988Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Sep 12 00:34:14.220497 containerd[1615]: time="2025-09-12T00:34:14.219717175Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Sep 12 00:34:14.220769 containerd[1615]: time="2025-09-12T00:34:14.219729203Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Sep 12 00:34:14.220769 containerd[1615]: time="2025-09-12T00:34:14.219741951Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Sep 12 00:34:14.220769 containerd[1615]: time="2025-09-12T00:34:14.219750211Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Sep 12 00:34:14.220769 containerd[1615]: time="2025-09-12T00:34:14.219759958Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Sep 12 00:34:14.220769 containerd[1615]: time="2025-09-12T00:34:14.219777455Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Sep 12 00:34:14.220769 containerd[1615]: time="2025-09-12T00:34:14.219786078Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Sep 12 00:34:14.220769 containerd[1615]: time="2025-09-12T00:34:14.219794765Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Sep 12 00:34:14.220769 containerd[1615]: time="2025-09-12T00:34:14.219801722Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Sep 12 00:34:14.220769 containerd[1615]: time="2025-09-12T00:34:14.219808609Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Sep 12 00:34:14.220769 containerd[1615]: time="2025-09-12T00:34:14.219872798Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Sep 12 00:34:14.220769 containerd[1615]: time="2025-09-12T00:34:14.219885863Z" level=info msg="Start snapshots syncer"
Sep 12 00:34:14.220769 containerd[1615]: time="2025-09-12T00:34:14.219911082Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Sep 12 00:34:14.220937 containerd[1615]: time="2025-09-12T00:34:14.220183871Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Sep 12 00:34:14.220937 containerd[1615]: time="2025-09-12T00:34:14.220223431Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Sep 12 00:34:14.221024 containerd[1615]: time="2025-09-12T00:34:14.220289356Z" level=info
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 12 00:34:14.221024 containerd[1615]: time="2025-09-12T00:34:14.220389192Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 12 00:34:14.221024 containerd[1615]: time="2025-09-12T00:34:14.220414648Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 12 00:34:14.221024 containerd[1615]: time="2025-09-12T00:34:14.220422762Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 12 00:34:14.221024 containerd[1615]: time="2025-09-12T00:34:14.220432585Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 12 00:34:14.221024 containerd[1615]: time="2025-09-12T00:34:14.220441701Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 12 00:34:14.221024 containerd[1615]: time="2025-09-12T00:34:14.220450799Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 12 00:34:14.221024 containerd[1615]: time="2025-09-12T00:34:14.220457669Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 12 00:34:14.221024 containerd[1615]: time="2025-09-12T00:34:14.220476993Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 12 00:34:14.221024 containerd[1615]: time="2025-09-12T00:34:14.220499570Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 12 00:34:14.221024 containerd[1615]: time="2025-09-12T00:34:14.220509401Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 12 00:34:14.221024 containerd[1615]: time="2025-09-12T00:34:14.220534430Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 12 00:34:14.221024 containerd[1615]: time="2025-09-12T00:34:14.220546458Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 12 00:34:14.221024 containerd[1615]: time="2025-09-12T00:34:14.220552202Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 12 00:34:14.221968 containerd[1615]: time="2025-09-12T00:34:14.220590811Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 12 00:34:14.221968 containerd[1615]: time="2025-09-12T00:34:14.220599234Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 12 00:34:14.221968 containerd[1615]: time="2025-09-12T00:34:14.220607228Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 12 00:34:14.221968 containerd[1615]: time="2025-09-12T00:34:14.220616969Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 12 00:34:14.221968 containerd[1615]: time="2025-09-12T00:34:14.220628167Z" level=info msg="runtime interface created" Sep 12 00:34:14.221968 containerd[1615]: time="2025-09-12T00:34:14.220631450Z" level=info msg="created NRI interface" Sep 12 00:34:14.221968 containerd[1615]: time="2025-09-12T00:34:14.220645323Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 12 00:34:14.221968 containerd[1615]: time="2025-09-12T00:34:14.220654330Z" level=info msg="Connect containerd service" Sep 12 00:34:14.221968 containerd[1615]: time="2025-09-12T00:34:14.220673747Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 12 00:34:14.223668 
containerd[1615]: time="2025-09-12T00:34:14.223537560Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 12 00:34:14.226374 systemd[1]: Reached target getty.target - Login Prompts. Sep 12 00:34:14.240128 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 00:34:14.479538 containerd[1615]: time="2025-09-12T00:34:14.479499702Z" level=info msg="Start subscribing containerd event" Sep 12 00:34:14.479616 containerd[1615]: time="2025-09-12T00:34:14.479549055Z" level=info msg="Start recovering state" Sep 12 00:34:14.481110 containerd[1615]: time="2025-09-12T00:34:14.479639207Z" level=info msg="Start event monitor" Sep 12 00:34:14.481110 containerd[1615]: time="2025-09-12T00:34:14.479656653Z" level=info msg="Start cni network conf syncer for default" Sep 12 00:34:14.481110 containerd[1615]: time="2025-09-12T00:34:14.479666751Z" level=info msg="Start streaming server" Sep 12 00:34:14.481110 containerd[1615]: time="2025-09-12T00:34:14.479675426Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 12 00:34:14.481110 containerd[1615]: time="2025-09-12T00:34:14.479695770Z" level=info msg="runtime interface starting up..." Sep 12 00:34:14.481110 containerd[1615]: time="2025-09-12T00:34:14.479701495Z" level=info msg="starting plugins..." Sep 12 00:34:14.481110 containerd[1615]: time="2025-09-12T00:34:14.479715660Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 12 00:34:14.483947 containerd[1615]: time="2025-09-12T00:34:14.483869180Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 12 00:34:14.484639 containerd[1615]: time="2025-09-12T00:34:14.483960050Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Sep 12 00:34:14.484639 containerd[1615]: time="2025-09-12T00:34:14.484002041Z" level=info msg="containerd successfully booted in 0.347263s" Sep 12 00:34:14.484182 systemd[1]: Started containerd.service - containerd container runtime. Sep 12 00:34:14.493797 tar[1598]: linux-amd64/README.md Sep 12 00:34:14.508701 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 12 00:34:15.056218 systemd-networkd[1536]: ens192: Gained IPv6LL Sep 12 00:34:15.056663 systemd-timesyncd[1509]: Network configuration changed, trying to establish connection. Sep 12 00:34:15.061318 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 12 00:34:15.061893 systemd[1]: Reached target network-online.target - Network is Online. Sep 12 00:34:15.063170 systemd[1]: Starting coreos-metadata.service - VMware metadata agent... Sep 12 00:34:15.073255 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 00:34:15.076419 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 12 00:34:15.156544 systemd[1]: coreos-metadata.service: Deactivated successfully. Sep 12 00:34:15.156719 systemd[1]: Finished coreos-metadata.service - VMware metadata agent. Sep 12 00:34:15.157148 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 12 00:34:15.165413 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 12 00:34:16.180587 systemd-timesyncd[1509]: Network configuration changed, trying to establish connection. Sep 12 00:34:16.739776 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 00:34:16.740407 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 12 00:34:16.740864 systemd[1]: Startup finished in 2.793s (kernel) + 5.631s (initrd) + 6.568s (userspace) = 14.992s. 
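During this boot containerd logged "cni config load failed: no network config found in /etc/cni/net.d"; its CRI plugin expects a conflist file in that directory (the confDir shown in its own config dump above). The sketch below is a hedged illustration, not a config from this host: it writes a minimal bridge/host-local conflist of the expected shape into a scratch directory and checks it parses as JSON. The network name "example-net" and the 10.88.0.0/16 subnet are made-up illustrative values.

```shell
# Write a minimal CNI conflist of the shape containerd's CRI plugin scans
# /etc/cni/net.d for, then verify it is well-formed JSON. Values here are
# illustrative assumptions, not taken from the logged system.
dir=$(mktemp -d)
cat > "$dir/10-example.conflist" <<'EOF'
{
  "cniVersion": "1.0.0",
  "name": "example-net",
  "plugins": [
    {
      "type": "bridge",
      "bridge": "cni0",
      "isGateway": true,
      "ipMasq": true,
      "ipam": {
        "type": "host-local",
        "ranges": [[ { "subnet": "10.88.0.0/16" } ]]
      }
    }
  ]
}
EOF
python3 -m json.tool "$dir/10-example.conflist" > /dev/null && echo "conflist parses as JSON"
```

On a real node the file would go in /etc/cni/net.d (with the bridge and host-local plugin binaries present under /opt/cni/bin); the CRI plugin's "cni network conf syncer" seen later in the log picks up such files without a containerd restart.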
Sep 12 00:34:16.749419 (kubelet)[1799]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 12 00:34:16.771205 login[1720]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
Sep 12 00:34:16.771973 login[1721]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
Sep 12 00:34:16.776908 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Sep 12 00:34:16.777769 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Sep 12 00:34:16.783270 systemd-logind[1587]: New session 2 of user core.
Sep 12 00:34:16.785941 systemd-logind[1587]: New session 1 of user core.
Sep 12 00:34:16.795114 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Sep 12 00:34:16.798248 systemd[1]: Starting user@500.service - User Manager for UID 500...
Sep 12 00:34:16.806539 (systemd)[1806]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Sep 12 00:34:16.808539 systemd-logind[1587]: New session c1 of user core.
Sep 12 00:34:16.903872 systemd[1806]: Queued start job for default target default.target.
Sep 12 00:34:16.912126 systemd[1806]: Created slice app.slice - User Application Slice.
Sep 12 00:34:16.912144 systemd[1806]: Reached target paths.target - Paths.
Sep 12 00:34:16.912170 systemd[1806]: Reached target timers.target - Timers.
Sep 12 00:34:16.912891 systemd[1806]: Starting dbus.socket - D-Bus User Message Bus Socket...
Sep 12 00:34:16.919324 systemd[1806]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Sep 12 00:34:16.919410 systemd[1806]: Reached target sockets.target - Sockets.
Sep 12 00:34:16.919474 systemd[1806]: Reached target basic.target - Basic System.
Sep 12 00:34:16.919554 systemd[1806]: Reached target default.target - Main User Target.
Sep 12 00:34:16.919574 systemd[1]: Started user@500.service - User Manager for UID 500.
Sep 12 00:34:16.919644 systemd[1806]: Startup finished in 106ms.
Sep 12 00:34:16.929152 systemd[1]: Started session-1.scope - Session 1 of User core.
Sep 12 00:34:16.929762 systemd[1]: Started session-2.scope - Session 2 of User core.
Sep 12 00:34:17.812189 kubelet[1799]: E0912 00:34:17.812141 1799 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 12 00:34:17.813855 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 00:34:17.813941 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 12 00:34:17.814158 systemd[1]: kubelet.service: Consumed 666ms CPU time, 266.2M memory peak.
Sep 12 00:34:27.966725 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Sep 12 00:34:27.968124 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 00:34:28.419028 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 00:34:28.429377 (kubelet)[1850]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 12 00:34:28.468834 kubelet[1850]: E0912 00:34:28.468802 1850 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 12 00:34:28.471389 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 00:34:28.471528 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 12 00:34:28.471851 systemd[1]: kubelet.service: Consumed 93ms CPU time, 110.8M memory peak.
Sep 12 00:34:38.716544 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Sep 12 00:34:38.717940 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 00:34:39.059659 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 00:34:39.066311 (kubelet)[1865]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 12 00:34:39.132906 kubelet[1865]: E0912 00:34:39.132862 1865 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 12 00:34:39.134447 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 00:34:39.134555 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 12 00:34:39.134821 systemd[1]: kubelet.service: Consumed 114ms CPU time, 108.7M memory peak.
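Every kubelet crash in this log reports the same cause: /var/lib/kubelet/config.yaml does not exist, so the process exits with status 1 and systemd schedules another restart (the counter climbs 1, 2, 3 across the log). On kubeadm-managed nodes that file is normally written by `kubeadm init` or `kubeadm join`, so the loop is expected until the node is bootstrapped. A minimal sketch of the check the kubelet is effectively failing; the path is taken from the error message, the remediation hint is an assumption about a kubeadm-managed node:

```shell
# Mirror the failing precondition from the log: kubelet cannot start without
# its config file. KUBELET_CONFIG is a hypothetical override for this sketch;
# the default path is the one named in the logged error.
cfg="${KUBELET_CONFIG:-/var/lib/kubelet/config.yaml}"
if [ ! -f "$cfg" ]; then
  echo "kubelet cannot start: $cfg missing (on kubeadm nodes it is written by kubeadm init/join)"
fi
```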
Sep 12 00:34:44.035429 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Sep 12 00:34:44.036456 systemd[1]: Started sshd@0-139.178.70.108:22-139.178.68.195:55360.service - OpenSSH per-connection server daemon (139.178.68.195:55360).
Sep 12 00:34:44.092209 sshd[1873]: Accepted publickey for core from 139.178.68.195 port 55360 ssh2: RSA SHA256:cc7+gY4pedxGskzWUQnkx6HLMvEKLPoZPuQgwUGPYHk
Sep 12 00:34:44.093193 sshd-session[1873]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 00:34:44.096529 systemd-logind[1587]: New session 3 of user core.
Sep 12 00:34:44.106262 systemd[1]: Started session-3.scope - Session 3 of User core.
Sep 12 00:34:44.161225 systemd[1]: Started sshd@1-139.178.70.108:22-139.178.68.195:55374.service - OpenSSH per-connection server daemon (139.178.68.195:55374).
Sep 12 00:34:44.197287 sshd[1878]: Accepted publickey for core from 139.178.68.195 port 55374 ssh2: RSA SHA256:cc7+gY4pedxGskzWUQnkx6HLMvEKLPoZPuQgwUGPYHk
Sep 12 00:34:44.198172 sshd-session[1878]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 00:34:44.201114 systemd-logind[1587]: New session 4 of user core.
Sep 12 00:34:44.211224 systemd[1]: Started session-4.scope - Session 4 of User core.
Sep 12 00:34:44.259411 sshd[1880]: Connection closed by 139.178.68.195 port 55374
Sep 12 00:34:44.260164 sshd-session[1878]: pam_unix(sshd:session): session closed for user core
Sep 12 00:34:44.269447 systemd[1]: sshd@1-139.178.70.108:22-139.178.68.195:55374.service: Deactivated successfully.
Sep 12 00:34:44.270526 systemd[1]: session-4.scope: Deactivated successfully.
Sep 12 00:34:44.271031 systemd-logind[1587]: Session 4 logged out. Waiting for processes to exit.
Sep 12 00:34:44.272752 systemd[1]: Started sshd@2-139.178.70.108:22-139.178.68.195:55378.service - OpenSSH per-connection server daemon (139.178.68.195:55378).
Sep 12 00:34:44.273457 systemd-logind[1587]: Removed session 4.
Sep 12 00:34:44.311419 sshd[1886]: Accepted publickey for core from 139.178.68.195 port 55378 ssh2: RSA SHA256:cc7+gY4pedxGskzWUQnkx6HLMvEKLPoZPuQgwUGPYHk
Sep 12 00:34:44.312032 sshd-session[1886]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 00:34:44.314675 systemd-logind[1587]: New session 5 of user core.
Sep 12 00:34:44.320197 systemd[1]: Started session-5.scope - Session 5 of User core.
Sep 12 00:34:44.365429 sshd[1888]: Connection closed by 139.178.68.195 port 55378
Sep 12 00:34:44.365778 sshd-session[1886]: pam_unix(sshd:session): session closed for user core
Sep 12 00:34:44.375212 systemd[1]: sshd@2-139.178.70.108:22-139.178.68.195:55378.service: Deactivated successfully.
Sep 12 00:34:44.376035 systemd[1]: session-5.scope: Deactivated successfully.
Sep 12 00:34:44.376807 systemd-logind[1587]: Session 5 logged out. Waiting for processes to exit.
Sep 12 00:34:44.378003 systemd[1]: Started sshd@3-139.178.70.108:22-139.178.68.195:55380.service - OpenSSH per-connection server daemon (139.178.68.195:55380).
Sep 12 00:34:44.379707 systemd-logind[1587]: Removed session 5.
Sep 12 00:34:44.415188 sshd[1894]: Accepted publickey for core from 139.178.68.195 port 55380 ssh2: RSA SHA256:cc7+gY4pedxGskzWUQnkx6HLMvEKLPoZPuQgwUGPYHk
Sep 12 00:34:44.415903 sshd-session[1894]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 00:34:44.418623 systemd-logind[1587]: New session 6 of user core.
Sep 12 00:34:44.424297 systemd[1]: Started session-6.scope - Session 6 of User core.
Sep 12 00:34:44.471595 sshd[1896]: Connection closed by 139.178.68.195 port 55380
Sep 12 00:34:44.471887 sshd-session[1894]: pam_unix(sshd:session): session closed for user core
Sep 12 00:34:44.482168 systemd[1]: sshd@3-139.178.70.108:22-139.178.68.195:55380.service: Deactivated successfully.
Sep 12 00:34:44.483188 systemd[1]: session-6.scope: Deactivated successfully.
Sep 12 00:34:44.483710 systemd-logind[1587]: Session 6 logged out. Waiting for processes to exit.
Sep 12 00:34:44.485346 systemd-logind[1587]: Removed session 6.
Sep 12 00:34:44.486218 systemd[1]: Started sshd@4-139.178.70.108:22-139.178.68.195:55386.service - OpenSSH per-connection server daemon (139.178.68.195:55386).
Sep 12 00:34:44.521740 sshd[1902]: Accepted publickey for core from 139.178.68.195 port 55386 ssh2: RSA SHA256:cc7+gY4pedxGskzWUQnkx6HLMvEKLPoZPuQgwUGPYHk
Sep 12 00:34:44.522548 sshd-session[1902]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 00:34:44.525825 systemd-logind[1587]: New session 7 of user core.
Sep 12 00:34:44.535259 systemd[1]: Started session-7.scope - Session 7 of User core.
Sep 12 00:34:44.590026 sudo[1905]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Sep 12 00:34:44.590209 sudo[1905]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 12 00:34:44.596387 sudo[1905]: pam_unix(sudo:session): session closed for user root
Sep 12 00:34:44.597146 sshd[1904]: Connection closed by 139.178.68.195 port 55386
Sep 12 00:34:44.597561 sshd-session[1902]: pam_unix(sshd:session): session closed for user core
Sep 12 00:34:44.611316 systemd[1]: sshd@4-139.178.70.108:22-139.178.68.195:55386.service: Deactivated successfully.
Sep 12 00:34:44.612843 systemd[1]: session-7.scope: Deactivated successfully.
Sep 12 00:34:44.613947 systemd-logind[1587]: Session 7 logged out. Waiting for processes to exit.
Sep 12 00:34:44.616231 systemd[1]: Started sshd@5-139.178.70.108:22-139.178.68.195:55388.service - OpenSSH per-connection server daemon (139.178.68.195:55388).
Sep 12 00:34:44.617199 systemd-logind[1587]: Removed session 7.
Sep 12 00:34:44.657901 sshd[1911]: Accepted publickey for core from 139.178.68.195 port 55388 ssh2: RSA SHA256:cc7+gY4pedxGskzWUQnkx6HLMvEKLPoZPuQgwUGPYHk
Sep 12 00:34:44.658838 sshd-session[1911]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 00:34:44.662117 systemd-logind[1587]: New session 8 of user core.
Sep 12 00:34:44.668221 systemd[1]: Started session-8.scope - Session 8 of User core.
Sep 12 00:34:44.717279 sudo[1915]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Sep 12 00:34:44.717433 sudo[1915]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 12 00:34:44.719679 sudo[1915]: pam_unix(sudo:session): session closed for user root
Sep 12 00:34:44.722556 sudo[1914]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Sep 12 00:34:44.722844 sudo[1914]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 12 00:34:44.728382 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 12 00:34:44.761486 augenrules[1937]: No rules
Sep 12 00:34:44.762273 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 12 00:34:44.762422 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 12 00:34:44.762912 sudo[1914]: pam_unix(sudo:session): session closed for user root
Sep 12 00:34:44.763694 sshd[1913]: Connection closed by 139.178.68.195 port 55388
Sep 12 00:34:44.763908 sshd-session[1911]: pam_unix(sshd:session): session closed for user core
Sep 12 00:34:44.777482 systemd[1]: sshd@5-139.178.70.108:22-139.178.68.195:55388.service: Deactivated successfully.
Sep 12 00:34:44.778593 systemd[1]: session-8.scope: Deactivated successfully.
Sep 12 00:34:44.779483 systemd-logind[1587]: Session 8 logged out. Waiting for processes to exit.
Sep 12 00:34:44.780973 systemd[1]: Started sshd@6-139.178.70.108:22-139.178.68.195:55394.service - OpenSSH per-connection server daemon (139.178.68.195:55394).
Sep 12 00:34:44.781745 systemd-logind[1587]: Removed session 8.
Sep 12 00:34:44.820447 sshd[1946]: Accepted publickey for core from 139.178.68.195 port 55394 ssh2: RSA SHA256:cc7+gY4pedxGskzWUQnkx6HLMvEKLPoZPuQgwUGPYHk
Sep 12 00:34:44.821344 sshd-session[1946]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 00:34:44.824072 systemd-logind[1587]: New session 9 of user core.
Sep 12 00:34:44.839250 systemd[1]: Started session-9.scope - Session 9 of User core.
Sep 12 00:34:44.887279 sudo[1949]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Sep 12 00:34:44.887448 sudo[1949]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 12 00:34:45.197660 systemd[1]: Starting docker.service - Docker Application Container Engine...
Sep 12 00:34:45.208308 (dockerd)[1967]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Sep 12 00:34:45.409718 dockerd[1967]: time="2025-09-12T00:34:45.409680390Z" level=info msg="Starting up"
Sep 12 00:34:45.410209 dockerd[1967]: time="2025-09-12T00:34:45.410191212Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Sep 12 00:34:45.439303 dockerd[1967]: time="2025-09-12T00:34:45.439279235Z" level=info msg="Loading containers: start."
Sep 12 00:34:45.447136 kernel: Initializing XFRM netlink socket
Sep 12 00:34:45.576849 systemd-timesyncd[1509]: Network configuration changed, trying to establish connection.
Sep 12 00:34:45.601502 systemd-networkd[1536]: docker0: Link UP
Sep 12 00:34:45.602464 dockerd[1967]: time="2025-09-12T00:34:45.602443582Z" level=info msg="Loading containers: done."
Sep 12 00:36:18.380893 systemd-timesyncd[1509]: Contacted time server 23.186.168.130:123 (2.flatcar.pool.ntp.org).
Sep 12 00:36:18.380931 systemd-timesyncd[1509]: Initial clock synchronization to Fri 2025-09-12 00:36:18.380600 UTC.
Sep 12 00:36:18.381781 systemd-resolved[1482]: Clock change detected. Flushing caches.
Sep 12 00:36:18.383102 dockerd[1967]: time="2025-09-12T00:36:18.383074085Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Sep 12 00:36:18.383143 dockerd[1967]: time="2025-09-12T00:36:18.383128690Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1
Sep 12 00:36:18.383213 dockerd[1967]: time="2025-09-12T00:36:18.383199893Z" level=info msg="Initializing buildkit"
Sep 12 00:36:18.393056 dockerd[1967]: time="2025-09-12T00:36:18.393036028Z" level=info msg="Completed buildkit initialization"
Sep 12 00:36:18.396959 dockerd[1967]: time="2025-09-12T00:36:18.396937322Z" level=info msg="Daemon has completed initialization"
Sep 12 00:36:18.397032 dockerd[1967]: time="2025-09-12T00:36:18.397009806Z" level=info msg="API listen on /run/docker.sock"
Sep 12 00:36:18.397106 systemd[1]: Started docker.service - Docker Application Container Engine.
Sep 12 00:36:19.253873 containerd[1615]: time="2025-09-12T00:36:19.253838450Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.5\""
Sep 12 00:36:19.936819 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount674926490.mount: Deactivated successfully.
Sep 12 00:36:21.006711 containerd[1615]: time="2025-09-12T00:36:21.006649637Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:36:21.006711 containerd[1615]: time="2025-09-12T00:36:21.006688605Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.5: active requests=0, bytes read=30114893"
Sep 12 00:36:21.007461 containerd[1615]: time="2025-09-12T00:36:21.007439400Z" level=info msg="ImageCreate event name:\"sha256:b7335a56022aba291f5df653c01b7ab98d64fb5cab221378617f4a1236e06a62\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:36:21.009342 containerd[1615]: time="2025-09-12T00:36:21.009308271Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:1b9c6c00bc1fe86860e72efb8e4148f9e436a132eba4ca636ca4f48d61d6dfb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:36:21.010081 containerd[1615]: time="2025-09-12T00:36:21.010060251Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.5\" with image id \"sha256:b7335a56022aba291f5df653c01b7ab98d64fb5cab221378617f4a1236e06a62\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.5\", repo digest \"registry.k8s.io/kube-apiserver@sha256:1b9c6c00bc1fe86860e72efb8e4148f9e436a132eba4ca636ca4f48d61d6dfb4\", size \"30111492\" in 1.755929427s"
Sep 12 00:36:21.010131 containerd[1615]: time="2025-09-12T00:36:21.010083228Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.5\" returns image reference \"sha256:b7335a56022aba291f5df653c01b7ab98d64fb5cab221378617f4a1236e06a62\""
Sep 12 00:36:21.010442 containerd[1615]: time="2025-09-12T00:36:21.010394048Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.5\""
Sep 12 00:36:21.989657 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Sep 12 00:36:21.991466 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 00:36:22.401669 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 00:36:22.409642 (kubelet)[2239]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 12 00:36:22.444525 kubelet[2239]: E0912 00:36:22.444495 2239 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 12 00:36:22.445524 containerd[1615]: time="2025-09-12T00:36:22.444983614Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:36:22.446937 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 00:36:22.447025 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 12 00:36:22.447522 systemd[1]: kubelet.service: Consumed 107ms CPU time, 110.1M memory peak.
Sep 12 00:36:22.451109 containerd[1615]: time="2025-09-12T00:36:22.451079658Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.5: active requests=0, bytes read=26020844"
Sep 12 00:36:22.455737 containerd[1615]: time="2025-09-12T00:36:22.455683496Z" level=info msg="ImageCreate event name:\"sha256:8bb43160a0df4d7d34c89d9edbc48735bc2f830771e4b501937338221be0f668\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:36:22.460953 containerd[1615]: time="2025-09-12T00:36:22.460902923Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:1082a6ab67fb46397314dd36b36cb197ba4a4c5365033e9ad22bc7edaaaabd5c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:36:22.462502 containerd[1615]: time="2025-09-12T00:36:22.461793573Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.5\" with image id \"sha256:8bb43160a0df4d7d34c89d9edbc48735bc2f830771e4b501937338221be0f668\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.5\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:1082a6ab67fb46397314dd36b36cb197ba4a4c5365033e9ad22bc7edaaaabd5c\", size \"27681301\" in 1.451380465s"
Sep 12 00:36:22.462502 containerd[1615]: time="2025-09-12T00:36:22.461824850Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.5\" returns image reference \"sha256:8bb43160a0df4d7d34c89d9edbc48735bc2f830771e4b501937338221be0f668\""
Sep 12 00:36:22.463965 containerd[1615]: time="2025-09-12T00:36:22.463940783Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.5\""
Sep 12 00:36:24.394746 containerd[1615]: time="2025-09-12T00:36:24.394230478Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:36:24.395352 containerd[1615]: time="2025-09-12T00:36:24.395323400Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.5: active requests=0, bytes read=20155568"
Sep 12 00:36:24.395647 containerd[1615]: time="2025-09-12T00:36:24.395635184Z" level=info msg="ImageCreate event name:\"sha256:33b680aadf474b7e5e73957fc00c6af86dd0484c699c8461ba33ee656d1823bf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:36:24.397552 containerd[1615]: time="2025-09-12T00:36:24.397534944Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:3e7b57c9d9f06b77f0064e5be7f3df61e0151101160acd5fdecce911df28a189\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:36:24.397962 containerd[1615]: time="2025-09-12T00:36:24.397881333Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.5\" with image id \"sha256:33b680aadf474b7e5e73957fc00c6af86dd0484c699c8461ba33ee656d1823bf\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.5\", repo digest \"registry.k8s.io/kube-scheduler@sha256:3e7b57c9d9f06b77f0064e5be7f3df61e0151101160acd5fdecce911df28a189\", size \"21816043\" in 1.933915914s"
Sep 12 00:36:24.398024 containerd[1615]: time="2025-09-12T00:36:24.398009364Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.5\" returns image reference \"sha256:33b680aadf474b7e5e73957fc00c6af86dd0484c699c8461ba33ee656d1823bf\""
Sep 12 00:36:25.663618 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount520600813.mount: Deactivated successfully.
Sep 12 00:36:26.059348 containerd[1615]: time="2025-09-12T00:36:26.059317233Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:36:26.069067 containerd[1615]: time="2025-09-12T00:36:26.069035848Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.5: active requests=0, bytes read=31929469"
Sep 12 00:36:26.080791 containerd[1615]: time="2025-09-12T00:36:26.080749468Z" level=info msg="ImageCreate event name:\"sha256:2844ee7bb56c2c194e1f4adafb9e7b60b9ed16aa4d07ab8ad1f019362e2efab3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:36:26.091102 containerd[1615]: time="2025-09-12T00:36:26.090531809Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:71445ec84ad98bd52a7784865a9d31b1b50b56092d3f7699edc39eefd71befe1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:36:26.091102 containerd[1615]: time="2025-09-12T00:36:26.090888002Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.5\" with image id \"sha256:2844ee7bb56c2c194e1f4adafb9e7b60b9ed16aa4d07ab8ad1f019362e2efab3\", repo tag \"registry.k8s.io/kube-proxy:v1.33.5\", repo digest \"registry.k8s.io/kube-proxy@sha256:71445ec84ad98bd52a7784865a9d31b1b50b56092d3f7699edc39eefd71befe1\", size \"31928488\" in 1.69199274s"
Sep 12 00:36:26.091102 containerd[1615]: time="2025-09-12T00:36:26.090903423Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.5\" returns image reference \"sha256:2844ee7bb56c2c194e1f4adafb9e7b60b9ed16aa4d07ab8ad1f019362e2efab3\""
Sep 12 00:36:26.091319 containerd[1615]: time="2025-09-12T00:36:26.091273769Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\""
Sep 12 00:36:26.942674 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2927441380.mount: Deactivated successfully.
Sep 12 00:36:28.231975 containerd[1615]: time="2025-09-12T00:36:28.231687376Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:36:28.236640 containerd[1615]: time="2025-09-12T00:36:28.236607196Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=20942238"
Sep 12 00:36:28.242011 containerd[1615]: time="2025-09-12T00:36:28.241971300Z" level=info msg="ImageCreate event name:\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:36:28.246263 containerd[1615]: time="2025-09-12T00:36:28.244574030Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:36:28.246474 containerd[1615]: time="2025-09-12T00:36:28.246457393Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"20939036\" in 2.155167707s"
Sep 12 00:36:28.246529 containerd[1615]: time="2025-09-12T00:36:28.246519328Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\""
Sep 12 00:36:28.247547 containerd[1615]: time="2025-09-12T00:36:28.247535846Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Sep 12 00:36:28.960598 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount655757954.mount: Deactivated successfully.
Sep 12 00:36:28.962953 containerd[1615]: time="2025-09-12T00:36:28.962922819Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 12 00:36:28.963701 containerd[1615]: time="2025-09-12T00:36:28.963680114Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138"
Sep 12 00:36:28.963960 containerd[1615]: time="2025-09-12T00:36:28.963928153Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 12 00:36:28.964975 containerd[1615]: time="2025-09-12T00:36:28.964948594Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 12 00:36:28.965771 containerd[1615]: time="2025-09-12T00:36:28.965371135Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 717.612097ms"
Sep 12 00:36:28.965771 containerd[1615]: time="2025-09-12T00:36:28.965389962Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
Sep 12 00:36:28.965921 containerd[1615]: time="2025-09-12T00:36:28.965812957Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\""
Sep 12 00:36:29.683024 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3160182319.mount: Deactivated successfully.
Sep 12 00:36:31.529269 update_engine[1588]: I20250912 00:36:31.529188 1588 update_attempter.cc:509] Updating boot flags...
Sep 12 00:36:32.489920 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Sep 12 00:36:32.492072 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 00:36:33.027427 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 00:36:33.032484 (kubelet)[2387]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 12 00:36:33.301116 kubelet[2387]: E0912 00:36:33.301003 2387 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 12 00:36:33.302366 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 00:36:33.302458 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 12 00:36:33.302817 systemd[1]: kubelet.service: Consumed 114ms CPU time, 109.5M memory peak.
Sep 12 00:36:34.645066 containerd[1615]: time="2025-09-12T00:36:34.645021608Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:36:34.723421 containerd[1615]: time="2025-09-12T00:36:34.723383644Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=58378433"
Sep 12 00:36:34.727903 containerd[1615]: time="2025-09-12T00:36:34.727856427Z" level=info msg="ImageCreate event name:\"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:36:34.734366 containerd[1615]: time="2025-09-12T00:36:34.733480597Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:36:34.734366 containerd[1615]: time="2025-09-12T00:36:34.734207968Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"58938593\" in 5.768379812s"
Sep 12 00:36:34.734366 containerd[1615]: time="2025-09-12T00:36:34.734273084Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\""
Sep 12 00:36:37.411128 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 00:36:37.411388 systemd[1]: kubelet.service: Consumed 114ms CPU time, 109.5M memory peak.
Sep 12 00:36:37.413309 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 00:36:37.434020 systemd[1]: Reload requested from client PID 2429 ('systemctl') (unit session-9.scope)...
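The image pulls above each report a "bytes read" count and a pull duration, so the effective transfer rate can be read off directly (note "bytes read" is the compressed transfer, not the unpacked "size" field). A quick sketch over the numbers recorded in this log:

```python
# Effective pull throughput per image, from the "bytes read" counts
# and pull durations in the containerd log entries above.
pulls = {
    "kube-controller-manager:v1.33.5": (26020844, 1.451380465),
    "kube-scheduler:v1.33.5":          (20155568, 1.933915914),
    "kube-proxy:v1.33.5":              (31929469, 1.69199274),
    "coredns:v1.12.0":                 (20942238, 2.155167707),
    "pause:3.10":                      (321138,   0.717612097),
    "etcd:3.5.21-0":                   (58378433, 5.768379812),
}
for image, (nbytes, seconds) in pulls.items():
    mib_s = nbytes / seconds / (1 << 20)  # bytes/s -> MiB/s
    print(f"{image:35s} {mib_s:6.1f} MiB/s")
```

Rates in the 9-18 MiB/s range like these are consistent with a registry pull over a WAN link rather than a local mirror.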
Sep 12 00:36:37.434039 systemd[1]: Reloading...
Sep 12 00:36:37.508280 zram_generator::config[2472]: No configuration found.
Sep 12 00:36:37.587641 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 12 00:36:37.600634 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Sep 12 00:36:37.669319 systemd[1]: Reloading finished in 234 ms.
Sep 12 00:36:37.709478 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Sep 12 00:36:37.709540 systemd[1]: kubelet.service: Failed with result 'signal'.
Sep 12 00:36:37.709726 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 00:36:37.709761 systemd[1]: kubelet.service: Consumed 47ms CPU time, 74.2M memory peak.
Sep 12 00:36:37.711009 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 00:36:38.044142 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 00:36:38.055542 (kubelet)[2540]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 12 00:36:38.084267 kubelet[2540]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 12 00:36:38.084267 kubelet[2540]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Sep 12 00:36:38.084267 kubelet[2540]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 12 00:36:38.100817 kubelet[2540]: I0912 00:36:38.100668 2540 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 12 00:36:38.913086 kubelet[2540]: I0912 00:36:38.913050 2540 server.go:530] "Kubelet version" kubeletVersion="v1.33.0"
Sep 12 00:36:38.913086 kubelet[2540]: I0912 00:36:38.913083 2540 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 12 00:36:38.913288 kubelet[2540]: I0912 00:36:38.913275 2540 server.go:956] "Client rotation is on, will bootstrap in background"
Sep 12 00:36:39.066091 kubelet[2540]: I0912 00:36:39.066001 2540 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 12 00:36:39.069776 kubelet[2540]: E0912 00:36:39.069745 2540 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://139.178.70.108:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.70.108:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Sep 12 00:36:39.099108 kubelet[2540]: I0912 00:36:39.099073 2540 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 12 00:36:39.107585 kubelet[2540]: I0912 00:36:39.107559 2540 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 12 00:36:39.112552 kubelet[2540]: I0912 00:36:39.112503 2540 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 12 00:36:39.115313 kubelet[2540]: I0912 00:36:39.112551 2540 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 12 00:36:39.116191 kubelet[2540]: I0912 00:36:39.116173 2540 topology_manager.go:138] "Creating topology manager with none policy"
Sep 12 00:36:39.116191 kubelet[2540]: I0912 00:36:39.116193 2540 container_manager_linux.go:303] "Creating device plugin manager"
Sep 12 00:36:39.116325 kubelet[2540]: I0912 00:36:39.116311 2540 state_mem.go:36] "Initialized new in-memory state store"
Sep 12 00:36:39.122261 kubelet[2540]: I0912 00:36:39.122223 2540 kubelet.go:480] "Attempting to sync node with API server"
Sep 12 00:36:39.122351 kubelet[2540]: I0912 00:36:39.122274 2540 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 12 00:36:39.123494 kubelet[2540]: I0912 00:36:39.123475 2540 kubelet.go:386] "Adding apiserver pod source"
Sep 12 00:36:39.125179 kubelet[2540]: I0912 00:36:39.125157 2540 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 12 00:36:39.130630 kubelet[2540]: E0912 00:36:39.130329 2540 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://139.178.70.108:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.108:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Sep 12 00:36:39.131268 kubelet[2540]: E0912 00:36:39.131234 2540 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://139.178.70.108:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.108:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Sep 12 00:36:39.131633 kubelet[2540]: I0912 00:36:39.131619 2540 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1"
Sep 12 00:36:39.132543 kubelet[2540]: I0912 00:36:39.131952 2540 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Sep 12 00:36:39.133857 kubelet[2540]: W0912 00:36:39.133835 2540 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 12 00:36:39.140483 kubelet[2540]: I0912 00:36:39.140460 2540 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Sep 12 00:36:39.140567 kubelet[2540]: I0912 00:36:39.140505 2540 server.go:1289] "Started kubelet"
Sep 12 00:36:39.146087 kubelet[2540]: I0912 00:36:39.145453 2540 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Sep 12 00:36:39.147389 kubelet[2540]: I0912 00:36:39.147201 2540 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 12 00:36:39.147435 kubelet[2540]: I0912 00:36:39.147423 2540 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 12 00:36:39.147703 kubelet[2540]: I0912 00:36:39.147694 2540 server.go:317] "Adding debug handlers to kubelet server"
Sep 12 00:36:39.153133 kubelet[2540]: E0912 00:36:39.149174 2540 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.108:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.108:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.186461e6f3b5bb3c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-09-12 00:36:39.140481852 +0000 UTC m=+1.081576258,LastTimestamp:2025-09-12 00:36:39.140481852 +0000 UTC m=+1.081576258,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
Sep 12 00:36:39.154067 kubelet[2540]: I0912 00:36:39.154051 2540 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 12 00:36:39.155274 kubelet[2540]: I0912 00:36:39.155180 2540 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 12 00:36:39.159763 kubelet[2540]: E0912 00:36:39.159604 2540 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 12 00:36:39.159763 kubelet[2540]: I0912 00:36:39.159626 2540 volume_manager.go:297] "Starting Kubelet Volume Manager"
Sep 12 00:36:39.159763 kubelet[2540]: I0912 00:36:39.159751 2540 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Sep 12 00:36:39.159886 kubelet[2540]: I0912 00:36:39.159790 2540 reconciler.go:26] "Reconciler: start to sync state"
Sep 12 00:36:39.160261 kubelet[2540]: E0912 00:36:39.160050 2540 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://139.178.70.108:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.108:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Sep 12 00:36:39.160261 kubelet[2540]: E0912 00:36:39.160175 2540 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.108:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.108:6443: connect: connection refused" interval="200ms"
Sep 12 00:36:39.163510 kubelet[2540]: I0912 00:36:39.163435 2540 factory.go:223] Registration of the systemd container factory successfully
Sep 12 00:36:39.163585 kubelet[2540]: I0912 00:36:39.163520 2540 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 12 00:36:39.164905 kubelet[2540]: E0912 00:36:39.164877 2540 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 12 00:36:39.167458 kubelet[2540]: I0912 00:36:39.167080 2540 factory.go:223] Registration of the containerd container factory successfully
Sep 12 00:36:39.180815 kubelet[2540]: I0912 00:36:39.180748 2540 cpu_manager.go:221] "Starting CPU manager" policy="none"
Sep 12 00:36:39.180815 kubelet[2540]: I0912 00:36:39.180759 2540 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Sep 12 00:36:39.180815 kubelet[2540]: I0912 00:36:39.180769 2540 state_mem.go:36] "Initialized new in-memory state store"
Sep 12 00:36:39.181188 kubelet[2540]: I0912 00:36:39.181171 2540 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Sep 12 00:36:39.182311 kubelet[2540]: I0912 00:36:39.182301 2540 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Sep 12 00:36:39.182559 kubelet[2540]: I0912 00:36:39.182550 2540 status_manager.go:230] "Starting to sync pod status with apiserver"
Sep 12 00:36:39.183105 kubelet[2540]: I0912 00:36:39.183021 2540 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
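The `container_manager_linux.go:272` entry above dumps the node config, including the `HardEvictionThresholds` the kubelet will enforce (these match the kubelet defaults). Restating them in the familiar `--eviction-hard` flag form makes the JSON easier to read:

```python
# HardEvictionThresholds from the NodeConfig dump above, restated as
# the equivalent comma-separated eviction-hard setting. Quantity vs
# Percentage is taken straight from the logged JSON.
thresholds = [
    ("memory.available",   "100Mi"),  # Quantity: "100Mi"
    ("nodefs.available",   "10%"),    # Percentage: 0.1
    ("nodefs.inodesFree",  "5%"),     # Percentage: 0.05
    ("imagefs.available",  "15%"),    # Percentage: 0.15
    ("imagefs.inodesFree", "5%"),     # Percentage: 0.05
]
eviction_hard = ",".join(f"{signal}<{value}" for signal, value in thresholds)
print(eviction_hard)
# memory.available<100Mi,nodefs.available<10%,nodefs.inodesFree<5%,imagefs.available<15%,imagefs.inodesFree<5%
```

The related `eviction_manager` errors later in the log ("failed to get summary stats") are transient: stats depend on node registration, which has not succeeded yet at this point.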
Sep 12 00:36:39.183105 kubelet[2540]: I0912 00:36:39.183030 2540 kubelet.go:2436] "Starting kubelet main sync loop"
Sep 12 00:36:39.183105 kubelet[2540]: I0912 00:36:39.182993 2540 policy_none.go:49] "None policy: Start"
Sep 12 00:36:39.183105 kubelet[2540]: I0912 00:36:39.183078 2540 memory_manager.go:186] "Starting memorymanager" policy="None"
Sep 12 00:36:39.183105 kubelet[2540]: I0912 00:36:39.183085 2540 state_mem.go:35] "Initializing new in-memory state store"
Sep 12 00:36:39.183635 kubelet[2540]: E0912 00:36:39.183052 2540 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 12 00:36:39.184009 kubelet[2540]: E0912 00:36:39.183997 2540 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://139.178.70.108:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.108:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Sep 12 00:36:39.189855 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Sep 12 00:36:39.208780 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Sep 12 00:36:39.212318 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Sep 12 00:36:39.224196 kubelet[2540]: E0912 00:36:39.224039 2540 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Sep 12 00:36:39.224196 kubelet[2540]: I0912 00:36:39.224176 2540 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 12 00:36:39.224420 kubelet[2540]: I0912 00:36:39.224184 2540 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 12 00:36:39.225883 kubelet[2540]: E0912 00:36:39.225686 2540 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Sep 12 00:36:39.225883 kubelet[2540]: E0912 00:36:39.225724 2540 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found"
Sep 12 00:36:39.228850 kubelet[2540]: I0912 00:36:39.228800 2540 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 12 00:36:39.299951 systemd[1]: Created slice kubepods-burstable-pod6fe031662f7a761b951a9f7b8a184f48.slice - libcontainer container kubepods-burstable-pod6fe031662f7a761b951a9f7b8a184f48.slice.
Sep 12 00:36:39.325805 kubelet[2540]: I0912 00:36:39.325715 2540 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
Sep 12 00:36:39.325967 kubelet[2540]: E0912 00:36:39.325953 2540 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.108:6443/api/v1/nodes\": dial tcp 139.178.70.108:6443: connect: connection refused" node="localhost"
Sep 12 00:36:39.326904 kubelet[2540]: E0912 00:36:39.326892 2540 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Sep 12 00:36:39.330110 systemd[1]: Created slice kubepods-burstable-podb678d5c6713e936e66aa5bb73166297e.slice - libcontainer container kubepods-burstable-podb678d5c6713e936e66aa5bb73166297e.slice.
Sep 12 00:36:39.344591 kubelet[2540]: E0912 00:36:39.344569 2540 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Sep 12 00:36:39.346972 systemd[1]: Created slice kubepods-burstable-pod7b968cf906b2d9d713a362c43868bef2.slice - libcontainer container kubepods-burstable-pod7b968cf906b2d9d713a362c43868bef2.slice.
Sep 12 00:36:39.348739 kubelet[2540]: E0912 00:36:39.348722 2540 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Sep 12 00:36:39.361134 kubelet[2540]: E0912 00:36:39.361090 2540 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.108:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.108:6443: connect: connection refused" interval="400ms"
Sep 12 00:36:39.465985 kubelet[2540]: I0912 00:36:39.465720 2540 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6fe031662f7a761b951a9f7b8a184f48-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"6fe031662f7a761b951a9f7b8a184f48\") " pod="kube-system/kube-apiserver-localhost"
Sep 12 00:36:39.465985 kubelet[2540]: I0912 00:36:39.465789 2540 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost"
Sep 12 00:36:39.465985 kubelet[2540]: I0912 00:36:39.465813 2540 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost"
Sep 12 00:36:39.465985 kubelet[2540]: I0912 00:36:39.465826 2540 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost"
Sep 12 00:36:39.465985 kubelet[2540]: I0912 00:36:39.465836 2540 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost"
Sep 12 00:36:39.466130 kubelet[2540]: I0912 00:36:39.465845 2540 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7b968cf906b2d9d713a362c43868bef2-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"7b968cf906b2d9d713a362c43868bef2\") " pod="kube-system/kube-scheduler-localhost"
Sep 12 00:36:39.466130 kubelet[2540]: I0912 00:36:39.465855 2540 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6fe031662f7a761b951a9f7b8a184f48-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"6fe031662f7a761b951a9f7b8a184f48\") " pod="kube-system/kube-apiserver-localhost"
Sep 12 00:36:39.466130 kubelet[2540]: I0912 00:36:39.465863 2540 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6fe031662f7a761b951a9f7b8a184f48-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"6fe031662f7a761b951a9f7b8a184f48\") " pod="kube-system/kube-apiserver-localhost"
Sep 12 00:36:39.466130 kubelet[2540]: I0912 00:36:39.465871 2540 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost"
Sep 12 00:36:39.527514 kubelet[2540]: I0912 00:36:39.527495 2540 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
Sep 12 00:36:39.527771 kubelet[2540]: E0912 00:36:39.527735 2540 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.108:6443/api/v1/nodes\": dial tcp 139.178.70.108:6443: connect: connection refused" node="localhost"
Sep 12 00:36:39.628161 containerd[1615]: time="2025-09-12T00:36:39.628127841Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:6fe031662f7a761b951a9f7b8a184f48,Namespace:kube-system,Attempt:0,}"
Sep 12 00:36:39.661243 containerd[1615]: time="2025-09-12T00:36:39.661019549Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:7b968cf906b2d9d713a362c43868bef2,Namespace:kube-system,Attempt:0,}"
Sep 12 00:36:39.662760 containerd[1615]: time="2025-09-12T00:36:39.662732711Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:b678d5c6713e936e66aa5bb73166297e,Namespace:kube-system,Attempt:0,}"
Sep 12 00:36:39.761485 kubelet[2540]: E0912 00:36:39.761427 2540 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.108:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.108:6443: connect: connection refused" interval="800ms"
Sep 12 00:36:39.774260 containerd[1615]: time="2025-09-12T00:36:39.774039502Z" level=info msg="connecting to shim 9f4492cc18d4392a96353b486e96f938c7f9c543af287945db24826c00d63d14" address="unix:///run/containerd/s/3259d5cad32864a88351b048cde68363b388de8a42c923397af3d5c3e3fcf2b6" namespace=k8s.io protocol=ttrpc version=3
Sep 12 00:36:39.774414 containerd[1615]: time="2025-09-12T00:36:39.774399532Z" level=info msg="connecting to shim 9e8adb0c0797821c5e374bcedb4fc84976ce82abbd5a9b257b750d3198675f64" address="unix:///run/containerd/s/b1520b7d444ea6b7e7e3cb20e7c88812721d04f942baacb03d4904762ea14f57" namespace=k8s.io protocol=ttrpc version=3
Sep 12 00:36:39.775280 containerd[1615]: time="2025-09-12T00:36:39.775261973Z" level=info msg="connecting to shim 8285ca8a6559a5f55649fd7ac42a01824e3885b53d68f85ae28cc58c1a350767" address="unix:///run/containerd/s/b53bdba1137fbc22a3391264c144b0f1d5830d5233d416fba25e47aaeb7a7524" namespace=k8s.io protocol=ttrpc version=3
Sep 12 00:36:39.855415 systemd[1]: Started cri-containerd-8285ca8a6559a5f55649fd7ac42a01824e3885b53d68f85ae28cc58c1a350767.scope - libcontainer container 8285ca8a6559a5f55649fd7ac42a01824e3885b53d68f85ae28cc58c1a350767.
Sep 12 00:36:39.861116 systemd[1]: Started cri-containerd-9e8adb0c0797821c5e374bcedb4fc84976ce82abbd5a9b257b750d3198675f64.scope - libcontainer container 9e8adb0c0797821c5e374bcedb4fc84976ce82abbd5a9b257b750d3198675f64.
Sep 12 00:36:39.869471 systemd[1]: Started cri-containerd-9f4492cc18d4392a96353b486e96f938c7f9c543af287945db24826c00d63d14.scope - libcontainer container 9f4492cc18d4392a96353b486e96f938c7f9c543af287945db24826c00d63d14.
Sep 12 00:36:39.926428 containerd[1615]: time="2025-09-12T00:36:39.926343247Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:b678d5c6713e936e66aa5bb73166297e,Namespace:kube-system,Attempt:0,} returns sandbox id \"9e8adb0c0797821c5e374bcedb4fc84976ce82abbd5a9b257b750d3198675f64\"" Sep 12 00:36:39.929589 kubelet[2540]: I0912 00:36:39.929571 2540 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 12 00:36:39.930269 kubelet[2540]: E0912 00:36:39.930059 2540 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.108:6443/api/v1/nodes\": dial tcp 139.178.70.108:6443: connect: connection refused" node="localhost" Sep 12 00:36:39.943549 containerd[1615]: time="2025-09-12T00:36:39.943527602Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:6fe031662f7a761b951a9f7b8a184f48,Namespace:kube-system,Attempt:0,} returns sandbox id \"9f4492cc18d4392a96353b486e96f938c7f9c543af287945db24826c00d63d14\"" Sep 12 00:36:39.944270 containerd[1615]: time="2025-09-12T00:36:39.943865963Z" level=info msg="CreateContainer within sandbox \"9e8adb0c0797821c5e374bcedb4fc84976ce82abbd5a9b257b750d3198675f64\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 12 00:36:39.949976 containerd[1615]: time="2025-09-12T00:36:39.949672887Z" level=info msg="CreateContainer within sandbox \"9f4492cc18d4392a96353b486e96f938c7f9c543af287945db24826c00d63d14\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 12 00:36:39.950280 containerd[1615]: time="2025-09-12T00:36:39.950266990Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:7b968cf906b2d9d713a362c43868bef2,Namespace:kube-system,Attempt:0,} returns sandbox id \"8285ca8a6559a5f55649fd7ac42a01824e3885b53d68f85ae28cc58c1a350767\"" Sep 12 00:36:39.952750 containerd[1615]: time="2025-09-12T00:36:39.952728579Z" 
level=info msg="CreateContainer within sandbox \"8285ca8a6559a5f55649fd7ac42a01824e3885b53d68f85ae28cc58c1a350767\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 12 00:36:39.956740 containerd[1615]: time="2025-09-12T00:36:39.956721005Z" level=info msg="Container 8bed128444e11682ee5d8bdaa00f339efe285ef905dbe0a2be696d56d4219e3e: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:36:39.957351 containerd[1615]: time="2025-09-12T00:36:39.957329937Z" level=info msg="Container 2aa62075ad41eee52e9bca25273eb3ea36a6938ea930b2a6ba521cec5d19acaa: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:36:39.958489 containerd[1615]: time="2025-09-12T00:36:39.958476732Z" level=info msg="Container e60f62b0bc62ec3f8fb7db4e814aec24e7ff3d859b10f324a23907d929b0bc39: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:36:39.963378 containerd[1615]: time="2025-09-12T00:36:39.963352452Z" level=info msg="CreateContainer within sandbox \"9e8adb0c0797821c5e374bcedb4fc84976ce82abbd5a9b257b750d3198675f64\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"8bed128444e11682ee5d8bdaa00f339efe285ef905dbe0a2be696d56d4219e3e\"" Sep 12 00:36:39.964487 containerd[1615]: time="2025-09-12T00:36:39.964473588Z" level=info msg="StartContainer for \"8bed128444e11682ee5d8bdaa00f339efe285ef905dbe0a2be696d56d4219e3e\"" Sep 12 00:36:39.964875 containerd[1615]: time="2025-09-12T00:36:39.964855664Z" level=info msg="CreateContainer within sandbox \"9f4492cc18d4392a96353b486e96f938c7f9c543af287945db24826c00d63d14\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"2aa62075ad41eee52e9bca25273eb3ea36a6938ea930b2a6ba521cec5d19acaa\"" Sep 12 00:36:39.966113 containerd[1615]: time="2025-09-12T00:36:39.965762947Z" level=info msg="CreateContainer within sandbox \"8285ca8a6559a5f55649fd7ac42a01824e3885b53d68f85ae28cc58c1a350767\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id 
\"e60f62b0bc62ec3f8fb7db4e814aec24e7ff3d859b10f324a23907d929b0bc39\"" Sep 12 00:36:39.968193 containerd[1615]: time="2025-09-12T00:36:39.968169726Z" level=info msg="StartContainer for \"2aa62075ad41eee52e9bca25273eb3ea36a6938ea930b2a6ba521cec5d19acaa\"" Sep 12 00:36:39.968928 containerd[1615]: time="2025-09-12T00:36:39.968907141Z" level=info msg="connecting to shim 8bed128444e11682ee5d8bdaa00f339efe285ef905dbe0a2be696d56d4219e3e" address="unix:///run/containerd/s/b1520b7d444ea6b7e7e3cb20e7c88812721d04f942baacb03d4904762ea14f57" protocol=ttrpc version=3 Sep 12 00:36:39.969399 containerd[1615]: time="2025-09-12T00:36:39.969382498Z" level=info msg="connecting to shim 2aa62075ad41eee52e9bca25273eb3ea36a6938ea930b2a6ba521cec5d19acaa" address="unix:///run/containerd/s/3259d5cad32864a88351b048cde68363b388de8a42c923397af3d5c3e3fcf2b6" protocol=ttrpc version=3 Sep 12 00:36:39.969906 containerd[1615]: time="2025-09-12T00:36:39.969892625Z" level=info msg="StartContainer for \"e60f62b0bc62ec3f8fb7db4e814aec24e7ff3d859b10f324a23907d929b0bc39\"" Sep 12 00:36:39.971513 containerd[1615]: time="2025-09-12T00:36:39.971489148Z" level=info msg="connecting to shim e60f62b0bc62ec3f8fb7db4e814aec24e7ff3d859b10f324a23907d929b0bc39" address="unix:///run/containerd/s/b53bdba1137fbc22a3391264c144b0f1d5830d5233d416fba25e47aaeb7a7524" protocol=ttrpc version=3 Sep 12 00:36:39.985524 systemd[1]: Started cri-containerd-2aa62075ad41eee52e9bca25273eb3ea36a6938ea930b2a6ba521cec5d19acaa.scope - libcontainer container 2aa62075ad41eee52e9bca25273eb3ea36a6938ea930b2a6ba521cec5d19acaa. 
Sep 12 00:36:39.987886 kubelet[2540]: E0912 00:36:39.987670 2540 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://139.178.70.108:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.108:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Sep 12 00:36:39.995507 systemd[1]: Started cri-containerd-8bed128444e11682ee5d8bdaa00f339efe285ef905dbe0a2be696d56d4219e3e.scope - libcontainer container 8bed128444e11682ee5d8bdaa00f339efe285ef905dbe0a2be696d56d4219e3e. Sep 12 00:36:39.997299 systemd[1]: Started cri-containerd-e60f62b0bc62ec3f8fb7db4e814aec24e7ff3d859b10f324a23907d929b0bc39.scope - libcontainer container e60f62b0bc62ec3f8fb7db4e814aec24e7ff3d859b10f324a23907d929b0bc39. Sep 12 00:36:40.041078 containerd[1615]: time="2025-09-12T00:36:40.040547652Z" level=info msg="StartContainer for \"e60f62b0bc62ec3f8fb7db4e814aec24e7ff3d859b10f324a23907d929b0bc39\" returns successfully" Sep 12 00:36:40.064940 containerd[1615]: time="2025-09-12T00:36:40.064916185Z" level=info msg="StartContainer for \"2aa62075ad41eee52e9bca25273eb3ea36a6938ea930b2a6ba521cec5d19acaa\" returns successfully" Sep 12 00:36:40.071460 containerd[1615]: time="2025-09-12T00:36:40.071437636Z" level=info msg="StartContainer for \"8bed128444e11682ee5d8bdaa00f339efe285ef905dbe0a2be696d56d4219e3e\" returns successfully" Sep 12 00:36:40.188672 kubelet[2540]: E0912 00:36:40.188654 2540 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 00:36:40.191243 kubelet[2540]: E0912 00:36:40.191226 2540 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 00:36:40.192492 kubelet[2540]: E0912 00:36:40.192481 2540 kubelet.go:3305] "No need to create a mirror pod, 
since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 00:36:40.246085 kubelet[2540]: E0912 00:36:40.246058 2540 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://139.178.70.108:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.108:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Sep 12 00:36:40.356031 kubelet[2540]: E0912 00:36:40.355941 2540 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://139.178.70.108:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.108:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Sep 12 00:36:40.535978 kubelet[2540]: E0912 00:36:40.535946 2540 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://139.178.70.108:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.108:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Sep 12 00:36:40.563048 kubelet[2540]: E0912 00:36:40.563002 2540 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.108:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.108:6443: connect: connection refused" interval="1.6s" Sep 12 00:36:40.731978 kubelet[2540]: I0912 00:36:40.731757 2540 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 12 00:36:41.193773 kubelet[2540]: E0912 00:36:41.193744 2540 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 
00:36:41.193996 kubelet[2540]: E0912 00:36:41.193928 2540 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 00:36:41.194207 kubelet[2540]: E0912 00:36:41.194197 2540 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 00:36:42.209804 kubelet[2540]: E0912 00:36:42.209704 2540 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 00:36:42.378454 kubelet[2540]: E0912 00:36:42.378423 2540 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Sep 12 00:36:42.505888 kubelet[2540]: I0912 00:36:42.505863 2540 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Sep 12 00:36:42.561061 kubelet[2540]: I0912 00:36:42.561007 2540 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 12 00:36:42.615029 kubelet[2540]: E0912 00:36:42.615002 2540 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost" Sep 12 00:36:42.615029 kubelet[2540]: I0912 00:36:42.615026 2540 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 12 00:36:42.616832 kubelet[2540]: E0912 00:36:42.616470 2540 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Sep 12 00:36:42.616832 kubelet[2540]: I0912 00:36:42.616485 2540 kubelet.go:3309] "Creating a mirror pod for static pod" 
pod="kube-system/kube-controller-manager-localhost" Sep 12 00:36:42.617647 kubelet[2540]: E0912 00:36:42.617631 2540 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost" Sep 12 00:36:43.079831 kubelet[2540]: I0912 00:36:43.079810 2540 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 12 00:36:43.081047 kubelet[2540]: E0912 00:36:43.080993 2540 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost" Sep 12 00:36:43.133718 kubelet[2540]: I0912 00:36:43.133680 2540 apiserver.go:52] "Watching apiserver" Sep 12 00:36:43.160071 kubelet[2540]: I0912 00:36:43.160033 2540 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 12 00:36:44.168523 systemd[1]: Reload requested from client PID 2821 ('systemctl') (unit session-9.scope)... Sep 12 00:36:44.168545 systemd[1]: Reloading... Sep 12 00:36:44.234285 zram_generator::config[2864]: No configuration found. Sep 12 00:36:44.301916 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 12 00:36:44.309947 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Sep 12 00:36:44.387821 systemd[1]: Reloading finished in 219 ms. Sep 12 00:36:44.405816 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 00:36:44.419997 systemd[1]: kubelet.service: Deactivated successfully. 
Sep 12 00:36:44.420298 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 00:36:44.420349 systemd[1]: kubelet.service: Consumed 1.075s CPU time, 129M memory peak. Sep 12 00:36:44.423151 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 00:36:44.944478 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 00:36:44.953684 (kubelet)[2932]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 00:36:45.008264 kubelet[2932]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 00:36:45.008264 kubelet[2932]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 12 00:36:45.008264 kubelet[2932]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 12 00:36:45.008264 kubelet[2932]: I0912 00:36:45.007383 2932 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 12 00:36:45.015163 kubelet[2932]: I0912 00:36:45.015128 2932 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Sep 12 00:36:45.015163 kubelet[2932]: I0912 00:36:45.015157 2932 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 12 00:36:45.015408 kubelet[2932]: I0912 00:36:45.015394 2932 server.go:956] "Client rotation is on, will bootstrap in background" Sep 12 00:36:45.017552 kubelet[2932]: I0912 00:36:45.017527 2932 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Sep 12 00:36:45.025600 kubelet[2932]: I0912 00:36:45.025356 2932 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 00:36:45.027471 kubelet[2932]: I0912 00:36:45.027459 2932 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 12 00:36:45.031551 kubelet[2932]: I0912 00:36:45.031522 2932 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 12 00:36:45.033954 kubelet[2932]: I0912 00:36:45.033918 2932 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 12 00:36:45.034119 kubelet[2932]: I0912 00:36:45.033956 2932 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 12 00:36:45.034187 kubelet[2932]: I0912 00:36:45.034122 2932 topology_manager.go:138] "Creating topology manager with none policy" Sep 12 00:36:45.034187 
kubelet[2932]: I0912 00:36:45.034135 2932 container_manager_linux.go:303] "Creating device plugin manager" Sep 12 00:36:45.034187 kubelet[2932]: I0912 00:36:45.034181 2932 state_mem.go:36] "Initialized new in-memory state store" Sep 12 00:36:45.034432 kubelet[2932]: I0912 00:36:45.034403 2932 kubelet.go:480] "Attempting to sync node with API server" Sep 12 00:36:45.034432 kubelet[2932]: I0912 00:36:45.034418 2932 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 12 00:36:45.035293 kubelet[2932]: I0912 00:36:45.034439 2932 kubelet.go:386] "Adding apiserver pod source" Sep 12 00:36:45.035293 kubelet[2932]: I0912 00:36:45.034451 2932 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 12 00:36:45.039550 kubelet[2932]: I0912 00:36:45.039500 2932 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" Sep 12 00:36:45.041039 kubelet[2932]: I0912 00:36:45.041027 2932 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Sep 12 00:36:45.050000 kubelet[2932]: I0912 00:36:45.049391 2932 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 12 00:36:45.050000 kubelet[2932]: I0912 00:36:45.049427 2932 server.go:1289] "Started kubelet" Sep 12 00:36:45.050744 kubelet[2932]: I0912 00:36:45.049930 2932 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 12 00:36:45.050744 kubelet[2932]: I0912 00:36:45.050435 2932 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 12 00:36:45.050744 kubelet[2932]: I0912 00:36:45.050477 2932 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Sep 12 00:36:45.052131 kubelet[2932]: I0912 00:36:45.051900 2932 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 12 00:36:45.053433 
kubelet[2932]: I0912 00:36:45.053415 2932 server.go:317] "Adding debug handlers to kubelet server" Sep 12 00:36:45.057876 kubelet[2932]: I0912 00:36:45.057849 2932 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 12 00:36:45.060862 kubelet[2932]: I0912 00:36:45.060476 2932 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 12 00:36:45.061700 kubelet[2932]: I0912 00:36:45.061685 2932 factory.go:223] Registration of the systemd container factory successfully Sep 12 00:36:45.061789 kubelet[2932]: I0912 00:36:45.061773 2932 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 12 00:36:45.063633 kubelet[2932]: I0912 00:36:45.063589 2932 factory.go:223] Registration of the containerd container factory successfully Sep 12 00:36:45.064383 kubelet[2932]: I0912 00:36:45.064373 2932 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 12 00:36:45.064667 kubelet[2932]: I0912 00:36:45.064649 2932 reconciler.go:26] "Reconciler: start to sync state" Sep 12 00:36:45.068282 kubelet[2932]: I0912 00:36:45.067658 2932 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Sep 12 00:36:45.069276 kubelet[2932]: I0912 00:36:45.069157 2932 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Sep 12 00:36:45.069276 kubelet[2932]: I0912 00:36:45.069175 2932 status_manager.go:230] "Starting to sync pod status with apiserver" Sep 12 00:36:45.069276 kubelet[2932]: I0912 00:36:45.069193 2932 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Sep 12 00:36:45.069276 kubelet[2932]: I0912 00:36:45.069197 2932 kubelet.go:2436] "Starting kubelet main sync loop" Sep 12 00:36:45.069276 kubelet[2932]: E0912 00:36:45.069223 2932 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 00:36:45.076762 kubelet[2932]: E0912 00:36:45.076730 2932 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 12 00:36:45.102785 kubelet[2932]: I0912 00:36:45.102769 2932 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 12 00:36:45.103094 kubelet[2932]: I0912 00:36:45.102875 2932 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 12 00:36:45.103094 kubelet[2932]: I0912 00:36:45.102889 2932 state_mem.go:36] "Initialized new in-memory state store" Sep 12 00:36:45.103094 kubelet[2932]: I0912 00:36:45.102970 2932 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 12 00:36:45.103094 kubelet[2932]: I0912 00:36:45.102975 2932 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 12 00:36:45.103094 kubelet[2932]: I0912 00:36:45.102987 2932 policy_none.go:49] "None policy: Start" Sep 12 00:36:45.103094 kubelet[2932]: I0912 00:36:45.102993 2932 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 12 00:36:45.103094 kubelet[2932]: I0912 00:36:45.102999 2932 state_mem.go:35] "Initializing new in-memory state store" Sep 12 00:36:45.103094 kubelet[2932]: I0912 00:36:45.103054 2932 state_mem.go:75] "Updated machine memory state" Sep 12 00:36:45.106161 kubelet[2932]: E0912 00:36:45.106149 2932 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Sep 12 00:36:45.106749 kubelet[2932]: I0912 00:36:45.106543 2932 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 00:36:45.106749 kubelet[2932]: 
I0912 00:36:45.106551 2932 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 00:36:45.106749 kubelet[2932]: I0912 00:36:45.106679 2932 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 00:36:45.107666 kubelet[2932]: E0912 00:36:45.107513 2932 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 12 00:36:45.170582 kubelet[2932]: I0912 00:36:45.170551 2932 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 12 00:36:45.170746 kubelet[2932]: I0912 00:36:45.170553 2932 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 12 00:36:45.170799 kubelet[2932]: I0912 00:36:45.170638 2932 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 12 00:36:45.211288 kubelet[2932]: I0912 00:36:45.211176 2932 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 12 00:36:45.232749 kubelet[2932]: I0912 00:36:45.232724 2932 kubelet_node_status.go:124] "Node was previously registered" node="localhost" Sep 12 00:36:45.232838 kubelet[2932]: I0912 00:36:45.232784 2932 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Sep 12 00:36:45.366567 kubelet[2932]: I0912 00:36:45.366524 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6fe031662f7a761b951a9f7b8a184f48-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"6fe031662f7a761b951a9f7b8a184f48\") " pod="kube-system/kube-apiserver-localhost" Sep 12 00:36:45.366761 kubelet[2932]: I0912 00:36:45.366667 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 00:36:45.366761 kubelet[2932]: I0912 00:36:45.366683 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 00:36:45.366761 kubelet[2932]: I0912 00:36:45.366695 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 00:36:45.366761 kubelet[2932]: I0912 00:36:45.366704 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7b968cf906b2d9d713a362c43868bef2-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"7b968cf906b2d9d713a362c43868bef2\") " pod="kube-system/kube-scheduler-localhost" Sep 12 00:36:45.366761 kubelet[2932]: I0912 00:36:45.366713 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6fe031662f7a761b951a9f7b8a184f48-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"6fe031662f7a761b951a9f7b8a184f48\") " pod="kube-system/kube-apiserver-localhost" Sep 12 00:36:45.366914 kubelet[2932]: I0912 00:36:45.366874 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: 
\"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 00:36:45.366914 kubelet[2932]: I0912 00:36:45.366887 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 00:36:45.366914 kubelet[2932]: I0912 00:36:45.366896 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6fe031662f7a761b951a9f7b8a184f48-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"6fe031662f7a761b951a9f7b8a184f48\") " pod="kube-system/kube-apiserver-localhost" Sep 12 00:36:46.037703 kubelet[2932]: I0912 00:36:46.037667 2932 apiserver.go:52] "Watching apiserver" Sep 12 00:36:46.064697 kubelet[2932]: I0912 00:36:46.064654 2932 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 12 00:36:46.097422 kubelet[2932]: I0912 00:36:46.097392 2932 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 12 00:36:46.104066 kubelet[2932]: E0912 00:36:46.103887 2932 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Sep 12 00:36:46.133065 kubelet[2932]: I0912 00:36:46.133017 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.133006154 podStartE2EDuration="1.133006154s" podCreationTimestamp="2025-09-12 00:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 00:36:46.119002106 +0000 UTC m=+1.150424324" watchObservedRunningTime="2025-09-12 00:36:46.133006154 +0000 UTC m=+1.164428373" Sep 12 00:36:46.146028 kubelet[2932]: I0912 00:36:46.145948 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.145934649 podStartE2EDuration="1.145934649s" podCreationTimestamp="2025-09-12 00:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 00:36:46.133093053 +0000 UTC m=+1.164515260" watchObservedRunningTime="2025-09-12 00:36:46.145934649 +0000 UTC m=+1.177356865" Sep 12 00:36:46.155397 kubelet[2932]: I0912 00:36:46.155117 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.155100354 podStartE2EDuration="1.155100354s" podCreationTimestamp="2025-09-12 00:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 00:36:46.146292581 +0000 UTC m=+1.177714794" watchObservedRunningTime="2025-09-12 00:36:46.155100354 +0000 UTC m=+1.186522573" Sep 12 00:36:51.160489 kubelet[2932]: I0912 00:36:51.160426 2932 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 12 00:36:51.161216 kubelet[2932]: I0912 00:36:51.160742 2932 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 12 00:36:51.161244 containerd[1615]: time="2025-09-12T00:36:51.160616587Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Sep 12 00:36:52.056829 systemd[1]: Created slice kubepods-besteffort-pod13e0d74d_3160_4c20_9d55_12ec6ee6e622.slice - libcontainer container kubepods-besteffort-pod13e0d74d_3160_4c20_9d55_12ec6ee6e622.slice. Sep 12 00:36:52.105154 kubelet[2932]: I0912 00:36:52.105084 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzxjl\" (UniqueName: \"kubernetes.io/projected/13e0d74d-3160-4c20-9d55-12ec6ee6e622-kube-api-access-vzxjl\") pod \"kube-proxy-lq64r\" (UID: \"13e0d74d-3160-4c20-9d55-12ec6ee6e622\") " pod="kube-system/kube-proxy-lq64r" Sep 12 00:36:52.105154 kubelet[2932]: I0912 00:36:52.105114 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/13e0d74d-3160-4c20-9d55-12ec6ee6e622-kube-proxy\") pod \"kube-proxy-lq64r\" (UID: \"13e0d74d-3160-4c20-9d55-12ec6ee6e622\") " pod="kube-system/kube-proxy-lq64r" Sep 12 00:36:52.105154 kubelet[2932]: I0912 00:36:52.105129 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/13e0d74d-3160-4c20-9d55-12ec6ee6e622-xtables-lock\") pod \"kube-proxy-lq64r\" (UID: \"13e0d74d-3160-4c20-9d55-12ec6ee6e622\") " pod="kube-system/kube-proxy-lq64r" Sep 12 00:36:52.105354 kubelet[2932]: I0912 00:36:52.105137 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/13e0d74d-3160-4c20-9d55-12ec6ee6e622-lib-modules\") pod \"kube-proxy-lq64r\" (UID: \"13e0d74d-3160-4c20-9d55-12ec6ee6e622\") " pod="kube-system/kube-proxy-lq64r" Sep 12 00:36:52.226632 kubelet[2932]: E0912 00:36:52.226606 2932 projected.go:289] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Sep 12 00:36:52.226632 kubelet[2932]: E0912 00:36:52.226628 2932 projected.go:194] Error 
preparing data for projected volume kube-api-access-vzxjl for pod kube-system/kube-proxy-lq64r: configmap "kube-root-ca.crt" not found Sep 12 00:36:52.226888 kubelet[2932]: E0912 00:36:52.226676 2932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/13e0d74d-3160-4c20-9d55-12ec6ee6e622-kube-api-access-vzxjl podName:13e0d74d-3160-4c20-9d55-12ec6ee6e622 nodeName:}" failed. No retries permitted until 2025-09-12 00:36:52.726657525 +0000 UTC m=+7.758079735 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-vzxjl" (UniqueName: "kubernetes.io/projected/13e0d74d-3160-4c20-9d55-12ec6ee6e622-kube-api-access-vzxjl") pod "kube-proxy-lq64r" (UID: "13e0d74d-3160-4c20-9d55-12ec6ee6e622") : configmap "kube-root-ca.crt" not found Sep 12 00:36:52.330982 systemd[1]: Created slice kubepods-besteffort-pod0a36819d_58a3_4ab2_8ea6_32382c162d83.slice - libcontainer container kubepods-besteffort-pod0a36819d_58a3_4ab2_8ea6_32382c162d83.slice. Sep 12 00:36:52.407435 kubelet[2932]: I0912 00:36:52.407377 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/0a36819d-58a3-4ab2-8ea6-32382c162d83-var-lib-calico\") pod \"tigera-operator-755d956888-dxv5r\" (UID: \"0a36819d-58a3-4ab2-8ea6-32382c162d83\") " pod="tigera-operator/tigera-operator-755d956888-dxv5r" Sep 12 00:36:52.407435 kubelet[2932]: I0912 00:36:52.407434 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmvth\" (UniqueName: \"kubernetes.io/projected/0a36819d-58a3-4ab2-8ea6-32382c162d83-kube-api-access-tmvth\") pod \"tigera-operator-755d956888-dxv5r\" (UID: \"0a36819d-58a3-4ab2-8ea6-32382c162d83\") " pod="tigera-operator/tigera-operator-755d956888-dxv5r" Sep 12 00:36:52.636165 containerd[1615]: time="2025-09-12T00:36:52.635707574Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:tigera-operator-755d956888-dxv5r,Uid:0a36819d-58a3-4ab2-8ea6-32382c162d83,Namespace:tigera-operator,Attempt:0,}" Sep 12 00:36:52.653678 containerd[1615]: time="2025-09-12T00:36:52.653640306Z" level=info msg="connecting to shim a91230e3ca1fa6c6b54ccd5638853f91d8640f2534e84bdf02144565151add9d" address="unix:///run/containerd/s/19c69d464607429f5c3b8556f1823d1e67d01964731a5ad6ba0ecde245ef1a4d" namespace=k8s.io protocol=ttrpc version=3 Sep 12 00:36:52.679432 systemd[1]: Started cri-containerd-a91230e3ca1fa6c6b54ccd5638853f91d8640f2534e84bdf02144565151add9d.scope - libcontainer container a91230e3ca1fa6c6b54ccd5638853f91d8640f2534e84bdf02144565151add9d. Sep 12 00:36:52.716233 containerd[1615]: time="2025-09-12T00:36:52.716180849Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-dxv5r,Uid:0a36819d-58a3-4ab2-8ea6-32382c162d83,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"a91230e3ca1fa6c6b54ccd5638853f91d8640f2534e84bdf02144565151add9d\"" Sep 12 00:36:52.717524 containerd[1615]: time="2025-09-12T00:36:52.717401818Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 12 00:36:52.965539 containerd[1615]: time="2025-09-12T00:36:52.965376123Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-lq64r,Uid:13e0d74d-3160-4c20-9d55-12ec6ee6e622,Namespace:kube-system,Attempt:0,}" Sep 12 00:36:52.989841 containerd[1615]: time="2025-09-12T00:36:52.989735884Z" level=info msg="connecting to shim 07fa8532fda73c062359cda01c49a989be5dbf47cfb12b3c26695c986e239ff7" address="unix:///run/containerd/s/3b7bd8f7da77cb4bf603fd2975af3e824ffd69ce968be79757acc28f9c64f0b0" namespace=k8s.io protocol=ttrpc version=3 Sep 12 00:36:53.016464 systemd[1]: Started cri-containerd-07fa8532fda73c062359cda01c49a989be5dbf47cfb12b3c26695c986e239ff7.scope - libcontainer container 07fa8532fda73c062359cda01c49a989be5dbf47cfb12b3c26695c986e239ff7. 
Sep 12 00:36:53.076750 containerd[1615]: time="2025-09-12T00:36:53.076718017Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-lq64r,Uid:13e0d74d-3160-4c20-9d55-12ec6ee6e622,Namespace:kube-system,Attempt:0,} returns sandbox id \"07fa8532fda73c062359cda01c49a989be5dbf47cfb12b3c26695c986e239ff7\"" Sep 12 00:36:53.079341 containerd[1615]: time="2025-09-12T00:36:53.079275063Z" level=info msg="CreateContainer within sandbox \"07fa8532fda73c062359cda01c49a989be5dbf47cfb12b3c26695c986e239ff7\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 12 00:36:53.085414 containerd[1615]: time="2025-09-12T00:36:53.085371157Z" level=info msg="Container a79360595a24a0e2e2f91d81bb0a6e39ed8766dd015c703af20627eba7b98303: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:36:53.089884 containerd[1615]: time="2025-09-12T00:36:53.089852468Z" level=info msg="CreateContainer within sandbox \"07fa8532fda73c062359cda01c49a989be5dbf47cfb12b3c26695c986e239ff7\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"a79360595a24a0e2e2f91d81bb0a6e39ed8766dd015c703af20627eba7b98303\"" Sep 12 00:36:53.090629 containerd[1615]: time="2025-09-12T00:36:53.090498746Z" level=info msg="StartContainer for \"a79360595a24a0e2e2f91d81bb0a6e39ed8766dd015c703af20627eba7b98303\"" Sep 12 00:36:53.094351 containerd[1615]: time="2025-09-12T00:36:53.094313334Z" level=info msg="connecting to shim a79360595a24a0e2e2f91d81bb0a6e39ed8766dd015c703af20627eba7b98303" address="unix:///run/containerd/s/3b7bd8f7da77cb4bf603fd2975af3e824ffd69ce968be79757acc28f9c64f0b0" protocol=ttrpc version=3 Sep 12 00:36:53.113595 systemd[1]: Started cri-containerd-a79360595a24a0e2e2f91d81bb0a6e39ed8766dd015c703af20627eba7b98303.scope - libcontainer container a79360595a24a0e2e2f91d81bb0a6e39ed8766dd015c703af20627eba7b98303. 
Sep 12 00:36:53.165805 containerd[1615]: time="2025-09-12T00:36:53.165731101Z" level=info msg="StartContainer for \"a79360595a24a0e2e2f91d81bb0a6e39ed8766dd015c703af20627eba7b98303\" returns successfully" Sep 12 00:36:54.642011 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1159353386.mount: Deactivated successfully. Sep 12 00:36:56.046955 containerd[1615]: time="2025-09-12T00:36:56.046918103Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:36:56.051594 containerd[1615]: time="2025-09-12T00:36:56.051572959Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609" Sep 12 00:36:56.054778 containerd[1615]: time="2025-09-12T00:36:56.054479306Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:36:56.060455 containerd[1615]: time="2025-09-12T00:36:56.060429208Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:36:56.061224 containerd[1615]: time="2025-09-12T00:36:56.061209663Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 3.343768481s" Sep 12 00:36:56.061327 containerd[1615]: time="2025-09-12T00:36:56.061317844Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\"" Sep 12 00:36:56.106111 containerd[1615]: 
time="2025-09-12T00:36:56.106076644Z" level=info msg="CreateContainer within sandbox \"a91230e3ca1fa6c6b54ccd5638853f91d8640f2534e84bdf02144565151add9d\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 12 00:36:56.165902 containerd[1615]: time="2025-09-12T00:36:56.165877555Z" level=info msg="Container b19363ce65bae44250d4020acd00d0d714e5d6b4d09f973d252eef53154a440f: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:36:56.167969 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1437084034.mount: Deactivated successfully. Sep 12 00:36:56.200681 containerd[1615]: time="2025-09-12T00:36:56.200650716Z" level=info msg="CreateContainer within sandbox \"a91230e3ca1fa6c6b54ccd5638853f91d8640f2534e84bdf02144565151add9d\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"b19363ce65bae44250d4020acd00d0d714e5d6b4d09f973d252eef53154a440f\"" Sep 12 00:36:56.203212 containerd[1615]: time="2025-09-12T00:36:56.201815674Z" level=info msg="StartContainer for \"b19363ce65bae44250d4020acd00d0d714e5d6b4d09f973d252eef53154a440f\"" Sep 12 00:36:56.204387 containerd[1615]: time="2025-09-12T00:36:56.204356743Z" level=info msg="connecting to shim b19363ce65bae44250d4020acd00d0d714e5d6b4d09f973d252eef53154a440f" address="unix:///run/containerd/s/19c69d464607429f5c3b8556f1823d1e67d01964731a5ad6ba0ecde245ef1a4d" protocol=ttrpc version=3 Sep 12 00:36:56.226389 systemd[1]: Started cri-containerd-b19363ce65bae44250d4020acd00d0d714e5d6b4d09f973d252eef53154a440f.scope - libcontainer container b19363ce65bae44250d4020acd00d0d714e5d6b4d09f973d252eef53154a440f. 
Sep 12 00:36:56.252605 containerd[1615]: time="2025-09-12T00:36:56.252584484Z" level=info msg="StartContainer for \"b19363ce65bae44250d4020acd00d0d714e5d6b4d09f973d252eef53154a440f\" returns successfully" Sep 12 00:36:57.141697 kubelet[2932]: I0912 00:36:57.141588 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-lq64r" podStartSLOduration=5.141571483 podStartE2EDuration="5.141571483s" podCreationTimestamp="2025-09-12 00:36:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 00:36:54.155889763 +0000 UTC m=+9.187311983" watchObservedRunningTime="2025-09-12 00:36:57.141571483 +0000 UTC m=+12.172993702" Sep 12 00:36:59.119950 kubelet[2932]: I0912 00:36:59.119874 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-dxv5r" podStartSLOduration=3.766944149 podStartE2EDuration="7.119269867s" podCreationTimestamp="2025-09-12 00:36:52 +0000 UTC" firstStartedPulling="2025-09-12 00:36:52.717115518 +0000 UTC m=+7.748537726" lastFinishedPulling="2025-09-12 00:36:56.069441237 +0000 UTC m=+11.100863444" observedRunningTime="2025-09-12 00:36:57.142179091 +0000 UTC m=+12.173601318" watchObservedRunningTime="2025-09-12 00:36:59.119269867 +0000 UTC m=+14.150692087" Sep 12 00:37:02.169880 sudo[1949]: pam_unix(sudo:session): session closed for user root Sep 12 00:37:02.171591 sshd[1948]: Connection closed by 139.178.68.195 port 55394 Sep 12 00:37:02.171516 sshd-session[1946]: pam_unix(sshd:session): session closed for user core Sep 12 00:37:02.176423 systemd[1]: sshd@6-139.178.70.108:22-139.178.68.195:55394.service: Deactivated successfully. Sep 12 00:37:02.179556 systemd[1]: session-9.scope: Deactivated successfully. Sep 12 00:37:02.179819 systemd[1]: session-9.scope: Consumed 3.600s CPU time, 153.4M memory peak. 
Sep 12 00:37:02.182389 systemd-logind[1587]: Session 9 logged out. Waiting for processes to exit. Sep 12 00:37:02.184373 systemd-logind[1587]: Removed session 9. Sep 12 00:37:04.621924 systemd[1]: Created slice kubepods-besteffort-podae524881_d535_4c39_836c_19ab2bd898da.slice - libcontainer container kubepods-besteffort-podae524881_d535_4c39_836c_19ab2bd898da.slice. Sep 12 00:37:04.690720 kubelet[2932]: I0912 00:37:04.690625 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae524881-d535-4c39-836c-19ab2bd898da-tigera-ca-bundle\") pod \"calico-typha-57cdbd6888-9m8lm\" (UID: \"ae524881-d535-4c39-836c-19ab2bd898da\") " pod="calico-system/calico-typha-57cdbd6888-9m8lm" Sep 12 00:37:04.690720 kubelet[2932]: I0912 00:37:04.690678 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwwbm\" (UniqueName: \"kubernetes.io/projected/ae524881-d535-4c39-836c-19ab2bd898da-kube-api-access-rwwbm\") pod \"calico-typha-57cdbd6888-9m8lm\" (UID: \"ae524881-d535-4c39-836c-19ab2bd898da\") " pod="calico-system/calico-typha-57cdbd6888-9m8lm" Sep 12 00:37:04.690720 kubelet[2932]: I0912 00:37:04.690692 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/ae524881-d535-4c39-836c-19ab2bd898da-typha-certs\") pod \"calico-typha-57cdbd6888-9m8lm\" (UID: \"ae524881-d535-4c39-836c-19ab2bd898da\") " pod="calico-system/calico-typha-57cdbd6888-9m8lm" Sep 12 00:37:04.937956 containerd[1615]: time="2025-09-12T00:37:04.937673791Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-57cdbd6888-9m8lm,Uid:ae524881-d535-4c39-836c-19ab2bd898da,Namespace:calico-system,Attempt:0,}" Sep 12 00:37:05.022187 containerd[1615]: time="2025-09-12T00:37:05.022116751Z" level=info msg="connecting to shim 
a9564cdfa497fa9ef54a3077a6c259384edc718cca94abe8cce9255262baea4e" address="unix:///run/containerd/s/7ed2facc4d6b9a75d2d5c6127b2cbb80d88680f69cbefabcd5bc96d0c09a4948" namespace=k8s.io protocol=ttrpc version=3 Sep 12 00:37:05.024118 systemd[1]: Created slice kubepods-besteffort-pod5af7aeb4_9c5a_42b0_a923_88296cb8e297.slice - libcontainer container kubepods-besteffort-pod5af7aeb4_9c5a_42b0_a923_88296cb8e297.slice. Sep 12 00:37:05.055475 systemd[1]: Started cri-containerd-a9564cdfa497fa9ef54a3077a6c259384edc718cca94abe8cce9255262baea4e.scope - libcontainer container a9564cdfa497fa9ef54a3077a6c259384edc718cca94abe8cce9255262baea4e. Sep 12 00:37:05.093956 kubelet[2932]: I0912 00:37:05.093916 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/5af7aeb4-9c5a-42b0-a923-88296cb8e297-flexvol-driver-host\") pod \"calico-node-rrxj8\" (UID: \"5af7aeb4-9c5a-42b0-a923-88296cb8e297\") " pod="calico-system/calico-node-rrxj8" Sep 12 00:37:05.094083 kubelet[2932]: I0912 00:37:05.093967 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/5af7aeb4-9c5a-42b0-a923-88296cb8e297-node-certs\") pod \"calico-node-rrxj8\" (UID: \"5af7aeb4-9c5a-42b0-a923-88296cb8e297\") " pod="calico-system/calico-node-rrxj8" Sep 12 00:37:05.094083 kubelet[2932]: I0912 00:37:05.093989 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5af7aeb4-9c5a-42b0-a923-88296cb8e297-tigera-ca-bundle\") pod \"calico-node-rrxj8\" (UID: \"5af7aeb4-9c5a-42b0-a923-88296cb8e297\") " pod="calico-system/calico-node-rrxj8" Sep 12 00:37:05.094083 kubelet[2932]: I0912 00:37:05.094003 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: 
\"kubernetes.io/host-path/5af7aeb4-9c5a-42b0-a923-88296cb8e297-var-run-calico\") pod \"calico-node-rrxj8\" (UID: \"5af7aeb4-9c5a-42b0-a923-88296cb8e297\") " pod="calico-system/calico-node-rrxj8" Sep 12 00:37:05.094477 kubelet[2932]: I0912 00:37:05.094121 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/5af7aeb4-9c5a-42b0-a923-88296cb8e297-var-lib-calico\") pod \"calico-node-rrxj8\" (UID: \"5af7aeb4-9c5a-42b0-a923-88296cb8e297\") " pod="calico-system/calico-node-rrxj8" Sep 12 00:37:05.094477 kubelet[2932]: I0912 00:37:05.094139 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/5af7aeb4-9c5a-42b0-a923-88296cb8e297-xtables-lock\") pod \"calico-node-rrxj8\" (UID: \"5af7aeb4-9c5a-42b0-a923-88296cb8e297\") " pod="calico-system/calico-node-rrxj8" Sep 12 00:37:05.094477 kubelet[2932]: I0912 00:37:05.094150 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/5af7aeb4-9c5a-42b0-a923-88296cb8e297-cni-net-dir\") pod \"calico-node-rrxj8\" (UID: \"5af7aeb4-9c5a-42b0-a923-88296cb8e297\") " pod="calico-system/calico-node-rrxj8" Sep 12 00:37:05.094477 kubelet[2932]: I0912 00:37:05.094187 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgcds\" (UniqueName: \"kubernetes.io/projected/5af7aeb4-9c5a-42b0-a923-88296cb8e297-kube-api-access-fgcds\") pod \"calico-node-rrxj8\" (UID: \"5af7aeb4-9c5a-42b0-a923-88296cb8e297\") " pod="calico-system/calico-node-rrxj8" Sep 12 00:37:05.094477 kubelet[2932]: I0912 00:37:05.094204 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/5af7aeb4-9c5a-42b0-a923-88296cb8e297-lib-modules\") pod \"calico-node-rrxj8\" (UID: \"5af7aeb4-9c5a-42b0-a923-88296cb8e297\") " pod="calico-system/calico-node-rrxj8" Sep 12 00:37:05.094619 kubelet[2932]: I0912 00:37:05.094220 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/5af7aeb4-9c5a-42b0-a923-88296cb8e297-cni-log-dir\") pod \"calico-node-rrxj8\" (UID: \"5af7aeb4-9c5a-42b0-a923-88296cb8e297\") " pod="calico-system/calico-node-rrxj8" Sep 12 00:37:05.094619 kubelet[2932]: I0912 00:37:05.094235 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/5af7aeb4-9c5a-42b0-a923-88296cb8e297-policysync\") pod \"calico-node-rrxj8\" (UID: \"5af7aeb4-9c5a-42b0-a923-88296cb8e297\") " pod="calico-system/calico-node-rrxj8" Sep 12 00:37:05.094619 kubelet[2932]: I0912 00:37:05.094260 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/5af7aeb4-9c5a-42b0-a923-88296cb8e297-cni-bin-dir\") pod \"calico-node-rrxj8\" (UID: \"5af7aeb4-9c5a-42b0-a923-88296cb8e297\") " pod="calico-system/calico-node-rrxj8" Sep 12 00:37:05.125286 containerd[1615]: time="2025-09-12T00:37:05.125213761Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-57cdbd6888-9m8lm,Uid:ae524881-d535-4c39-836c-19ab2bd898da,Namespace:calico-system,Attempt:0,} returns sandbox id \"a9564cdfa497fa9ef54a3077a6c259384edc718cca94abe8cce9255262baea4e\"" Sep 12 00:37:05.126895 containerd[1615]: time="2025-09-12T00:37:05.126868881Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 12 00:37:05.327658 containerd[1615]: time="2025-09-12T00:37:05.327635570Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-node-rrxj8,Uid:5af7aeb4-9c5a-42b0-a923-88296cb8e297,Namespace:calico-system,Attempt:0,}" Sep 12 00:37:05.370586 kubelet[2932]: E0912 00:37:05.370543 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-4lfjf" podUID="3a3a008e-e1fb-47cc-bd0a-8ff12488e165" Sep 12 00:37:05.385506 kubelet[2932]: E0912 00:37:05.385475 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.385506 kubelet[2932]: W0912 00:37:05.385491 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.413577 kubelet[2932]: E0912 00:37:05.413542 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:05.413887 kubelet[2932]: E0912 00:37:05.413851 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.413887 kubelet[2932]: W0912 00:37:05.413862 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.414007 kubelet[2932]: E0912 00:37:05.413874 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:05.414149 kubelet[2932]: E0912 00:37:05.414069 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.414149 kubelet[2932]: W0912 00:37:05.414074 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.414149 kubelet[2932]: E0912 00:37:05.414082 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:05.414349 kubelet[2932]: E0912 00:37:05.414325 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.414349 kubelet[2932]: W0912 00:37:05.414332 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.414349 kubelet[2932]: E0912 00:37:05.414337 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:05.414648 kubelet[2932]: E0912 00:37:05.414610 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.414648 kubelet[2932]: W0912 00:37:05.414618 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.414648 kubelet[2932]: E0912 00:37:05.414623 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:05.414841 kubelet[2932]: E0912 00:37:05.414811 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.414841 kubelet[2932]: W0912 00:37:05.414818 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.414841 kubelet[2932]: E0912 00:37:05.414823 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:05.415015 kubelet[2932]: E0912 00:37:05.414985 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.415015 kubelet[2932]: W0912 00:37:05.414991 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.415015 kubelet[2932]: E0912 00:37:05.414996 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:05.415242 kubelet[2932]: E0912 00:37:05.415186 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.415242 kubelet[2932]: W0912 00:37:05.415193 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.415242 kubelet[2932]: E0912 00:37:05.415198 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:05.415421 kubelet[2932]: E0912 00:37:05.415389 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.415421 kubelet[2932]: W0912 00:37:05.415396 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.415421 kubelet[2932]: E0912 00:37:05.415401 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:05.415558 kubelet[2932]: E0912 00:37:05.415551 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.415611 kubelet[2932]: W0912 00:37:05.415586 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.415611 kubelet[2932]: E0912 00:37:05.415594 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:05.415754 kubelet[2932]: E0912 00:37:05.415720 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.415754 kubelet[2932]: W0912 00:37:05.415725 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.415754 kubelet[2932]: E0912 00:37:05.415730 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:05.415894 kubelet[2932]: E0912 00:37:05.415889 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.415957 kubelet[2932]: W0912 00:37:05.415921 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.415957 kubelet[2932]: E0912 00:37:05.415928 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:05.416119 kubelet[2932]: E0912 00:37:05.416085 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.416119 kubelet[2932]: W0912 00:37:05.416092 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.416119 kubelet[2932]: E0912 00:37:05.416099 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:05.416475 kubelet[2932]: E0912 00:37:05.416427 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.416475 kubelet[2932]: W0912 00:37:05.416438 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.416475 kubelet[2932]: E0912 00:37:05.416446 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:05.416682 kubelet[2932]: E0912 00:37:05.416648 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.416682 kubelet[2932]: W0912 00:37:05.416654 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.416682 kubelet[2932]: E0912 00:37:05.416660 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:05.416857 kubelet[2932]: E0912 00:37:05.416815 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.416857 kubelet[2932]: W0912 00:37:05.416821 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.416857 kubelet[2932]: E0912 00:37:05.416826 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:05.427454 kubelet[2932]: E0912 00:37:05.417030 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.427454 kubelet[2932]: W0912 00:37:05.417035 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.427454 kubelet[2932]: E0912 00:37:05.417040 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:05.427454 kubelet[2932]: E0912 00:37:05.417148 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.427454 kubelet[2932]: W0912 00:37:05.417153 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.427454 kubelet[2932]: E0912 00:37:05.417158 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:05.427454 kubelet[2932]: E0912 00:37:05.417243 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.427454 kubelet[2932]: W0912 00:37:05.417260 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.427454 kubelet[2932]: E0912 00:37:05.417266 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:05.427454 kubelet[2932]: E0912 00:37:05.417358 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.427699 kubelet[2932]: W0912 00:37:05.417362 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.427699 kubelet[2932]: E0912 00:37:05.417367 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:05.427699 kubelet[2932]: E0912 00:37:05.417497 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.427699 kubelet[2932]: W0912 00:37:05.417501 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.427699 kubelet[2932]: E0912 00:37:05.417506 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:05.427699 kubelet[2932]: I0912 00:37:05.417523 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3a3a008e-e1fb-47cc-bd0a-8ff12488e165-registration-dir\") pod \"csi-node-driver-4lfjf\" (UID: \"3a3a008e-e1fb-47cc-bd0a-8ff12488e165\") " pod="calico-system/csi-node-driver-4lfjf" Sep 12 00:37:05.427699 kubelet[2932]: E0912 00:37:05.417604 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.427699 kubelet[2932]: W0912 00:37:05.417609 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.427699 kubelet[2932]: E0912 00:37:05.417613 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:05.427936 kubelet[2932]: I0912 00:37:05.417626 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3a3a008e-e1fb-47cc-bd0a-8ff12488e165-socket-dir\") pod \"csi-node-driver-4lfjf\" (UID: \"3a3a008e-e1fb-47cc-bd0a-8ff12488e165\") " pod="calico-system/csi-node-driver-4lfjf" Sep 12 00:37:05.427936 kubelet[2932]: E0912 00:37:05.417731 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.427936 kubelet[2932]: W0912 00:37:05.417738 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.427936 kubelet[2932]: E0912 00:37:05.417744 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:05.427936 kubelet[2932]: E0912 00:37:05.417833 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.427936 kubelet[2932]: W0912 00:37:05.417838 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.427936 kubelet[2932]: E0912 00:37:05.417843 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:05.427936 kubelet[2932]: E0912 00:37:05.417942 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.427936 kubelet[2932]: W0912 00:37:05.417947 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.428181 kubelet[2932]: E0912 00:37:05.417952 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:05.428181 kubelet[2932]: I0912 00:37:05.417966 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g7sl\" (UniqueName: \"kubernetes.io/projected/3a3a008e-e1fb-47cc-bd0a-8ff12488e165-kube-api-access-7g7sl\") pod \"csi-node-driver-4lfjf\" (UID: \"3a3a008e-e1fb-47cc-bd0a-8ff12488e165\") " pod="calico-system/csi-node-driver-4lfjf" Sep 12 00:37:05.428181 kubelet[2932]: E0912 00:37:05.418049 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.428181 kubelet[2932]: W0912 00:37:05.418055 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.428181 kubelet[2932]: E0912 00:37:05.418060 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:05.428181 kubelet[2932]: E0912 00:37:05.418286 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.428181 kubelet[2932]: W0912 00:37:05.418290 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.428181 kubelet[2932]: E0912 00:37:05.418297 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:05.428181 kubelet[2932]: E0912 00:37:05.418405 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.428446 kubelet[2932]: W0912 00:37:05.418410 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.428446 kubelet[2932]: E0912 00:37:05.418414 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:05.428446 kubelet[2932]: I0912 00:37:05.418428 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3a3a008e-e1fb-47cc-bd0a-8ff12488e165-kubelet-dir\") pod \"csi-node-driver-4lfjf\" (UID: \"3a3a008e-e1fb-47cc-bd0a-8ff12488e165\") " pod="calico-system/csi-node-driver-4lfjf" Sep 12 00:37:05.428446 kubelet[2932]: E0912 00:37:05.418519 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.428446 kubelet[2932]: W0912 00:37:05.418526 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.428446 kubelet[2932]: E0912 00:37:05.418532 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:05.428446 kubelet[2932]: E0912 00:37:05.418607 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.428446 kubelet[2932]: W0912 00:37:05.418611 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.428446 kubelet[2932]: E0912 00:37:05.418616 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:05.428681 kubelet[2932]: E0912 00:37:05.418687 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.428681 kubelet[2932]: W0912 00:37:05.418691 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.428681 kubelet[2932]: E0912 00:37:05.418696 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:05.428681 kubelet[2932]: I0912 00:37:05.418708 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/3a3a008e-e1fb-47cc-bd0a-8ff12488e165-varrun\") pod \"csi-node-driver-4lfjf\" (UID: \"3a3a008e-e1fb-47cc-bd0a-8ff12488e165\") " pod="calico-system/csi-node-driver-4lfjf" Sep 12 00:37:05.428681 kubelet[2932]: E0912 00:37:05.418877 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.428681 kubelet[2932]: W0912 00:37:05.418883 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.428681 kubelet[2932]: E0912 00:37:05.418889 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:05.428681 kubelet[2932]: E0912 00:37:05.419133 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.428681 kubelet[2932]: W0912 00:37:05.419138 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.428948 kubelet[2932]: E0912 00:37:05.419144 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:05.428948 kubelet[2932]: E0912 00:37:05.419356 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.428948 kubelet[2932]: W0912 00:37:05.419361 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.428948 kubelet[2932]: E0912 00:37:05.419366 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:05.428948 kubelet[2932]: E0912 00:37:05.419448 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.428948 kubelet[2932]: W0912 00:37:05.419452 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.428948 kubelet[2932]: E0912 00:37:05.419457 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:05.479020 containerd[1615]: time="2025-09-12T00:37:05.478954182Z" level=info msg="connecting to shim 8b0b991695e292f6f43ac6695df36c874b128ad359cd645bef9bcc28fa7456d7" address="unix:///run/containerd/s/eff254af7a4e19a209bab82ce44e48ee2125ebe104732493a462a99cec16ffca" namespace=k8s.io protocol=ttrpc version=3 Sep 12 00:37:05.509479 systemd[1]: Started cri-containerd-8b0b991695e292f6f43ac6695df36c874b128ad359cd645bef9bcc28fa7456d7.scope - libcontainer container 8b0b991695e292f6f43ac6695df36c874b128ad359cd645bef9bcc28fa7456d7. Sep 12 00:37:05.520127 kubelet[2932]: E0912 00:37:05.520065 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.520127 kubelet[2932]: W0912 00:37:05.520082 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.520127 kubelet[2932]: E0912 00:37:05.520099 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:05.520660 kubelet[2932]: E0912 00:37:05.520650 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.520805 kubelet[2932]: W0912 00:37:05.520730 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.520805 kubelet[2932]: E0912 00:37:05.520742 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:05.520959 kubelet[2932]: E0912 00:37:05.520875 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.520959 kubelet[2932]: W0912 00:37:05.520887 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.520959 kubelet[2932]: E0912 00:37:05.520895 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:05.521089 kubelet[2932]: E0912 00:37:05.521055 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.521089 kubelet[2932]: W0912 00:37:05.521063 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.521089 kubelet[2932]: E0912 00:37:05.521070 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:05.521238 kubelet[2932]: E0912 00:37:05.521201 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.521238 kubelet[2932]: W0912 00:37:05.521210 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.521238 kubelet[2932]: E0912 00:37:05.521217 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:05.521581 kubelet[2932]: E0912 00:37:05.521468 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.521581 kubelet[2932]: W0912 00:37:05.521476 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.521581 kubelet[2932]: E0912 00:37:05.521486 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:05.521667 kubelet[2932]: E0912 00:37:05.521643 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.521667 kubelet[2932]: W0912 00:37:05.521650 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.521667 kubelet[2932]: E0912 00:37:05.521657 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:05.521981 kubelet[2932]: E0912 00:37:05.521770 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.521981 kubelet[2932]: W0912 00:37:05.521778 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.521981 kubelet[2932]: E0912 00:37:05.521785 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:05.521981 kubelet[2932]: E0912 00:37:05.521929 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.521981 kubelet[2932]: W0912 00:37:05.521937 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.521981 kubelet[2932]: E0912 00:37:05.521944 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:05.522331 kubelet[2932]: E0912 00:37:05.522289 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.522331 kubelet[2932]: W0912 00:37:05.522299 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.522331 kubelet[2932]: E0912 00:37:05.522306 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:05.522527 kubelet[2932]: E0912 00:37:05.522500 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.522527 kubelet[2932]: W0912 00:37:05.522510 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.522527 kubelet[2932]: E0912 00:37:05.522519 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:05.522787 kubelet[2932]: E0912 00:37:05.522780 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.522886 kubelet[2932]: W0912 00:37:05.522830 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.522886 kubelet[2932]: E0912 00:37:05.522841 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:05.523091 kubelet[2932]: E0912 00:37:05.523049 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.523091 kubelet[2932]: W0912 00:37:05.523057 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.523091 kubelet[2932]: E0912 00:37:05.523064 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:05.523339 kubelet[2932]: E0912 00:37:05.523291 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.523339 kubelet[2932]: W0912 00:37:05.523299 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.523339 kubelet[2932]: E0912 00:37:05.523306 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:05.523477 kubelet[2932]: E0912 00:37:05.523470 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.523736 kubelet[2932]: W0912 00:37:05.523631 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.523736 kubelet[2932]: E0912 00:37:05.523641 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:05.524076 kubelet[2932]: E0912 00:37:05.524008 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.524076 kubelet[2932]: W0912 00:37:05.524017 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.524076 kubelet[2932]: E0912 00:37:05.524027 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:05.524187 kubelet[2932]: E0912 00:37:05.524180 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.524308 kubelet[2932]: W0912 00:37:05.524228 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.524308 kubelet[2932]: E0912 00:37:05.524252 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:05.524539 kubelet[2932]: E0912 00:37:05.524532 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.524627 kubelet[2932]: W0912 00:37:05.524579 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.524627 kubelet[2932]: E0912 00:37:05.524588 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:05.524897 kubelet[2932]: E0912 00:37:05.524888 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.525031 kubelet[2932]: W0912 00:37:05.524941 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.525031 kubelet[2932]: E0912 00:37:05.524952 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:05.525381 kubelet[2932]: E0912 00:37:05.525348 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.525381 kubelet[2932]: W0912 00:37:05.525357 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.525381 kubelet[2932]: E0912 00:37:05.525366 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:05.525808 kubelet[2932]: E0912 00:37:05.525773 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.525808 kubelet[2932]: W0912 00:37:05.525783 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.525808 kubelet[2932]: E0912 00:37:05.525792 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:05.527837 kubelet[2932]: E0912 00:37:05.527721 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.527837 kubelet[2932]: W0912 00:37:05.527733 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.527837 kubelet[2932]: E0912 00:37:05.527743 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:05.528153 kubelet[2932]: E0912 00:37:05.527986 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.528153 kubelet[2932]: W0912 00:37:05.527995 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.528153 kubelet[2932]: E0912 00:37:05.528002 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:05.528718 kubelet[2932]: E0912 00:37:05.528512 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.528718 kubelet[2932]: W0912 00:37:05.528523 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.528718 kubelet[2932]: E0912 00:37:05.528530 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:05.528889 kubelet[2932]: E0912 00:37:05.528856 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.528889 kubelet[2932]: W0912 00:37:05.528865 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.528889 kubelet[2932]: E0912 00:37:05.528875 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:05.537546 kubelet[2932]: E0912 00:37:05.537461 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:05.537546 kubelet[2932]: W0912 00:37:05.537506 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:05.537546 kubelet[2932]: E0912 00:37:05.537520 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:05.544169 containerd[1615]: time="2025-09-12T00:37:05.544144012Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-rrxj8,Uid:5af7aeb4-9c5a-42b0-a923-88296cb8e297,Namespace:calico-system,Attempt:0,} returns sandbox id \"8b0b991695e292f6f43ac6695df36c874b128ad359cd645bef9bcc28fa7456d7\"" Sep 12 00:37:06.809487 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3966523146.mount: Deactivated successfully. 
Sep 12 00:37:07.082155 kubelet[2932]: E0912 00:37:07.081918 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-4lfjf" podUID="3a3a008e-e1fb-47cc-bd0a-8ff12488e165" Sep 12 00:37:07.747266 containerd[1615]: time="2025-09-12T00:37:07.746981236Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:07.757649 containerd[1615]: time="2025-09-12T00:37:07.757442272Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389" Sep 12 00:37:07.763715 containerd[1615]: time="2025-09-12T00:37:07.763673078Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:07.776325 containerd[1615]: time="2025-09-12T00:37:07.776235050Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:07.776882 containerd[1615]: time="2025-09-12T00:37:07.776739741Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 2.649843664s" Sep 12 00:37:07.776882 containerd[1615]: time="2025-09-12T00:37:07.776761773Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference 
\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\"" Sep 12 00:37:07.777700 containerd[1615]: time="2025-09-12T00:37:07.777660570Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 12 00:37:07.821124 containerd[1615]: time="2025-09-12T00:37:07.821086229Z" level=info msg="CreateContainer within sandbox \"a9564cdfa497fa9ef54a3077a6c259384edc718cca94abe8cce9255262baea4e\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 12 00:37:07.881094 containerd[1615]: time="2025-09-12T00:37:07.880375853Z" level=info msg="Container 31ef1710b252576da7e56c334141aaca4e3f276d12c6a801ef395309979ec69e: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:37:07.948534 containerd[1615]: time="2025-09-12T00:37:07.948491193Z" level=info msg="CreateContainer within sandbox \"a9564cdfa497fa9ef54a3077a6c259384edc718cca94abe8cce9255262baea4e\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"31ef1710b252576da7e56c334141aaca4e3f276d12c6a801ef395309979ec69e\"" Sep 12 00:37:07.949163 containerd[1615]: time="2025-09-12T00:37:07.948884396Z" level=info msg="StartContainer for \"31ef1710b252576da7e56c334141aaca4e3f276d12c6a801ef395309979ec69e\"" Sep 12 00:37:07.950631 containerd[1615]: time="2025-09-12T00:37:07.950601128Z" level=info msg="connecting to shim 31ef1710b252576da7e56c334141aaca4e3f276d12c6a801ef395309979ec69e" address="unix:///run/containerd/s/7ed2facc4d6b9a75d2d5c6127b2cbb80d88680f69cbefabcd5bc96d0c09a4948" protocol=ttrpc version=3 Sep 12 00:37:08.075416 systemd[1]: Started cri-containerd-31ef1710b252576da7e56c334141aaca4e3f276d12c6a801ef395309979ec69e.scope - libcontainer container 31ef1710b252576da7e56c334141aaca4e3f276d12c6a801ef395309979ec69e. 
Sep 12 00:37:08.151590 containerd[1615]: time="2025-09-12T00:37:08.151555480Z" level=info msg="StartContainer for \"31ef1710b252576da7e56c334141aaca4e3f276d12c6a801ef395309979ec69e\" returns successfully" Sep 12 00:37:08.270261 kubelet[2932]: E0912 00:37:08.270219 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:08.270261 kubelet[2932]: W0912 00:37:08.270241 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:08.288331 kubelet[2932]: E0912 00:37:08.288295 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:08.288565 kubelet[2932]: E0912 00:37:08.288550 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:08.288606 kubelet[2932]: W0912 00:37:08.288564 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:08.288606 kubelet[2932]: E0912 00:37:08.288581 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:08.288701 kubelet[2932]: E0912 00:37:08.288681 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:08.288701 kubelet[2932]: W0912 00:37:08.288686 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:08.288701 kubelet[2932]: E0912 00:37:08.288691 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:08.303882 kubelet[2932]: E0912 00:37:08.303855 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:08.303882 kubelet[2932]: W0912 00:37:08.303873 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:08.304009 kubelet[2932]: E0912 00:37:08.303891 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:08.304354 kubelet[2932]: E0912 00:37:08.304338 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:08.304354 kubelet[2932]: W0912 00:37:08.304348 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:08.304354 kubelet[2932]: E0912 00:37:08.304355 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:08.304457 kubelet[2932]: E0912 00:37:08.304440 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:08.304457 kubelet[2932]: W0912 00:37:08.304445 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:08.304457 kubelet[2932]: E0912 00:37:08.304450 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:08.304552 kubelet[2932]: E0912 00:37:08.304529 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:08.304552 kubelet[2932]: W0912 00:37:08.304533 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:08.304552 kubelet[2932]: E0912 00:37:08.304538 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:08.304632 kubelet[2932]: E0912 00:37:08.304613 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:08.304632 kubelet[2932]: W0912 00:37:08.304618 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:08.304632 kubelet[2932]: E0912 00:37:08.304622 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:08.304739 kubelet[2932]: E0912 00:37:08.304728 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:08.304739 kubelet[2932]: W0912 00:37:08.304733 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:08.304739 kubelet[2932]: E0912 00:37:08.304737 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:08.305610 kubelet[2932]: E0912 00:37:08.305586 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:08.305610 kubelet[2932]: W0912 00:37:08.305605 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:08.305665 kubelet[2932]: E0912 00:37:08.305616 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:08.305742 kubelet[2932]: E0912 00:37:08.305731 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:08.305742 kubelet[2932]: W0912 00:37:08.305740 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:08.305784 kubelet[2932]: E0912 00:37:08.305746 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:08.305875 kubelet[2932]: E0912 00:37:08.305863 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:08.305875 kubelet[2932]: W0912 00:37:08.305871 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:08.305918 kubelet[2932]: E0912 00:37:08.305877 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:08.306018 kubelet[2932]: E0912 00:37:08.306004 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:08.306018 kubelet[2932]: W0912 00:37:08.306014 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:08.306109 kubelet[2932]: E0912 00:37:08.306020 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:08.306136 kubelet[2932]: E0912 00:37:08.306118 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:08.306136 kubelet[2932]: W0912 00:37:08.306123 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:08.306136 kubelet[2932]: E0912 00:37:08.306128 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:08.306222 kubelet[2932]: E0912 00:37:08.306214 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:08.306222 kubelet[2932]: W0912 00:37:08.306218 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:08.306282 kubelet[2932]: E0912 00:37:08.306225 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:08.306390 kubelet[2932]: E0912 00:37:08.306378 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:08.306390 kubelet[2932]: W0912 00:37:08.306387 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:08.319134 kubelet[2932]: E0912 00:37:08.306393 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:08.319134 kubelet[2932]: E0912 00:37:08.317022 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:08.319134 kubelet[2932]: W0912 00:37:08.317035 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:08.319134 kubelet[2932]: E0912 00:37:08.317049 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:08.319134 kubelet[2932]: E0912 00:37:08.317241 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:08.319134 kubelet[2932]: W0912 00:37:08.317260 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:08.319134 kubelet[2932]: E0912 00:37:08.317272 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:08.319134 kubelet[2932]: E0912 00:37:08.318303 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:08.319134 kubelet[2932]: W0912 00:37:08.318318 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:08.319134 kubelet[2932]: E0912 00:37:08.318335 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:08.319661 kubelet[2932]: E0912 00:37:08.318527 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:08.319661 kubelet[2932]: W0912 00:37:08.318536 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:08.319661 kubelet[2932]: E0912 00:37:08.318546 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:08.319661 kubelet[2932]: E0912 00:37:08.318691 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:08.319661 kubelet[2932]: W0912 00:37:08.318696 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:08.319661 kubelet[2932]: E0912 00:37:08.318703 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:08.319661 kubelet[2932]: E0912 00:37:08.318808 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:08.319661 kubelet[2932]: W0912 00:37:08.318813 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:08.319661 kubelet[2932]: E0912 00:37:08.318817 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:08.319661 kubelet[2932]: E0912 00:37:08.318916 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:08.319921 kubelet[2932]: W0912 00:37:08.318921 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:08.319921 kubelet[2932]: E0912 00:37:08.318926 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:08.319921 kubelet[2932]: E0912 00:37:08.319052 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:08.319921 kubelet[2932]: W0912 00:37:08.319057 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:08.319921 kubelet[2932]: E0912 00:37:08.319062 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:08.319921 kubelet[2932]: E0912 00:37:08.319213 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:08.319921 kubelet[2932]: W0912 00:37:08.319219 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:08.319921 kubelet[2932]: E0912 00:37:08.319224 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:08.319921 kubelet[2932]: E0912 00:37:08.319423 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:08.319921 kubelet[2932]: W0912 00:37:08.319430 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:08.320136 kubelet[2932]: E0912 00:37:08.319436 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:08.320136 kubelet[2932]: E0912 00:37:08.319578 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:08.320136 kubelet[2932]: W0912 00:37:08.319583 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:08.320136 kubelet[2932]: E0912 00:37:08.319589 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:08.320136 kubelet[2932]: E0912 00:37:08.319746 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:08.320136 kubelet[2932]: W0912 00:37:08.319752 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:08.320136 kubelet[2932]: E0912 00:37:08.319758 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:08.320136 kubelet[2932]: E0912 00:37:08.319930 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:08.320136 kubelet[2932]: W0912 00:37:08.319937 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:08.320136 kubelet[2932]: E0912 00:37:08.319945 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:08.321452 kubelet[2932]: E0912 00:37:08.320292 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:08.321452 kubelet[2932]: W0912 00:37:08.320298 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:08.321452 kubelet[2932]: E0912 00:37:08.320305 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:08.321452 kubelet[2932]: E0912 00:37:08.320433 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:08.321452 kubelet[2932]: W0912 00:37:08.320440 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:08.321452 kubelet[2932]: E0912 00:37:08.320446 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:08.321452 kubelet[2932]: E0912 00:37:08.321389 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:08.321452 kubelet[2932]: W0912 00:37:08.321398 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:08.321452 kubelet[2932]: E0912 00:37:08.321410 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:08.322107 kubelet[2932]: E0912 00:37:08.321766 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:08.322107 kubelet[2932]: W0912 00:37:08.321776 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:08.322107 kubelet[2932]: E0912 00:37:08.321786 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:09.071189 kubelet[2932]: E0912 00:37:09.071135 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-4lfjf" podUID="3a3a008e-e1fb-47cc-bd0a-8ff12488e165" Sep 12 00:37:09.185749 kubelet[2932]: I0912 00:37:09.185315 2932 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 00:37:09.210800 kubelet[2932]: E0912 00:37:09.210765 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:09.210800 kubelet[2932]: W0912 00:37:09.210801 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:09.210938 kubelet[2932]: E0912 00:37:09.210819 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:09.211032 kubelet[2932]: E0912 00:37:09.210999 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:09.211032 kubelet[2932]: W0912 00:37:09.211030 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:09.211088 kubelet[2932]: E0912 00:37:09.211038 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:09.211202 kubelet[2932]: E0912 00:37:09.211186 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:09.211234 kubelet[2932]: W0912 00:37:09.211205 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:09.211234 kubelet[2932]: E0912 00:37:09.211214 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:09.211414 kubelet[2932]: E0912 00:37:09.211401 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:09.211451 kubelet[2932]: W0912 00:37:09.211424 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:09.211476 kubelet[2932]: E0912 00:37:09.211461 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:37:09.211747 kubelet[2932]: E0912 00:37:09.211736 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:37:09.211747 kubelet[2932]: W0912 00:37:09.211743 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:37:09.211817 kubelet[2932]: E0912 00:37:09.211751 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:37:09.524268 containerd[1615]: time="2025-09-12T00:37:09.523943158Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:09.529424 containerd[1615]: time="2025-09-12T00:37:09.529360516Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 12 00:37:09.534428 containerd[1615]: time="2025-09-12T00:37:09.534400089Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:09.537633 containerd[1615]: time="2025-09-12T00:37:09.537592154Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:09.538120 containerd[1615]: time="2025-09-12T00:37:09.537958918Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.760280884s" Sep 12 00:37:09.538120 containerd[1615]: time="2025-09-12T00:37:09.537981631Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 12 00:37:09.542592 containerd[1615]: time="2025-09-12T00:37:09.542559236Z" level=info msg="CreateContainer within sandbox \"8b0b991695e292f6f43ac6695df36c874b128ad359cd645bef9bcc28fa7456d7\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 12 00:37:09.554007 containerd[1615]: time="2025-09-12T00:37:09.550283745Z" level=info msg="Container ae4e111829cef9f5368f890c1b8aeb9e824c9f69838275994ec9b09c2d182e83: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:37:09.554293 containerd[1615]: time="2025-09-12T00:37:09.554271814Z" level=info msg="CreateContainer within sandbox \"8b0b991695e292f6f43ac6695df36c874b128ad359cd645bef9bcc28fa7456d7\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"ae4e111829cef9f5368f890c1b8aeb9e824c9f69838275994ec9b09c2d182e83\"" Sep 12 00:37:09.555904 containerd[1615]: time="2025-09-12T00:37:09.554683459Z" level=info msg="StartContainer for \"ae4e111829cef9f5368f890c1b8aeb9e824c9f69838275994ec9b09c2d182e83\"" Sep 12 00:37:09.557076 containerd[1615]: time="2025-09-12T00:37:09.557043283Z" level=info msg="connecting to shim ae4e111829cef9f5368f890c1b8aeb9e824c9f69838275994ec9b09c2d182e83" address="unix:///run/containerd/s/eff254af7a4e19a209bab82ce44e48ee2125ebe104732493a462a99cec16ffca" protocol=ttrpc version=3 Sep 12 00:37:09.581434 systemd[1]: Started cri-containerd-ae4e111829cef9f5368f890c1b8aeb9e824c9f69838275994ec9b09c2d182e83.scope - libcontainer container ae4e111829cef9f5368f890c1b8aeb9e824c9f69838275994ec9b09c2d182e83. Sep 12 00:37:09.663182 systemd[1]: cri-containerd-ae4e111829cef9f5368f890c1b8aeb9e824c9f69838275994ec9b09c2d182e83.scope: Deactivated successfully. 
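The repeated kubelet errors above come from the FlexVolume plugin prober: the driver binary at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds does not exist yet (the flexvol-driver init container that installs it is only being created in the containerd lines that follow), so the `init` call produces no output, and decoding that empty output as JSON is what yields "unexpected end of JSON input". A minimal sketch (the helper name is ours, not kubelet code) reproducing that exact Go stdlib error:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// unmarshalDriverOutput mimics what the kubelet does with a FlexVolume
// driver's stdout: decode it as JSON. When the driver binary was never
// found, nothing was printed, and decoding the empty string fails with
// encoding/json's "unexpected end of JSON input".
func unmarshalDriverOutput(out string) error {
	var status map[string]interface{}
	return json.Unmarshal([]byte(out), &status)
}

func main() {
	fmt.Println(unmarshalDriverOutput("")) // unexpected end of JSON input
}
```

Note how the W/E pairs in the log fit together: the W line ("executable file not found in $PATH") is the root cause, and the E line about JSON is only its downstream symptom, so the errors stop once the driver is installed.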
Sep 12 00:37:09.676431 containerd[1615]: time="2025-09-12T00:37:09.676400515Z" level=info msg="StartContainer for \"ae4e111829cef9f5368f890c1b8aeb9e824c9f69838275994ec9b09c2d182e83\" returns successfully" Sep 12 00:37:09.686736 containerd[1615]: time="2025-09-12T00:37:09.686018034Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ae4e111829cef9f5368f890c1b8aeb9e824c9f69838275994ec9b09c2d182e83\" id:\"ae4e111829cef9f5368f890c1b8aeb9e824c9f69838275994ec9b09c2d182e83\" pid:3635 exited_at:{seconds:1757637429 nanos:666774161}" Sep 12 00:37:09.704054 containerd[1615]: time="2025-09-12T00:37:09.704027301Z" level=info msg="received exit event container_id:\"ae4e111829cef9f5368f890c1b8aeb9e824c9f69838275994ec9b09c2d182e83\" id:\"ae4e111829cef9f5368f890c1b8aeb9e824c9f69838275994ec9b09c2d182e83\" pid:3635 exited_at:{seconds:1757637429 nanos:666774161}" Sep 12 00:37:09.729930 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ae4e111829cef9f5368f890c1b8aeb9e824c9f69838275994ec9b09c2d182e83-rootfs.mount: Deactivated successfully. 
Sep 12 00:37:10.196977 kubelet[2932]: I0912 00:37:10.196773 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-57cdbd6888-9m8lm" podStartSLOduration=3.545716081 podStartE2EDuration="6.19675723s" podCreationTimestamp="2025-09-12 00:37:04 +0000 UTC" firstStartedPulling="2025-09-12 00:37:05.126514031 +0000 UTC m=+20.157936244" lastFinishedPulling="2025-09-12 00:37:07.777555186 +0000 UTC m=+22.808977393" observedRunningTime="2025-09-12 00:37:08.216699752 +0000 UTC m=+23.248121973" watchObservedRunningTime="2025-09-12 00:37:10.19675723 +0000 UTC m=+25.228179449" Sep 12 00:37:11.071269 kubelet[2932]: E0912 00:37:11.071080 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-4lfjf" podUID="3a3a008e-e1fb-47cc-bd0a-8ff12488e165" Sep 12 00:37:11.183691 containerd[1615]: time="2025-09-12T00:37:11.183310676Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 12 00:37:13.071424 kubelet[2932]: E0912 00:37:13.070527 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-4lfjf" podUID="3a3a008e-e1fb-47cc-bd0a-8ff12488e165" Sep 12 00:37:14.756084 containerd[1615]: time="2025-09-12T00:37:14.755501042Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:14.756514 containerd[1615]: time="2025-09-12T00:37:14.756500593Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 12 00:37:14.757156 containerd[1615]: 
time="2025-09-12T00:37:14.757139230Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:14.758429 containerd[1615]: time="2025-09-12T00:37:14.758408615Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:14.758996 containerd[1615]: time="2025-09-12T00:37:14.758979099Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 3.57557442s" Sep 12 00:37:14.759062 containerd[1615]: time="2025-09-12T00:37:14.759051901Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 12 00:37:14.777873 containerd[1615]: time="2025-09-12T00:37:14.777823392Z" level=info msg="CreateContainer within sandbox \"8b0b991695e292f6f43ac6695df36c874b128ad359cd645bef9bcc28fa7456d7\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 12 00:37:14.792094 containerd[1615]: time="2025-09-12T00:37:14.791393749Z" level=info msg="Container 00a97ffbe86b12cd1ab9be90fc53f7518c7f6c03e9e83074378c5b556cb9f250: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:37:14.797702 containerd[1615]: time="2025-09-12T00:37:14.797680737Z" level=info msg="CreateContainer within sandbox \"8b0b991695e292f6f43ac6695df36c874b128ad359cd645bef9bcc28fa7456d7\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"00a97ffbe86b12cd1ab9be90fc53f7518c7f6c03e9e83074378c5b556cb9f250\"" Sep 12 
00:37:14.798324 containerd[1615]: time="2025-09-12T00:37:14.798274152Z" level=info msg="StartContainer for \"00a97ffbe86b12cd1ab9be90fc53f7518c7f6c03e9e83074378c5b556cb9f250\"" Sep 12 00:37:14.799870 containerd[1615]: time="2025-09-12T00:37:14.799844239Z" level=info msg="connecting to shim 00a97ffbe86b12cd1ab9be90fc53f7518c7f6c03e9e83074378c5b556cb9f250" address="unix:///run/containerd/s/eff254af7a4e19a209bab82ce44e48ee2125ebe104732493a462a99cec16ffca" protocol=ttrpc version=3 Sep 12 00:37:14.817382 systemd[1]: Started cri-containerd-00a97ffbe86b12cd1ab9be90fc53f7518c7f6c03e9e83074378c5b556cb9f250.scope - libcontainer container 00a97ffbe86b12cd1ab9be90fc53f7518c7f6c03e9e83074378c5b556cb9f250. Sep 12 00:37:14.864826 containerd[1615]: time="2025-09-12T00:37:14.864715130Z" level=info msg="StartContainer for \"00a97ffbe86b12cd1ab9be90fc53f7518c7f6c03e9e83074378c5b556cb9f250\" returns successfully" Sep 12 00:37:15.089025 kubelet[2932]: E0912 00:37:15.088709 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-4lfjf" podUID="3a3a008e-e1fb-47cc-bd0a-8ff12488e165" Sep 12 00:37:16.821501 systemd[1]: cri-containerd-00a97ffbe86b12cd1ab9be90fc53f7518c7f6c03e9e83074378c5b556cb9f250.scope: Deactivated successfully. Sep 12 00:37:16.822312 systemd[1]: cri-containerd-00a97ffbe86b12cd1ab9be90fc53f7518c7f6c03e9e83074378c5b556cb9f250.scope: Consumed 327ms CPU time, 162.8M memory peak, 2M read from disk, 171.3M written to disk. 
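The pod_startup_latency_tracker line earlier reports pull intervals derived from the logged timestamps (firstStartedPulling/lastFinishedPulling). A hedged sketch of recomputing such an interval; the helper name is ours, the layout string is Go's reference time matching the kubelet's timestamp format, and the two values are copied verbatim from that log line:

```go
package main

import (
	"fmt"
	"time"
)

// pullLayout is Go's reference-time layout matching timestamps like
// "2025-09-12 00:37:05.126514031 +0000 UTC" as logged by the kubelet.
const pullLayout = "2006-01-02 15:04:05.999999999 -0700 MST"

// pullDuration recomputes an image-pull interval from two logged
// timestamps (illustrative helper; the kubelet tracks this internally).
func pullDuration(started, finished string) (time.Duration, error) {
	t1, err := time.Parse(pullLayout, started)
	if err != nil {
		return 0, err
	}
	t2, err := time.Parse(pullLayout, finished)
	if err != nil {
		return 0, err
	}
	return t2.Sub(t1), nil
}

func main() {
	// Values from the calico-typha pod_startup_latency_tracker line.
	d, err := pullDuration(
		"2025-09-12 00:37:05.126514031 +0000 UTC",
		"2025-09-12 00:37:07.777555186 +0000 UTC",
	)
	if err != nil {
		panic(err)
	}
	fmt.Println(d) // 2.651041155s
}
```

This also explains the gap between podStartSLOduration and podStartE2EDuration in that line: the SLO figure excludes the image-pull interval computed here.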
Sep 12 00:37:16.907435 containerd[1615]: time="2025-09-12T00:37:16.907214827Z" level=info msg="TaskExit event in podsandbox handler container_id:\"00a97ffbe86b12cd1ab9be90fc53f7518c7f6c03e9e83074378c5b556cb9f250\" id:\"00a97ffbe86b12cd1ab9be90fc53f7518c7f6c03e9e83074378c5b556cb9f250\" pid:3692 exited_at:{seconds:1757637436 nanos:901575401}" Sep 12 00:37:16.909292 kubelet[2932]: I0912 00:37:16.908392 2932 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 12 00:37:16.913716 containerd[1615]: time="2025-09-12T00:37:16.913610134Z" level=info msg="received exit event container_id:\"00a97ffbe86b12cd1ab9be90fc53f7518c7f6c03e9e83074378c5b556cb9f250\" id:\"00a97ffbe86b12cd1ab9be90fc53f7518c7f6c03e9e83074378c5b556cb9f250\" pid:3692 exited_at:{seconds:1757637436 nanos:901575401}" Sep 12 00:37:16.945137 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-00a97ffbe86b12cd1ab9be90fc53f7518c7f6c03e9e83074378c5b556cb9f250-rootfs.mount: Deactivated successfully. Sep 12 00:37:17.061979 systemd[1]: Created slice kubepods-burstable-poda27ee4d6_8a7c_4ee2_bfda_cc534bfa5d58.slice - libcontainer container kubepods-burstable-poda27ee4d6_8a7c_4ee2_bfda_cc534bfa5d58.slice. Sep 12 00:37:17.110088 systemd[1]: Created slice kubepods-besteffort-pod4ced77e4_35b4_46cb_87ba_9ce55f472d52.slice - libcontainer container kubepods-besteffort-pod4ced77e4_35b4_46cb_87ba_9ce55f472d52.slice. Sep 12 00:37:17.116475 systemd[1]: Created slice kubepods-burstable-pod72ef27f4_5e10_4386_b23d_7206e0e9d085.slice - libcontainer container kubepods-burstable-pod72ef27f4_5e10_4386_b23d_7206e0e9d085.slice. 
Sep 12 00:37:17.138849 containerd[1615]: time="2025-09-12T00:37:17.126531226Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-4lfjf,Uid:3a3a008e-e1fb-47cc-bd0a-8ff12488e165,Namespace:calico-system,Attempt:0,}"
Sep 12 00:37:17.120452 systemd[1]: Created slice kubepods-besteffort-pod3a3a008e_e1fb_47cc_bd0a_8ff12488e165.slice - libcontainer container kubepods-besteffort-pod3a3a008e_e1fb_47cc_bd0a_8ff12488e165.slice.
Sep 12 00:37:17.133213 systemd[1]: Created slice kubepods-besteffort-pod79ef442e_2a69_41e8_b511_65438149e24e.slice - libcontainer container kubepods-besteffort-pod79ef442e_2a69_41e8_b511_65438149e24e.slice.
Sep 12 00:37:17.218028 kubelet[2932]: I0912 00:37:17.201968 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/4ced77e4-35b4-46cb-87ba-9ce55f472d52-whisker-backend-key-pair\") pod \"whisker-5f865bb855-x8l6w\" (UID: \"4ced77e4-35b4-46cb-87ba-9ce55f472d52\") " pod="calico-system/whisker-5f865bb855-x8l6w"
Sep 12 00:37:17.218028 kubelet[2932]: I0912 00:37:17.202009 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ced77e4-35b4-46cb-87ba-9ce55f472d52-whisker-ca-bundle\") pod \"whisker-5f865bb855-x8l6w\" (UID: \"4ced77e4-35b4-46cb-87ba-9ce55f472d52\") " pod="calico-system/whisker-5f865bb855-x8l6w"
Sep 12 00:37:17.218028 kubelet[2932]: I0912 00:37:17.202020 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/72ef27f4-5e10-4386-b23d-7206e0e9d085-config-volume\") pod \"coredns-674b8bbfcf-tkh46\" (UID: \"72ef27f4-5e10-4386-b23d-7206e0e9d085\") " pod="kube-system/coredns-674b8bbfcf-tkh46"
Sep 12 00:37:17.218028 kubelet[2932]: I0912 00:37:17.202031 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9cns\" (UniqueName: \"kubernetes.io/projected/79ef442e-2a69-41e8-b511-65438149e24e-kube-api-access-d9cns\") pod \"calico-kube-controllers-7585d978b7-fk9q8\" (UID: \"79ef442e-2a69-41e8-b511-65438149e24e\") " pod="calico-system/calico-kube-controllers-7585d978b7-fk9q8"
Sep 12 00:37:17.218028 kubelet[2932]: I0912 00:37:17.202043 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w8th\" (UniqueName: \"kubernetes.io/projected/a27ee4d6-8a7c-4ee2-bfda-cc534bfa5d58-kube-api-access-4w8th\") pod \"coredns-674b8bbfcf-mbnfj\" (UID: \"a27ee4d6-8a7c-4ee2-bfda-cc534bfa5d58\") " pod="kube-system/coredns-674b8bbfcf-mbnfj"
Sep 12 00:37:17.223027 kubelet[2932]: I0912 00:37:17.202052 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hc2d\" (UniqueName: \"kubernetes.io/projected/4ced77e4-35b4-46cb-87ba-9ce55f472d52-kube-api-access-8hc2d\") pod \"whisker-5f865bb855-x8l6w\" (UID: \"4ced77e4-35b4-46cb-87ba-9ce55f472d52\") " pod="calico-system/whisker-5f865bb855-x8l6w"
Sep 12 00:37:17.223027 kubelet[2932]: I0912 00:37:17.202060 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2jgm\" (UniqueName: \"kubernetes.io/projected/72ef27f4-5e10-4386-b23d-7206e0e9d085-kube-api-access-k2jgm\") pod \"coredns-674b8bbfcf-tkh46\" (UID: \"72ef27f4-5e10-4386-b23d-7206e0e9d085\") " pod="kube-system/coredns-674b8bbfcf-tkh46"
Sep 12 00:37:17.223027 kubelet[2932]: I0912 00:37:17.202070 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a27ee4d6-8a7c-4ee2-bfda-cc534bfa5d58-config-volume\") pod \"coredns-674b8bbfcf-mbnfj\" (UID: \"a27ee4d6-8a7c-4ee2-bfda-cc534bfa5d58\") " pod="kube-system/coredns-674b8bbfcf-mbnfj"
Sep 12 00:37:17.223027 kubelet[2932]: I0912 00:37:17.202080 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79ef442e-2a69-41e8-b511-65438149e24e-tigera-ca-bundle\") pod \"calico-kube-controllers-7585d978b7-fk9q8\" (UID: \"79ef442e-2a69-41e8-b511-65438149e24e\") " pod="calico-system/calico-kube-controllers-7585d978b7-fk9q8"
Sep 12 00:37:17.222815 systemd[1]: Created slice kubepods-besteffort-poddc0e806a_7a27_438c_b07b_5454371c66d4.slice - libcontainer container kubepods-besteffort-poddc0e806a_7a27_438c_b07b_5454371c66d4.slice.
Sep 12 00:37:17.250449 systemd[1]: Created slice kubepods-besteffort-pod9e9d3bf8_503a_4ac7_8753_ceaf23928a49.slice - libcontainer container kubepods-besteffort-pod9e9d3bf8_503a_4ac7_8753_ceaf23928a49.slice.
Sep 12 00:37:17.256602 systemd[1]: Created slice kubepods-besteffort-pod738d0c22_ee9e_4e56_9fce_df1a40eb2b00.slice - libcontainer container kubepods-besteffort-pod738d0c22_ee9e_4e56_9fce_df1a40eb2b00.slice.
Sep 12 00:37:17.260752 systemd[1]: Created slice kubepods-besteffort-pod5cff39d8_e5be_4f52_9e94_587e9399478a.slice - libcontainer container kubepods-besteffort-pod5cff39d8_e5be_4f52_9e94_587e9399478a.slice.
Sep 12 00:37:17.302453 kubelet[2932]: I0912 00:37:17.302404 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvc4w\" (UniqueName: \"kubernetes.io/projected/9e9d3bf8-503a-4ac7-8753-ceaf23928a49-kube-api-access-mvc4w\") pod \"calico-apiserver-8665c66cb-hjgv7\" (UID: \"9e9d3bf8-503a-4ac7-8753-ceaf23928a49\") " pod="calico-apiserver/calico-apiserver-8665c66cb-hjgv7"
Sep 12 00:37:17.302615 kubelet[2932]: I0912 00:37:17.302470 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/9e9d3bf8-503a-4ac7-8753-ceaf23928a49-calico-apiserver-certs\") pod \"calico-apiserver-8665c66cb-hjgv7\" (UID: \"9e9d3bf8-503a-4ac7-8753-ceaf23928a49\") " pod="calico-apiserver/calico-apiserver-8665c66cb-hjgv7"
Sep 12 00:37:17.302615 kubelet[2932]: I0912 00:37:17.302490 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/dc0e806a-7a27-438c-b07b-5454371c66d4-calico-apiserver-certs\") pod \"calico-apiserver-797c9d4dc7-nwcb6\" (UID: \"dc0e806a-7a27-438c-b07b-5454371c66d4\") " pod="calico-apiserver/calico-apiserver-797c9d4dc7-nwcb6"
Sep 12 00:37:17.302615 kubelet[2932]: I0912 00:37:17.302530 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/5cff39d8-e5be-4f52-9e94-587e9399478a-calico-apiserver-certs\") pod \"calico-apiserver-797c9d4dc7-9jk4g\" (UID: \"5cff39d8-e5be-4f52-9e94-587e9399478a\") " pod="calico-apiserver/calico-apiserver-797c9d4dc7-9jk4g"
Sep 12 00:37:17.302615 kubelet[2932]: I0912 00:37:17.302548 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/738d0c22-ee9e-4e56-9fce-df1a40eb2b00-config\") pod \"goldmane-54d579b49d-bjqdk\" (UID: \"738d0c22-ee9e-4e56-9fce-df1a40eb2b00\") " pod="calico-system/goldmane-54d579b49d-bjqdk"
Sep 12 00:37:17.302615 kubelet[2932]: I0912 00:37:17.302564 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/738d0c22-ee9e-4e56-9fce-df1a40eb2b00-goldmane-key-pair\") pod \"goldmane-54d579b49d-bjqdk\" (UID: \"738d0c22-ee9e-4e56-9fce-df1a40eb2b00\") " pod="calico-system/goldmane-54d579b49d-bjqdk"
Sep 12 00:37:17.302742 kubelet[2932]: I0912 00:37:17.302606 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czglw\" (UniqueName: \"kubernetes.io/projected/dc0e806a-7a27-438c-b07b-5454371c66d4-kube-api-access-czglw\") pod \"calico-apiserver-797c9d4dc7-nwcb6\" (UID: \"dc0e806a-7a27-438c-b07b-5454371c66d4\") " pod="calico-apiserver/calico-apiserver-797c9d4dc7-nwcb6"
Sep 12 00:37:17.302742 kubelet[2932]: I0912 00:37:17.302633 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/738d0c22-ee9e-4e56-9fce-df1a40eb2b00-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-bjqdk\" (UID: \"738d0c22-ee9e-4e56-9fce-df1a40eb2b00\") " pod="calico-system/goldmane-54d579b49d-bjqdk"
Sep 12 00:37:17.302742 kubelet[2932]: I0912 00:37:17.302662 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wzjl\" (UniqueName: \"kubernetes.io/projected/738d0c22-ee9e-4e56-9fce-df1a40eb2b00-kube-api-access-6wzjl\") pod \"goldmane-54d579b49d-bjqdk\" (UID: \"738d0c22-ee9e-4e56-9fce-df1a40eb2b00\") " pod="calico-system/goldmane-54d579b49d-bjqdk"
Sep 12 00:37:17.302742 kubelet[2932]: I0912 00:37:17.302687 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdx29\" (UniqueName: \"kubernetes.io/projected/5cff39d8-e5be-4f52-9e94-587e9399478a-kube-api-access-zdx29\") pod \"calico-apiserver-797c9d4dc7-9jk4g\" (UID: \"5cff39d8-e5be-4f52-9e94-587e9399478a\") " pod="calico-apiserver/calico-apiserver-797c9d4dc7-9jk4g"
Sep 12 00:37:17.373972 containerd[1615]: time="2025-09-12T00:37:17.373912373Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-mbnfj,Uid:a27ee4d6-8a7c-4ee2-bfda-cc534bfa5d58,Namespace:kube-system,Attempt:0,}"
Sep 12 00:37:17.415966 containerd[1615]: time="2025-09-12T00:37:17.415918715Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5f865bb855-x8l6w,Uid:4ced77e4-35b4-46cb-87ba-9ce55f472d52,Namespace:calico-system,Attempt:0,}"
Sep 12 00:37:17.425173 containerd[1615]: time="2025-09-12T00:37:17.425146251Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-tkh46,Uid:72ef27f4-5e10-4386-b23d-7206e0e9d085,Namespace:kube-system,Attempt:0,}"
Sep 12 00:37:17.437127 containerd[1615]: time="2025-09-12T00:37:17.437096728Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7585d978b7-fk9q8,Uid:79ef442e-2a69-41e8-b511-65438149e24e,Namespace:calico-system,Attempt:0,}"
Sep 12 00:37:17.447533 containerd[1615]: time="2025-09-12T00:37:17.447474453Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\""
Sep 12 00:37:17.527990 containerd[1615]: time="2025-09-12T00:37:17.527937233Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-797c9d4dc7-nwcb6,Uid:dc0e806a-7a27-438c-b07b-5454371c66d4,Namespace:calico-apiserver,Attempt:0,}"
Sep 12 00:37:17.556915 containerd[1615]: time="2025-09-12T00:37:17.556833982Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8665c66cb-hjgv7,Uid:9e9d3bf8-503a-4ac7-8753-ceaf23928a49,Namespace:calico-apiserver,Attempt:0,}"
Sep 12 00:37:17.559109 containerd[1615]: time="2025-09-12T00:37:17.559091921Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-bjqdk,Uid:738d0c22-ee9e-4e56-9fce-df1a40eb2b00,Namespace:calico-system,Attempt:0,}"
Sep 12 00:37:17.570426 containerd[1615]: time="2025-09-12T00:37:17.570374246Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-797c9d4dc7-9jk4g,Uid:5cff39d8-e5be-4f52-9e94-587e9399478a,Namespace:calico-apiserver,Attempt:0,}"
Sep 12 00:37:17.692583 containerd[1615]: time="2025-09-12T00:37:17.692410482Z" level=error msg="Failed to destroy network for sandbox \"3e72144e565f6fa40a5c1fa7273bc16edb17392c14b7042658a63fbf9444c327\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 00:37:17.693598 containerd[1615]: time="2025-09-12T00:37:17.693575378Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-797c9d4dc7-nwcb6,Uid:dc0e806a-7a27-438c-b07b-5454371c66d4,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3e72144e565f6fa40a5c1fa7273bc16edb17392c14b7042658a63fbf9444c327\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 00:37:17.693865 kubelet[2932]: E0912 00:37:17.693838 2932 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3e72144e565f6fa40a5c1fa7273bc16edb17392c14b7042658a63fbf9444c327\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 00:37:17.693917 kubelet[2932]: E0912 00:37:17.693885 2932 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3e72144e565f6fa40a5c1fa7273bc16edb17392c14b7042658a63fbf9444c327\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-797c9d4dc7-nwcb6"
Sep 12 00:37:17.693917 kubelet[2932]: E0912 00:37:17.693903 2932 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3e72144e565f6fa40a5c1fa7273bc16edb17392c14b7042658a63fbf9444c327\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-797c9d4dc7-nwcb6"
Sep 12 00:37:17.693965 kubelet[2932]: E0912 00:37:17.693947 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-797c9d4dc7-nwcb6_calico-apiserver(dc0e806a-7a27-438c-b07b-5454371c66d4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-797c9d4dc7-nwcb6_calico-apiserver(dc0e806a-7a27-438c-b07b-5454371c66d4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3e72144e565f6fa40a5c1fa7273bc16edb17392c14b7042658a63fbf9444c327\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-797c9d4dc7-nwcb6" podUID="dc0e806a-7a27-438c-b07b-5454371c66d4"
Sep 12 00:37:17.707324 containerd[1615]: time="2025-09-12T00:37:17.707229584Z" level=error msg="Failed to destroy network for sandbox \"3494f12be8e43b0070b3370a6898bd5485a53e2f7838280ba9e5ddc1729b6542\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 00:37:17.708781 containerd[1615]: time="2025-09-12T00:37:17.708571909Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-bjqdk,Uid:738d0c22-ee9e-4e56-9fce-df1a40eb2b00,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3494f12be8e43b0070b3370a6898bd5485a53e2f7838280ba9e5ddc1729b6542\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 00:37:17.708904 kubelet[2932]: E0912 00:37:17.708716 2932 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3494f12be8e43b0070b3370a6898bd5485a53e2f7838280ba9e5ddc1729b6542\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 00:37:17.708904 kubelet[2932]: E0912 00:37:17.708756 2932 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3494f12be8e43b0070b3370a6898bd5485a53e2f7838280ba9e5ddc1729b6542\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-bjqdk"
Sep 12 00:37:17.708904 kubelet[2932]: E0912 00:37:17.708770 2932 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3494f12be8e43b0070b3370a6898bd5485a53e2f7838280ba9e5ddc1729b6542\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-bjqdk"
Sep 12 00:37:17.709187 kubelet[2932]: E0912 00:37:17.708842 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-bjqdk_calico-system(738d0c22-ee9e-4e56-9fce-df1a40eb2b00)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-bjqdk_calico-system(738d0c22-ee9e-4e56-9fce-df1a40eb2b00)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3494f12be8e43b0070b3370a6898bd5485a53e2f7838280ba9e5ddc1729b6542\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-bjqdk" podUID="738d0c22-ee9e-4e56-9fce-df1a40eb2b00"
Sep 12 00:37:17.729294 containerd[1615]: time="2025-09-12T00:37:17.729198582Z" level=error msg="Failed to destroy network for sandbox \"291314a8e1d2722dd26fa0c5c676009531c7ed5af0837cfe0b48bda03496e07e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 00:37:17.733665 containerd[1615]: time="2025-09-12T00:37:17.733581944Z" level=error msg="Failed to destroy network for sandbox \"75b2e64e5af7f8cf6b8a5c84c483abb36b44a8d6c86e3137aee019eb0140e47e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 00:37:17.738050 containerd[1615]: time="2025-09-12T00:37:17.737884721Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-4lfjf,Uid:3a3a008e-e1fb-47cc-bd0a-8ff12488e165,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"291314a8e1d2722dd26fa0c5c676009531c7ed5af0837cfe0b48bda03496e07e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 00:37:17.738990 kubelet[2932]: E0912 00:37:17.738270 2932 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"291314a8e1d2722dd26fa0c5c676009531c7ed5af0837cfe0b48bda03496e07e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 00:37:17.738990 kubelet[2932]: E0912 00:37:17.738312 2932 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"291314a8e1d2722dd26fa0c5c676009531c7ed5af0837cfe0b48bda03496e07e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-4lfjf"
Sep 12 00:37:17.738990 kubelet[2932]: E0912 00:37:17.738332 2932 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"291314a8e1d2722dd26fa0c5c676009531c7ed5af0837cfe0b48bda03496e07e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-4lfjf"
Sep 12 00:37:17.739096 kubelet[2932]: E0912 00:37:17.738374 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-4lfjf_calico-system(3a3a008e-e1fb-47cc-bd0a-8ff12488e165)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-4lfjf_calico-system(3a3a008e-e1fb-47cc-bd0a-8ff12488e165)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"291314a8e1d2722dd26fa0c5c676009531c7ed5af0837cfe0b48bda03496e07e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-4lfjf" podUID="3a3a008e-e1fb-47cc-bd0a-8ff12488e165"
Sep 12 00:37:17.739418 containerd[1615]: time="2025-09-12T00:37:17.739351670Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8665c66cb-hjgv7,Uid:9e9d3bf8-503a-4ac7-8753-ceaf23928a49,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"75b2e64e5af7f8cf6b8a5c84c483abb36b44a8d6c86e3137aee019eb0140e47e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 00:37:17.740186 kubelet[2932]: E0912 00:37:17.739471 2932 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"75b2e64e5af7f8cf6b8a5c84c483abb36b44a8d6c86e3137aee019eb0140e47e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 00:37:17.740343 kubelet[2932]: E0912 00:37:17.740309 2932 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"75b2e64e5af7f8cf6b8a5c84c483abb36b44a8d6c86e3137aee019eb0140e47e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8665c66cb-hjgv7"
Sep 12 00:37:17.740343 kubelet[2932]: E0912 00:37:17.740335 2932 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"75b2e64e5af7f8cf6b8a5c84c483abb36b44a8d6c86e3137aee019eb0140e47e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8665c66cb-hjgv7"
Sep 12 00:37:17.740480 kubelet[2932]: E0912 00:37:17.740379 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8665c66cb-hjgv7_calico-apiserver(9e9d3bf8-503a-4ac7-8753-ceaf23928a49)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8665c66cb-hjgv7_calico-apiserver(9e9d3bf8-503a-4ac7-8753-ceaf23928a49)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"75b2e64e5af7f8cf6b8a5c84c483abb36b44a8d6c86e3137aee019eb0140e47e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8665c66cb-hjgv7" podUID="9e9d3bf8-503a-4ac7-8753-ceaf23928a49"
Sep 12 00:37:17.742701 containerd[1615]: time="2025-09-12T00:37:17.742659141Z" level=error msg="Failed to destroy network for sandbox \"1d4d19af38135face0dcca41b93cec539c4d0ed43994c7c43771d7995558e8ed\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 00:37:17.745533 containerd[1615]: time="2025-09-12T00:37:17.744864672Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-797c9d4dc7-9jk4g,Uid:5cff39d8-e5be-4f52-9e94-587e9399478a,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1d4d19af38135face0dcca41b93cec539c4d0ed43994c7c43771d7995558e8ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 00:37:17.745670 kubelet[2932]: E0912 00:37:17.745603 2932 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1d4d19af38135face0dcca41b93cec539c4d0ed43994c7c43771d7995558e8ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 00:37:17.745670 kubelet[2932]: E0912 00:37:17.745641 2932 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1d4d19af38135face0dcca41b93cec539c4d0ed43994c7c43771d7995558e8ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-797c9d4dc7-9jk4g"
Sep 12 00:37:17.745670 kubelet[2932]: E0912 00:37:17.745652 2932 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1d4d19af38135face0dcca41b93cec539c4d0ed43994c7c43771d7995558e8ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-797c9d4dc7-9jk4g"
Sep 12 00:37:17.746211 kubelet[2932]: E0912 00:37:17.745684 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-797c9d4dc7-9jk4g_calico-apiserver(5cff39d8-e5be-4f52-9e94-587e9399478a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-797c9d4dc7-9jk4g_calico-apiserver(5cff39d8-e5be-4f52-9e94-587e9399478a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1d4d19af38135face0dcca41b93cec539c4d0ed43994c7c43771d7995558e8ed\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-797c9d4dc7-9jk4g" podUID="5cff39d8-e5be-4f52-9e94-587e9399478a"
Sep 12 00:37:17.750877 containerd[1615]: time="2025-09-12T00:37:17.750840213Z" level=error msg="Failed to destroy network for sandbox \"99c29bea6645f5944d9ad27af64c8d6ee8a59048bd4e6889644d95bf09102a41\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 00:37:17.751522 containerd[1615]: time="2025-09-12T00:37:17.751492313Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5f865bb855-x8l6w,Uid:4ced77e4-35b4-46cb-87ba-9ce55f472d52,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"99c29bea6645f5944d9ad27af64c8d6ee8a59048bd4e6889644d95bf09102a41\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 00:37:17.751826 kubelet[2932]: E0912 00:37:17.751799 2932 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"99c29bea6645f5944d9ad27af64c8d6ee8a59048bd4e6889644d95bf09102a41\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 00:37:17.752308 kubelet[2932]: E0912 00:37:17.751911 2932 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"99c29bea6645f5944d9ad27af64c8d6ee8a59048bd4e6889644d95bf09102a41\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5f865bb855-x8l6w"
Sep 12 00:37:17.752308 kubelet[2932]: E0912 00:37:17.751930 2932 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"99c29bea6645f5944d9ad27af64c8d6ee8a59048bd4e6889644d95bf09102a41\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5f865bb855-x8l6w"
Sep 12 00:37:17.752308 kubelet[2932]: E0912 00:37:17.751991 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-5f865bb855-x8l6w_calico-system(4ced77e4-35b4-46cb-87ba-9ce55f472d52)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-5f865bb855-x8l6w_calico-system(4ced77e4-35b4-46cb-87ba-9ce55f472d52)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"99c29bea6645f5944d9ad27af64c8d6ee8a59048bd4e6889644d95bf09102a41\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5f865bb855-x8l6w" podUID="4ced77e4-35b4-46cb-87ba-9ce55f472d52"
Sep 12 00:37:17.758351 containerd[1615]: time="2025-09-12T00:37:17.758322903Z" level=error msg="Failed to destroy network for sandbox \"3023305b77ac2c5e93e0ba4b805842977166b24e821d23a0a67eb84ca94bfcc0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 00:37:17.762297 containerd[1615]: time="2025-09-12T00:37:17.759626757Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-tkh46,Uid:72ef27f4-5e10-4386-b23d-7206e0e9d085,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3023305b77ac2c5e93e0ba4b805842977166b24e821d23a0a67eb84ca94bfcc0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 00:37:17.762375 kubelet[2932]: E0912 00:37:17.759784 2932 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3023305b77ac2c5e93e0ba4b805842977166b24e821d23a0a67eb84ca94bfcc0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 00:37:17.762375 kubelet[2932]: E0912 00:37:17.759823 2932 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3023305b77ac2c5e93e0ba4b805842977166b24e821d23a0a67eb84ca94bfcc0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-tkh46"
Sep 12 00:37:17.762375 kubelet[2932]: E0912 00:37:17.759843 2932 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3023305b77ac2c5e93e0ba4b805842977166b24e821d23a0a67eb84ca94bfcc0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-tkh46"
Sep 12 00:37:17.762487 kubelet[2932]: E0912 00:37:17.759889 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-tkh46_kube-system(72ef27f4-5e10-4386-b23d-7206e0e9d085)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-tkh46_kube-system(72ef27f4-5e10-4386-b23d-7206e0e9d085)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3023305b77ac2c5e93e0ba4b805842977166b24e821d23a0a67eb84ca94bfcc0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-tkh46" podUID="72ef27f4-5e10-4386-b23d-7206e0e9d085"
Sep 12 00:37:17.764384 containerd[1615]: time="2025-09-12T00:37:17.764356127Z" level=error msg="Failed to destroy network for sandbox \"cb1fd7587c15b74b3bba1f88a764374e960c581cf4272862b491cf0225863722\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 00:37:17.764828 containerd[1615]: time="2025-09-12T00:37:17.764808905Z" level=error msg="Failed to destroy network for sandbox \"4bdaf45d88055fbf1259f2f4660f8c4c3e088e67a1b516fee03a10f01ce8532d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 00:37:17.764961 containerd[1615]: time="2025-09-12T00:37:17.764936997Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-mbnfj,Uid:a27ee4d6-8a7c-4ee2-bfda-cc534bfa5d58,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cb1fd7587c15b74b3bba1f88a764374e960c581cf4272862b491cf0225863722\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 00:37:17.765143 kubelet[2932]: E0912 00:37:17.765107 2932 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cb1fd7587c15b74b3bba1f88a764374e960c581cf4272862b491cf0225863722\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 00:37:17.765193 kubelet[2932]: E0912 00:37:17.765154 2932 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cb1fd7587c15b74b3bba1f88a764374e960c581cf4272862b491cf0225863722\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-mbnfj"
Sep 12 00:37:17.765193 kubelet[2932]: E0912 00:37:17.765167 2932 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cb1fd7587c15b74b3bba1f88a764374e960c581cf4272862b491cf0225863722\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-mbnfj"
Sep 12 00:37:17.765575 kubelet[2932]: E0912 00:37:17.765199 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-mbnfj_kube-system(a27ee4d6-8a7c-4ee2-bfda-cc534bfa5d58)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-mbnfj_kube-system(a27ee4d6-8a7c-4ee2-bfda-cc534bfa5d58)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cb1fd7587c15b74b3bba1f88a764374e960c581cf4272862b491cf0225863722\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-mbnfj" podUID="a27ee4d6-8a7c-4ee2-bfda-cc534bfa5d58"
Sep 12 00:37:17.765703 containerd[1615]: time="2025-09-12T00:37:17.765445347Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7585d978b7-fk9q8,Uid:79ef442e-2a69-41e8-b511-65438149e24e,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4bdaf45d88055fbf1259f2f4660f8c4c3e088e67a1b516fee03a10f01ce8532d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 00:37:17.765849 kubelet[2932]: E0912 00:37:17.765831 2932 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4bdaf45d88055fbf1259f2f4660f8c4c3e088e67a1b516fee03a10f01ce8532d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 00:37:17.765896 kubelet[2932]: E0912 00:37:17.765851 2932 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4bdaf45d88055fbf1259f2f4660f8c4c3e088e67a1b516fee03a10f01ce8532d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7585d978b7-fk9q8"
Sep 12 00:37:17.765896 kubelet[2932]: E0912 00:37:17.765862 2932 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4bdaf45d88055fbf1259f2f4660f8c4c3e088e67a1b516fee03a10f01ce8532d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7585d978b7-fk9q8"
Sep 12 00:37:17.765952 kubelet[2932]: E0912 00:37:17.765891 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7585d978b7-fk9q8_calico-system(79ef442e-2a69-41e8-b511-65438149e24e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7585d978b7-fk9q8_calico-system(79ef442e-2a69-41e8-b511-65438149e24e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4bdaf45d88055fbf1259f2f4660f8c4c3e088e67a1b516fee03a10f01ce8532d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7585d978b7-fk9q8" podUID="79ef442e-2a69-41e8-b511-65438149e24e"
Sep 12 00:37:17.950761 systemd[1]: run-netns-cni\x2da0d3ab12\x2d4563\x2dd7b8\x2de511\x2d09af1b31bf1e.mount: Deactivated successfully.
Sep 12 00:37:17.951066 systemd[1]: run-netns-cni\x2d7ffbf208\x2d458c\x2d63d0\x2dbcad\x2d303d78f021c9.mount: Deactivated successfully. Sep 12 00:37:24.052083 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3187889306.mount: Deactivated successfully. Sep 12 00:37:24.226132 containerd[1615]: time="2025-09-12T00:37:24.226092488Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:24.248090 containerd[1615]: time="2025-09-12T00:37:24.248046998Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 12 00:37:24.250850 containerd[1615]: time="2025-09-12T00:37:24.250185707Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:24.254274 containerd[1615]: time="2025-09-12T00:37:24.253239235Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:24.256849 containerd[1615]: time="2025-09-12T00:37:24.256823281Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 6.808612308s" Sep 12 00:37:24.258566 containerd[1615]: time="2025-09-12T00:37:24.256981698Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 12 00:37:24.358745 containerd[1615]: time="2025-09-12T00:37:24.358645665Z" level=info 
msg="CreateContainer within sandbox \"8b0b991695e292f6f43ac6695df36c874b128ad359cd645bef9bcc28fa7456d7\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 12 00:37:24.530093 containerd[1615]: time="2025-09-12T00:37:24.529865113Z" level=info msg="Container baeba706ef8610b1da4c71a68aaaa9e8bb19702c4daf7338523b9f0bded9d6ee: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:37:24.530078 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4137106051.mount: Deactivated successfully. Sep 12 00:37:24.589059 containerd[1615]: time="2025-09-12T00:37:24.588955283Z" level=info msg="CreateContainer within sandbox \"8b0b991695e292f6f43ac6695df36c874b128ad359cd645bef9bcc28fa7456d7\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"baeba706ef8610b1da4c71a68aaaa9e8bb19702c4daf7338523b9f0bded9d6ee\"" Sep 12 00:37:24.594486 containerd[1615]: time="2025-09-12T00:37:24.589788623Z" level=info msg="StartContainer for \"baeba706ef8610b1da4c71a68aaaa9e8bb19702c4daf7338523b9f0bded9d6ee\"" Sep 12 00:37:24.612207 containerd[1615]: time="2025-09-12T00:37:24.612139860Z" level=info msg="connecting to shim baeba706ef8610b1da4c71a68aaaa9e8bb19702c4daf7338523b9f0bded9d6ee" address="unix:///run/containerd/s/eff254af7a4e19a209bab82ce44e48ee2125ebe104732493a462a99cec16ffca" protocol=ttrpc version=3 Sep 12 00:37:24.770338 systemd[1]: Started cri-containerd-baeba706ef8610b1da4c71a68aaaa9e8bb19702c4daf7338523b9f0bded9d6ee.scope - libcontainer container baeba706ef8610b1da4c71a68aaaa9e8bb19702c4daf7338523b9f0bded9d6ee. Sep 12 00:37:24.827603 containerd[1615]: time="2025-09-12T00:37:24.827563985Z" level=info msg="StartContainer for \"baeba706ef8610b1da4c71a68aaaa9e8bb19702c4daf7338523b9f0bded9d6ee\" returns successfully" Sep 12 00:37:25.433483 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 12 00:37:25.461302 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Sep 12 00:37:25.865768 containerd[1615]: time="2025-09-12T00:37:25.865377809Z" level=info msg="TaskExit event in podsandbox handler container_id:\"baeba706ef8610b1da4c71a68aaaa9e8bb19702c4daf7338523b9f0bded9d6ee\" id:\"f6238d2938f9f7f76d53b461c81c751da587e23921b4469c54099346b2cfabb5\" pid:4048 exit_status:1 exited_at:{seconds:1757637445 nanos:863158506}" Sep 12 00:37:26.305577 kubelet[2932]: I0912 00:37:26.301024 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-rrxj8" podStartSLOduration=3.588466241 podStartE2EDuration="22.3010089s" podCreationTimestamp="2025-09-12 00:37:04 +0000 UTC" firstStartedPulling="2025-09-12 00:37:05.545166866 +0000 UTC m=+20.576589076" lastFinishedPulling="2025-09-12 00:37:24.257709525 +0000 UTC m=+39.289131735" observedRunningTime="2025-09-12 00:37:25.529263029 +0000 UTC m=+40.560685248" watchObservedRunningTime="2025-09-12 00:37:26.3010089 +0000 UTC m=+41.332431114" Sep 12 00:37:26.359396 kubelet[2932]: I0912 00:37:26.359364 2932 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/4ced77e4-35b4-46cb-87ba-9ce55f472d52-whisker-backend-key-pair\") pod \"4ced77e4-35b4-46cb-87ba-9ce55f472d52\" (UID: \"4ced77e4-35b4-46cb-87ba-9ce55f472d52\") " Sep 12 00:37:26.359691 kubelet[2932]: I0912 00:37:26.359497 2932 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hc2d\" (UniqueName: \"kubernetes.io/projected/4ced77e4-35b4-46cb-87ba-9ce55f472d52-kube-api-access-8hc2d\") pod \"4ced77e4-35b4-46cb-87ba-9ce55f472d52\" (UID: \"4ced77e4-35b4-46cb-87ba-9ce55f472d52\") " Sep 12 00:37:26.359691 kubelet[2932]: I0912 00:37:26.359600 2932 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ced77e4-35b4-46cb-87ba-9ce55f472d52-whisker-ca-bundle\") pod 
\"4ced77e4-35b4-46cb-87ba-9ce55f472d52\" (UID: \"4ced77e4-35b4-46cb-87ba-9ce55f472d52\") " Sep 12 00:37:26.407381 kubelet[2932]: I0912 00:37:26.407349 2932 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ced77e4-35b4-46cb-87ba-9ce55f472d52-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "4ced77e4-35b4-46cb-87ba-9ce55f472d52" (UID: "4ced77e4-35b4-46cb-87ba-9ce55f472d52"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 12 00:37:26.452857 systemd[1]: var-lib-kubelet-pods-4ced77e4\x2d35b4\x2d46cb\x2d87ba\x2d9ce55f472d52-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 12 00:37:26.453190 kubelet[2932]: I0912 00:37:26.452950 2932 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ced77e4-35b4-46cb-87ba-9ce55f472d52-kube-api-access-8hc2d" (OuterVolumeSpecName: "kube-api-access-8hc2d") pod "4ced77e4-35b4-46cb-87ba-9ce55f472d52" (UID: "4ced77e4-35b4-46cb-87ba-9ce55f472d52"). InnerVolumeSpecName "kube-api-access-8hc2d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 12 00:37:26.453153 systemd[1]: var-lib-kubelet-pods-4ced77e4\x2d35b4\x2d46cb\x2d87ba\x2d9ce55f472d52-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d8hc2d.mount: Deactivated successfully. Sep 12 00:37:26.455525 kubelet[2932]: I0912 00:37:26.453341 2932 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ced77e4-35b4-46cb-87ba-9ce55f472d52-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "4ced77e4-35b4-46cb-87ba-9ce55f472d52" (UID: "4ced77e4-35b4-46cb-87ba-9ce55f472d52"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 12 00:37:26.460995 kubelet[2932]: I0912 00:37:26.460970 2932 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/4ced77e4-35b4-46cb-87ba-9ce55f472d52-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Sep 12 00:37:26.461112 kubelet[2932]: I0912 00:37:26.461105 2932 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8hc2d\" (UniqueName: \"kubernetes.io/projected/4ced77e4-35b4-46cb-87ba-9ce55f472d52-kube-api-access-8hc2d\") on node \"localhost\" DevicePath \"\"" Sep 12 00:37:26.462175 kubelet[2932]: I0912 00:37:26.461498 2932 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ced77e4-35b4-46cb-87ba-9ce55f472d52-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Sep 12 00:37:26.529377 systemd[1]: Removed slice kubepods-besteffort-pod4ced77e4_35b4_46cb_87ba_9ce55f472d52.slice - libcontainer container kubepods-besteffort-pod4ced77e4_35b4_46cb_87ba_9ce55f472d52.slice. Sep 12 00:37:26.612385 containerd[1615]: time="2025-09-12T00:37:26.612283225Z" level=info msg="TaskExit event in podsandbox handler container_id:\"baeba706ef8610b1da4c71a68aaaa9e8bb19702c4daf7338523b9f0bded9d6ee\" id:\"0c492aa1563960ec6f471e0a4956cbb3120391be02f5480d71fe6879f60002f0\" pid:4092 exit_status:1 exited_at:{seconds:1757637446 nanos:611969393}" Sep 12 00:37:26.713813 systemd[1]: Created slice kubepods-besteffort-pode7b31edc_f752_4949_bd99_d137bb6dfdd7.slice - libcontainer container kubepods-besteffort-pode7b31edc_f752_4949_bd99_d137bb6dfdd7.slice. 
Sep 12 00:37:26.767439 kubelet[2932]: I0912 00:37:26.767415 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkzf2\" (UniqueName: \"kubernetes.io/projected/e7b31edc-f752-4949-bd99-d137bb6dfdd7-kube-api-access-tkzf2\") pod \"whisker-6d4d9c456b-k8224\" (UID: \"e7b31edc-f752-4949-bd99-d137bb6dfdd7\") " pod="calico-system/whisker-6d4d9c456b-k8224" Sep 12 00:37:26.773757 kubelet[2932]: I0912 00:37:26.767579 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e7b31edc-f752-4949-bd99-d137bb6dfdd7-whisker-backend-key-pair\") pod \"whisker-6d4d9c456b-k8224\" (UID: \"e7b31edc-f752-4949-bd99-d137bb6dfdd7\") " pod="calico-system/whisker-6d4d9c456b-k8224" Sep 12 00:37:26.773757 kubelet[2932]: I0912 00:37:26.767594 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7b31edc-f752-4949-bd99-d137bb6dfdd7-whisker-ca-bundle\") pod \"whisker-6d4d9c456b-k8224\" (UID: \"e7b31edc-f752-4949-bd99-d137bb6dfdd7\") " pod="calico-system/whisker-6d4d9c456b-k8224" Sep 12 00:37:27.019219 containerd[1615]: time="2025-09-12T00:37:27.019123184Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6d4d9c456b-k8224,Uid:e7b31edc-f752-4949-bd99-d137bb6dfdd7,Namespace:calico-system,Attempt:0,}" Sep 12 00:37:27.072227 kubelet[2932]: I0912 00:37:27.072197 2932 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ced77e4-35b4-46cb-87ba-9ce55f472d52" path="/var/lib/kubelet/pods/4ced77e4-35b4-46cb-87ba-9ce55f472d52/volumes" Sep 12 00:37:28.140025 systemd-networkd[1536]: cali478cdfafa9e: Link UP Sep 12 00:37:28.140676 systemd-networkd[1536]: cali478cdfafa9e: Gained carrier Sep 12 00:37:28.165526 containerd[1615]: 2025-09-12 00:37:27.057 [INFO][4105] cni-plugin/utils.go 100: File 
/var/lib/calico/mtu does not exist Sep 12 00:37:28.165526 containerd[1615]: 2025-09-12 00:37:27.274 [INFO][4105] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--6d4d9c456b--k8224-eth0 whisker-6d4d9c456b- calico-system e7b31edc-f752-4949-bd99-d137bb6dfdd7 908 0 2025-09-12 00:37:26 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:6d4d9c456b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-6d4d9c456b-k8224 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali478cdfafa9e [] [] }} ContainerID="d36afea7379dc7c98a6976352c998efe65ee2bde47f3baf8d7974f5c5eb05903" Namespace="calico-system" Pod="whisker-6d4d9c456b-k8224" WorkloadEndpoint="localhost-k8s-whisker--6d4d9c456b--k8224-" Sep 12 00:37:28.165526 containerd[1615]: 2025-09-12 00:37:27.275 [INFO][4105] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d36afea7379dc7c98a6976352c998efe65ee2bde47f3baf8d7974f5c5eb05903" Namespace="calico-system" Pod="whisker-6d4d9c456b-k8224" WorkloadEndpoint="localhost-k8s-whisker--6d4d9c456b--k8224-eth0" Sep 12 00:37:28.165526 containerd[1615]: 2025-09-12 00:37:27.955 [INFO][4203] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d36afea7379dc7c98a6976352c998efe65ee2bde47f3baf8d7974f5c5eb05903" HandleID="k8s-pod-network.d36afea7379dc7c98a6976352c998efe65ee2bde47f3baf8d7974f5c5eb05903" Workload="localhost-k8s-whisker--6d4d9c456b--k8224-eth0" Sep 12 00:37:28.167163 containerd[1615]: 2025-09-12 00:37:27.958 [INFO][4203] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d36afea7379dc7c98a6976352c998efe65ee2bde47f3baf8d7974f5c5eb05903" HandleID="k8s-pod-network.d36afea7379dc7c98a6976352c998efe65ee2bde47f3baf8d7974f5c5eb05903" Workload="localhost-k8s-whisker--6d4d9c456b--k8224-eth0" 
assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000600270), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-6d4d9c456b-k8224", "timestamp":"2025-09-12 00:37:27.955382902 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 00:37:28.167163 containerd[1615]: 2025-09-12 00:37:27.958 [INFO][4203] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 00:37:28.167163 containerd[1615]: 2025-09-12 00:37:27.959 [INFO][4203] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 00:37:28.167163 containerd[1615]: 2025-09-12 00:37:27.959 [INFO][4203] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 00:37:28.167163 containerd[1615]: 2025-09-12 00:37:28.050 [INFO][4203] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d36afea7379dc7c98a6976352c998efe65ee2bde47f3baf8d7974f5c5eb05903" host="localhost" Sep 12 00:37:28.167163 containerd[1615]: 2025-09-12 00:37:28.071 [INFO][4203] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 00:37:28.167163 containerd[1615]: 2025-09-12 00:37:28.075 [INFO][4203] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 00:37:28.167163 containerd[1615]: 2025-09-12 00:37:28.076 [INFO][4203] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 00:37:28.167163 containerd[1615]: 2025-09-12 00:37:28.078 [INFO][4203] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 00:37:28.167163 containerd[1615]: 2025-09-12 00:37:28.078 [INFO][4203] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 
handle="k8s-pod-network.d36afea7379dc7c98a6976352c998efe65ee2bde47f3baf8d7974f5c5eb05903" host="localhost" Sep 12 00:37:28.168644 containerd[1615]: 2025-09-12 00:37:28.079 [INFO][4203] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d36afea7379dc7c98a6976352c998efe65ee2bde47f3baf8d7974f5c5eb05903 Sep 12 00:37:28.168644 containerd[1615]: 2025-09-12 00:37:28.082 [INFO][4203] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.d36afea7379dc7c98a6976352c998efe65ee2bde47f3baf8d7974f5c5eb05903" host="localhost" Sep 12 00:37:28.168644 containerd[1615]: 2025-09-12 00:37:28.086 [INFO][4203] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.d36afea7379dc7c98a6976352c998efe65ee2bde47f3baf8d7974f5c5eb05903" host="localhost" Sep 12 00:37:28.168644 containerd[1615]: 2025-09-12 00:37:28.086 [INFO][4203] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.d36afea7379dc7c98a6976352c998efe65ee2bde47f3baf8d7974f5c5eb05903" host="localhost" Sep 12 00:37:28.168644 containerd[1615]: 2025-09-12 00:37:28.086 [INFO][4203] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 00:37:28.168644 containerd[1615]: 2025-09-12 00:37:28.086 [INFO][4203] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="d36afea7379dc7c98a6976352c998efe65ee2bde47f3baf8d7974f5c5eb05903" HandleID="k8s-pod-network.d36afea7379dc7c98a6976352c998efe65ee2bde47f3baf8d7974f5c5eb05903" Workload="localhost-k8s-whisker--6d4d9c456b--k8224-eth0" Sep 12 00:37:28.169801 containerd[1615]: 2025-09-12 00:37:28.088 [INFO][4105] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d36afea7379dc7c98a6976352c998efe65ee2bde47f3baf8d7974f5c5eb05903" Namespace="calico-system" Pod="whisker-6d4d9c456b-k8224" WorkloadEndpoint="localhost-k8s-whisker--6d4d9c456b--k8224-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--6d4d9c456b--k8224-eth0", GenerateName:"whisker-6d4d9c456b-", Namespace:"calico-system", SelfLink:"", UID:"e7b31edc-f752-4949-bd99-d137bb6dfdd7", ResourceVersion:"908", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 37, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6d4d9c456b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-6d4d9c456b-k8224", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali478cdfafa9e", MAC:"", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:37:28.169801 containerd[1615]: 2025-09-12 00:37:28.088 [INFO][4105] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="d36afea7379dc7c98a6976352c998efe65ee2bde47f3baf8d7974f5c5eb05903" Namespace="calico-system" Pod="whisker-6d4d9c456b-k8224" WorkloadEndpoint="localhost-k8s-whisker--6d4d9c456b--k8224-eth0" Sep 12 00:37:28.169957 containerd[1615]: 2025-09-12 00:37:28.088 [INFO][4105] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali478cdfafa9e ContainerID="d36afea7379dc7c98a6976352c998efe65ee2bde47f3baf8d7974f5c5eb05903" Namespace="calico-system" Pod="whisker-6d4d9c456b-k8224" WorkloadEndpoint="localhost-k8s-whisker--6d4d9c456b--k8224-eth0" Sep 12 00:37:28.169957 containerd[1615]: 2025-09-12 00:37:28.148 [INFO][4105] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d36afea7379dc7c98a6976352c998efe65ee2bde47f3baf8d7974f5c5eb05903" Namespace="calico-system" Pod="whisker-6d4d9c456b-k8224" WorkloadEndpoint="localhost-k8s-whisker--6d4d9c456b--k8224-eth0" Sep 12 00:37:28.170006 containerd[1615]: 2025-09-12 00:37:28.148 [INFO][4105] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d36afea7379dc7c98a6976352c998efe65ee2bde47f3baf8d7974f5c5eb05903" Namespace="calico-system" Pod="whisker-6d4d9c456b-k8224" WorkloadEndpoint="localhost-k8s-whisker--6d4d9c456b--k8224-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--6d4d9c456b--k8224-eth0", GenerateName:"whisker-6d4d9c456b-", Namespace:"calico-system", SelfLink:"", UID:"e7b31edc-f752-4949-bd99-d137bb6dfdd7", ResourceVersion:"908", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 37, 26, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6d4d9c456b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"d36afea7379dc7c98a6976352c998efe65ee2bde47f3baf8d7974f5c5eb05903", Pod:"whisker-6d4d9c456b-k8224", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali478cdfafa9e", MAC:"fa:9b:55:53:36:43", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:37:28.170068 containerd[1615]: 2025-09-12 00:37:28.161 [INFO][4105] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d36afea7379dc7c98a6976352c998efe65ee2bde47f3baf8d7974f5c5eb05903" Namespace="calico-system" Pod="whisker-6d4d9c456b-k8224" WorkloadEndpoint="localhost-k8s-whisker--6d4d9c456b--k8224-eth0" Sep 12 00:37:28.289930 containerd[1615]: time="2025-09-12T00:37:28.289884741Z" level=info msg="connecting to shim d36afea7379dc7c98a6976352c998efe65ee2bde47f3baf8d7974f5c5eb05903" address="unix:///run/containerd/s/2529c0a702568fa20af64464287d9805524c4264acc82f81760cab23ae711272" namespace=k8s.io protocol=ttrpc version=3 Sep 12 00:37:28.311443 systemd[1]: Started cri-containerd-d36afea7379dc7c98a6976352c998efe65ee2bde47f3baf8d7974f5c5eb05903.scope - libcontainer container d36afea7379dc7c98a6976352c998efe65ee2bde47f3baf8d7974f5c5eb05903. 
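The IPAM trace above shows the plugin confirming affinity for the /26 block 192.168.88.128/26 on "localhost" and then claiming 192.168.88.129 for the whisker pod (the next pods in the log receive .130 and up). The address arithmetic can be reproduced with the standard library; this is a simplification that only skips the block's network and broadcast addresses, ignoring Calico's per-address handle tracking and reservations:

```python
import ipaddress

# The /26 block reported as host-affine for "localhost" in the IPAM log.
block = ipaddress.ip_network("192.168.88.128/26")

# Sequential candidates, excluding the network (.128) and broadcast
# (.191) addresses. Workloads get /32 routes out of this block.
candidates = list(block.hosts())

first, second = candidates[0], candidates[1]
print(first, second)        # first two addresses handed out in the log
print(block.num_addresses)  # 64 addresses per /26 block
```

The first two candidates match the assignments visible in the log: 192.168.88.129/32 for whisker-6d4d9c456b-k8224, then 192.168.88.130/32 for goldmane-54d579b49d-bjqdk.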
Sep 12 00:37:28.322714 systemd-resolved[1482]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 00:37:28.363085 containerd[1615]: time="2025-09-12T00:37:28.363049890Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6d4d9c456b-k8224,Uid:e7b31edc-f752-4949-bd99-d137bb6dfdd7,Namespace:calico-system,Attempt:0,} returns sandbox id \"d36afea7379dc7c98a6976352c998efe65ee2bde47f3baf8d7974f5c5eb05903\"" Sep 12 00:37:28.396440 containerd[1615]: time="2025-09-12T00:37:28.395595106Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 12 00:37:29.071278 containerd[1615]: time="2025-09-12T00:37:29.071099490Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7585d978b7-fk9q8,Uid:79ef442e-2a69-41e8-b511-65438149e24e,Namespace:calico-system,Attempt:0,}" Sep 12 00:37:29.071543 containerd[1615]: time="2025-09-12T00:37:29.071459572Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-bjqdk,Uid:738d0c22-ee9e-4e56-9fce-df1a40eb2b00,Namespace:calico-system,Attempt:0,}" Sep 12 00:37:29.301767 systemd-networkd[1536]: calib9ec9209c8c: Link UP Sep 12 00:37:29.301966 systemd-networkd[1536]: calib9ec9209c8c: Gained carrier Sep 12 00:37:29.325843 containerd[1615]: 2025-09-12 00:37:29.175 [INFO][4300] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 00:37:29.325843 containerd[1615]: 2025-09-12 00:37:29.182 [INFO][4300] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--54d579b49d--bjqdk-eth0 goldmane-54d579b49d- calico-system 738d0c22-ee9e-4e56-9fce-df1a40eb2b00 836 0 2025-09-12 00:37:04 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost 
goldmane-54d579b49d-bjqdk eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calib9ec9209c8c [] [] }} ContainerID="2be6456b4f7afdace3193046273582d7fe4e96fa4f1a1fc2b760e37e2cae8332" Namespace="calico-system" Pod="goldmane-54d579b49d-bjqdk" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--bjqdk-" Sep 12 00:37:29.325843 containerd[1615]: 2025-09-12 00:37:29.182 [INFO][4300] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2be6456b4f7afdace3193046273582d7fe4e96fa4f1a1fc2b760e37e2cae8332" Namespace="calico-system" Pod="goldmane-54d579b49d-bjqdk" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--bjqdk-eth0" Sep 12 00:37:29.325843 containerd[1615]: 2025-09-12 00:37:29.213 [INFO][4316] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2be6456b4f7afdace3193046273582d7fe4e96fa4f1a1fc2b760e37e2cae8332" HandleID="k8s-pod-network.2be6456b4f7afdace3193046273582d7fe4e96fa4f1a1fc2b760e37e2cae8332" Workload="localhost-k8s-goldmane--54d579b49d--bjqdk-eth0" Sep 12 00:37:29.326210 containerd[1615]: 2025-09-12 00:37:29.213 [INFO][4316] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2be6456b4f7afdace3193046273582d7fe4e96fa4f1a1fc2b760e37e2cae8332" HandleID="k8s-pod-network.2be6456b4f7afdace3193046273582d7fe4e96fa4f1a1fc2b760e37e2cae8332" Workload="localhost-k8s-goldmane--54d579b49d--bjqdk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ad6e0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-54d579b49d-bjqdk", "timestamp":"2025-09-12 00:37:29.213201805 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 00:37:29.326210 containerd[1615]: 2025-09-12 00:37:29.213 [INFO][4316] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 12 00:37:29.326210 containerd[1615]: 2025-09-12 00:37:29.213 [INFO][4316] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 00:37:29.326210 containerd[1615]: 2025-09-12 00:37:29.213 [INFO][4316] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 00:37:29.326210 containerd[1615]: 2025-09-12 00:37:29.219 [INFO][4316] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2be6456b4f7afdace3193046273582d7fe4e96fa4f1a1fc2b760e37e2cae8332" host="localhost" Sep 12 00:37:29.326210 containerd[1615]: 2025-09-12 00:37:29.222 [INFO][4316] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 00:37:29.326210 containerd[1615]: 2025-09-12 00:37:29.225 [INFO][4316] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 00:37:29.326210 containerd[1615]: 2025-09-12 00:37:29.228 [INFO][4316] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 00:37:29.326210 containerd[1615]: 2025-09-12 00:37:29.252 [INFO][4316] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 00:37:29.326210 containerd[1615]: 2025-09-12 00:37:29.254 [INFO][4316] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.2be6456b4f7afdace3193046273582d7fe4e96fa4f1a1fc2b760e37e2cae8332" host="localhost" Sep 12 00:37:29.326563 containerd[1615]: 2025-09-12 00:37:29.255 [INFO][4316] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.2be6456b4f7afdace3193046273582d7fe4e96fa4f1a1fc2b760e37e2cae8332 Sep 12 00:37:29.326563 containerd[1615]: 2025-09-12 00:37:29.265 [INFO][4316] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.2be6456b4f7afdace3193046273582d7fe4e96fa4f1a1fc2b760e37e2cae8332" host="localhost" Sep 12 00:37:29.326563 containerd[1615]: 2025-09-12 00:37:29.286 [INFO][4316] 
ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.2be6456b4f7afdace3193046273582d7fe4e96fa4f1a1fc2b760e37e2cae8332" host="localhost" Sep 12 00:37:29.326563 containerd[1615]: 2025-09-12 00:37:29.286 [INFO][4316] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.2be6456b4f7afdace3193046273582d7fe4e96fa4f1a1fc2b760e37e2cae8332" host="localhost" Sep 12 00:37:29.326563 containerd[1615]: 2025-09-12 00:37:29.286 [INFO][4316] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 00:37:29.326563 containerd[1615]: 2025-09-12 00:37:29.286 [INFO][4316] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="2be6456b4f7afdace3193046273582d7fe4e96fa4f1a1fc2b760e37e2cae8332" HandleID="k8s-pod-network.2be6456b4f7afdace3193046273582d7fe4e96fa4f1a1fc2b760e37e2cae8332" Workload="localhost-k8s-goldmane--54d579b49d--bjqdk-eth0" Sep 12 00:37:29.328884 containerd[1615]: 2025-09-12 00:37:29.291 [INFO][4300] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2be6456b4f7afdace3193046273582d7fe4e96fa4f1a1fc2b760e37e2cae8332" Namespace="calico-system" Pod="goldmane-54d579b49d-bjqdk" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--bjqdk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--bjqdk-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"738d0c22-ee9e-4e56-9fce-df1a40eb2b00", ResourceVersion:"836", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 37, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-54d579b49d-bjqdk", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calib9ec9209c8c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:37:29.328884 containerd[1615]: 2025-09-12 00:37:29.299 [INFO][4300] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="2be6456b4f7afdace3193046273582d7fe4e96fa4f1a1fc2b760e37e2cae8332" Namespace="calico-system" Pod="goldmane-54d579b49d-bjqdk" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--bjqdk-eth0" Sep 12 00:37:29.329605 containerd[1615]: 2025-09-12 00:37:29.299 [INFO][4300] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib9ec9209c8c ContainerID="2be6456b4f7afdace3193046273582d7fe4e96fa4f1a1fc2b760e37e2cae8332" Namespace="calico-system" Pod="goldmane-54d579b49d-bjqdk" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--bjqdk-eth0" Sep 12 00:37:29.329605 containerd[1615]: 2025-09-12 00:37:29.301 [INFO][4300] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2be6456b4f7afdace3193046273582d7fe4e96fa4f1a1fc2b760e37e2cae8332" Namespace="calico-system" Pod="goldmane-54d579b49d-bjqdk" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--bjqdk-eth0" Sep 12 00:37:29.329656 containerd[1615]: 2025-09-12 00:37:29.302 [INFO][4300] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="2be6456b4f7afdace3193046273582d7fe4e96fa4f1a1fc2b760e37e2cae8332" Namespace="calico-system" Pod="goldmane-54d579b49d-bjqdk" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--bjqdk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--bjqdk-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"738d0c22-ee9e-4e56-9fce-df1a40eb2b00", ResourceVersion:"836", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 37, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"2be6456b4f7afdace3193046273582d7fe4e96fa4f1a1fc2b760e37e2cae8332", Pod:"goldmane-54d579b49d-bjqdk", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calib9ec9209c8c", MAC:"86:8f:96:9c:bb:5e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:37:29.329737 containerd[1615]: 2025-09-12 00:37:29.319 [INFO][4300] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2be6456b4f7afdace3193046273582d7fe4e96fa4f1a1fc2b760e37e2cae8332" Namespace="calico-system" Pod="goldmane-54d579b49d-bjqdk" 
WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--bjqdk-eth0" Sep 12 00:37:29.356801 containerd[1615]: time="2025-09-12T00:37:29.356697957Z" level=info msg="connecting to shim 2be6456b4f7afdace3193046273582d7fe4e96fa4f1a1fc2b760e37e2cae8332" address="unix:///run/containerd/s/3d7c4987941d48c8e2801b8b5582ce845a1f5e6f8ff9baf52011b2362db71cd9" namespace=k8s.io protocol=ttrpc version=3 Sep 12 00:37:29.370881 systemd-networkd[1536]: cali557576bac89: Link UP Sep 12 00:37:29.371419 systemd-networkd[1536]: cali557576bac89: Gained carrier Sep 12 00:37:29.393306 containerd[1615]: 2025-09-12 00:37:29.137 [INFO][4288] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 00:37:29.393306 containerd[1615]: 2025-09-12 00:37:29.175 [INFO][4288] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--7585d978b7--fk9q8-eth0 calico-kube-controllers-7585d978b7- calico-system 79ef442e-2a69-41e8-b511-65438149e24e 833 0 2025-09-12 00:37:05 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7585d978b7 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-7585d978b7-fk9q8 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali557576bac89 [] [] }} ContainerID="1e745d82f384188e6271d51aae406bbe20f573218d351b8ea956787140f91fe8" Namespace="calico-system" Pod="calico-kube-controllers-7585d978b7-fk9q8" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7585d978b7--fk9q8-" Sep 12 00:37:29.393306 containerd[1615]: 2025-09-12 00:37:29.175 [INFO][4288] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1e745d82f384188e6271d51aae406bbe20f573218d351b8ea956787140f91fe8" Namespace="calico-system" 
Pod="calico-kube-controllers-7585d978b7-fk9q8" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7585d978b7--fk9q8-eth0" Sep 12 00:37:29.393306 containerd[1615]: 2025-09-12 00:37:29.230 [INFO][4312] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1e745d82f384188e6271d51aae406bbe20f573218d351b8ea956787140f91fe8" HandleID="k8s-pod-network.1e745d82f384188e6271d51aae406bbe20f573218d351b8ea956787140f91fe8" Workload="localhost-k8s-calico--kube--controllers--7585d978b7--fk9q8-eth0" Sep 12 00:37:29.393639 containerd[1615]: 2025-09-12 00:37:29.233 [INFO][4312] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1e745d82f384188e6271d51aae406bbe20f573218d351b8ea956787140f91fe8" HandleID="k8s-pod-network.1e745d82f384188e6271d51aae406bbe20f573218d351b8ea956787140f91fe8" Workload="localhost-k8s-calico--kube--controllers--7585d978b7--fk9q8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cd950), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-7585d978b7-fk9q8", "timestamp":"2025-09-12 00:37:29.230752572 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 00:37:29.393639 containerd[1615]: 2025-09-12 00:37:29.233 [INFO][4312] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 00:37:29.393639 containerd[1615]: 2025-09-12 00:37:29.286 [INFO][4312] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 00:37:29.393639 containerd[1615]: 2025-09-12 00:37:29.286 [INFO][4312] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 00:37:29.393639 containerd[1615]: 2025-09-12 00:37:29.322 [INFO][4312] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1e745d82f384188e6271d51aae406bbe20f573218d351b8ea956787140f91fe8" host="localhost" Sep 12 00:37:29.393639 containerd[1615]: 2025-09-12 00:37:29.330 [INFO][4312] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 00:37:29.393639 containerd[1615]: 2025-09-12 00:37:29.337 [INFO][4312] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 00:37:29.393639 containerd[1615]: 2025-09-12 00:37:29.342 [INFO][4312] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 00:37:29.393639 containerd[1615]: 2025-09-12 00:37:29.346 [INFO][4312] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 00:37:29.393639 containerd[1615]: 2025-09-12 00:37:29.346 [INFO][4312] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.1e745d82f384188e6271d51aae406bbe20f573218d351b8ea956787140f91fe8" host="localhost" Sep 12 00:37:29.399713 containerd[1615]: 2025-09-12 00:37:29.347 [INFO][4312] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1e745d82f384188e6271d51aae406bbe20f573218d351b8ea956787140f91fe8 Sep 12 00:37:29.399713 containerd[1615]: 2025-09-12 00:37:29.351 [INFO][4312] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.1e745d82f384188e6271d51aae406bbe20f573218d351b8ea956787140f91fe8" host="localhost" Sep 12 00:37:29.399713 containerd[1615]: 2025-09-12 00:37:29.365 [INFO][4312] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 
handle="k8s-pod-network.1e745d82f384188e6271d51aae406bbe20f573218d351b8ea956787140f91fe8" host="localhost" Sep 12 00:37:29.399713 containerd[1615]: 2025-09-12 00:37:29.365 [INFO][4312] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.1e745d82f384188e6271d51aae406bbe20f573218d351b8ea956787140f91fe8" host="localhost" Sep 12 00:37:29.399713 containerd[1615]: 2025-09-12 00:37:29.365 [INFO][4312] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 00:37:29.399713 containerd[1615]: 2025-09-12 00:37:29.365 [INFO][4312] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="1e745d82f384188e6271d51aae406bbe20f573218d351b8ea956787140f91fe8" HandleID="k8s-pod-network.1e745d82f384188e6271d51aae406bbe20f573218d351b8ea956787140f91fe8" Workload="localhost-k8s-calico--kube--controllers--7585d978b7--fk9q8-eth0" Sep 12 00:37:29.393929 systemd[1]: Started cri-containerd-2be6456b4f7afdace3193046273582d7fe4e96fa4f1a1fc2b760e37e2cae8332.scope - libcontainer container 2be6456b4f7afdace3193046273582d7fe4e96fa4f1a1fc2b760e37e2cae8332. 
Sep 12 00:37:29.400005 containerd[1615]: 2025-09-12 00:37:29.369 [INFO][4288] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1e745d82f384188e6271d51aae406bbe20f573218d351b8ea956787140f91fe8" Namespace="calico-system" Pod="calico-kube-controllers-7585d978b7-fk9q8" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7585d978b7--fk9q8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--7585d978b7--fk9q8-eth0", GenerateName:"calico-kube-controllers-7585d978b7-", Namespace:"calico-system", SelfLink:"", UID:"79ef442e-2a69-41e8-b511-65438149e24e", ResourceVersion:"833", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 37, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7585d978b7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-7585d978b7-fk9q8", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali557576bac89", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:37:29.400080 containerd[1615]: 2025-09-12 00:37:29.369 [INFO][4288] cni-plugin/k8s.go 419: Calico CNI using IPs: 
[192.168.88.131/32] ContainerID="1e745d82f384188e6271d51aae406bbe20f573218d351b8ea956787140f91fe8" Namespace="calico-system" Pod="calico-kube-controllers-7585d978b7-fk9q8" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7585d978b7--fk9q8-eth0" Sep 12 00:37:29.400080 containerd[1615]: 2025-09-12 00:37:29.369 [INFO][4288] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali557576bac89 ContainerID="1e745d82f384188e6271d51aae406bbe20f573218d351b8ea956787140f91fe8" Namespace="calico-system" Pod="calico-kube-controllers-7585d978b7-fk9q8" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7585d978b7--fk9q8-eth0" Sep 12 00:37:29.400080 containerd[1615]: 2025-09-12 00:37:29.371 [INFO][4288] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1e745d82f384188e6271d51aae406bbe20f573218d351b8ea956787140f91fe8" Namespace="calico-system" Pod="calico-kube-controllers-7585d978b7-fk9q8" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7585d978b7--fk9q8-eth0" Sep 12 00:37:29.400134 containerd[1615]: 2025-09-12 00:37:29.372 [INFO][4288] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1e745d82f384188e6271d51aae406bbe20f573218d351b8ea956787140f91fe8" Namespace="calico-system" Pod="calico-kube-controllers-7585d978b7-fk9q8" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7585d978b7--fk9q8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--7585d978b7--fk9q8-eth0", GenerateName:"calico-kube-controllers-7585d978b7-", Namespace:"calico-system", SelfLink:"", UID:"79ef442e-2a69-41e8-b511-65438149e24e", ResourceVersion:"833", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 37, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7585d978b7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"1e745d82f384188e6271d51aae406bbe20f573218d351b8ea956787140f91fe8", Pod:"calico-kube-controllers-7585d978b7-fk9q8", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali557576bac89", MAC:"ae:15:b5:fe:c5:81", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:37:29.400177 containerd[1615]: 2025-09-12 00:37:29.391 [INFO][4288] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1e745d82f384188e6271d51aae406bbe20f573218d351b8ea956787140f91fe8" Namespace="calico-system" Pod="calico-kube-controllers-7585d978b7-fk9q8" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7585d978b7--fk9q8-eth0" Sep 12 00:37:29.406565 systemd-resolved[1482]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 00:37:29.452612 containerd[1615]: time="2025-09-12T00:37:29.452565664Z" level=info msg="connecting to shim 1e745d82f384188e6271d51aae406bbe20f573218d351b8ea956787140f91fe8" address="unix:///run/containerd/s/831f4d9b4583844d36fd412ab2c598ef94618f8585238df504b3fa17f61bd13c" namespace=k8s.io protocol=ttrpc version=3 Sep 12 00:37:29.467045 containerd[1615]: time="2025-09-12T00:37:29.467007315Z" 
level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-bjqdk,Uid:738d0c22-ee9e-4e56-9fce-df1a40eb2b00,Namespace:calico-system,Attempt:0,} returns sandbox id \"2be6456b4f7afdace3193046273582d7fe4e96fa4f1a1fc2b760e37e2cae8332\"" Sep 12 00:37:29.476993 kubelet[2932]: I0912 00:37:29.476956 2932 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 00:37:29.491726 systemd[1]: Started cri-containerd-1e745d82f384188e6271d51aae406bbe20f573218d351b8ea956787140f91fe8.scope - libcontainer container 1e745d82f384188e6271d51aae406bbe20f573218d351b8ea956787140f91fe8. Sep 12 00:37:29.513111 systemd-resolved[1482]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 00:37:29.538383 systemd-networkd[1536]: cali478cdfafa9e: Gained IPv6LL Sep 12 00:37:29.576470 containerd[1615]: time="2025-09-12T00:37:29.576368070Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7585d978b7-fk9q8,Uid:79ef442e-2a69-41e8-b511-65438149e24e,Namespace:calico-system,Attempt:0,} returns sandbox id \"1e745d82f384188e6271d51aae406bbe20f573218d351b8ea956787140f91fe8\"" Sep 12 00:37:30.045024 containerd[1615]: time="2025-09-12T00:37:30.044995769Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:30.047721 containerd[1615]: time="2025-09-12T00:37:30.047702042Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 12 00:37:30.048474 containerd[1615]: time="2025-09-12T00:37:30.048445314Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:30.050063 containerd[1615]: time="2025-09-12T00:37:30.050030677Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:30.050445 containerd[1615]: time="2025-09-12T00:37:30.050425453Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 1.65479992s" Sep 12 00:37:30.050493 containerd[1615]: time="2025-09-12T00:37:30.050448021Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 12 00:37:30.059648 containerd[1615]: time="2025-09-12T00:37:30.059551217Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 12 00:37:30.075723 containerd[1615]: time="2025-09-12T00:37:30.074262574Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-mbnfj,Uid:a27ee4d6-8a7c-4ee2-bfda-cc534bfa5d58,Namespace:kube-system,Attempt:0,}" Sep 12 00:37:30.106393 containerd[1615]: time="2025-09-12T00:37:30.106360471Z" level=info msg="CreateContainer within sandbox \"d36afea7379dc7c98a6976352c998efe65ee2bde47f3baf8d7974f5c5eb05903\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 12 00:37:30.112816 containerd[1615]: time="2025-09-12T00:37:30.112784774Z" level=info msg="Container badfe75d707734a561275e66dad6cbd9349f033130b4b767a807952c5bdd32d4: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:37:30.123952 containerd[1615]: time="2025-09-12T00:37:30.123909509Z" level=info msg="CreateContainer within sandbox \"d36afea7379dc7c98a6976352c998efe65ee2bde47f3baf8d7974f5c5eb05903\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id 
\"badfe75d707734a561275e66dad6cbd9349f033130b4b767a807952c5bdd32d4\"" Sep 12 00:37:30.124783 containerd[1615]: time="2025-09-12T00:37:30.124662523Z" level=info msg="StartContainer for \"badfe75d707734a561275e66dad6cbd9349f033130b4b767a807952c5bdd32d4\"" Sep 12 00:37:30.128766 containerd[1615]: time="2025-09-12T00:37:30.128726497Z" level=info msg="connecting to shim badfe75d707734a561275e66dad6cbd9349f033130b4b767a807952c5bdd32d4" address="unix:///run/containerd/s/2529c0a702568fa20af64464287d9805524c4264acc82f81760cab23ae711272" protocol=ttrpc version=3 Sep 12 00:37:30.156426 systemd[1]: Started cri-containerd-badfe75d707734a561275e66dad6cbd9349f033130b4b767a807952c5bdd32d4.scope - libcontainer container badfe75d707734a561275e66dad6cbd9349f033130b4b767a807952c5bdd32d4. Sep 12 00:37:30.220220 containerd[1615]: time="2025-09-12T00:37:30.220186009Z" level=info msg="StartContainer for \"badfe75d707734a561275e66dad6cbd9349f033130b4b767a807952c5bdd32d4\" returns successfully" Sep 12 00:37:30.227191 systemd-networkd[1536]: cali88ac5a6c44c: Link UP Sep 12 00:37:30.228040 systemd-networkd[1536]: cali88ac5a6c44c: Gained carrier Sep 12 00:37:30.247817 containerd[1615]: 2025-09-12 00:37:30.107 [INFO][4477] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 00:37:30.247817 containerd[1615]: 2025-09-12 00:37:30.127 [INFO][4477] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--674b8bbfcf--mbnfj-eth0 coredns-674b8bbfcf- kube-system a27ee4d6-8a7c-4ee2-bfda-cc534bfa5d58 829 0 2025-09-12 00:36:52 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-674b8bbfcf-mbnfj eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali88ac5a6c44c [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} 
ContainerID="6ae553f75d90ade6ddb4871133e01f7a9824aa38e6c80d427c8f06d4daaa9ac4" Namespace="kube-system" Pod="coredns-674b8bbfcf-mbnfj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--mbnfj-" Sep 12 00:37:30.247817 containerd[1615]: 2025-09-12 00:37:30.127 [INFO][4477] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6ae553f75d90ade6ddb4871133e01f7a9824aa38e6c80d427c8f06d4daaa9ac4" Namespace="kube-system" Pod="coredns-674b8bbfcf-mbnfj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--mbnfj-eth0" Sep 12 00:37:30.247817 containerd[1615]: 2025-09-12 00:37:30.174 [INFO][4499] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6ae553f75d90ade6ddb4871133e01f7a9824aa38e6c80d427c8f06d4daaa9ac4" HandleID="k8s-pod-network.6ae553f75d90ade6ddb4871133e01f7a9824aa38e6c80d427c8f06d4daaa9ac4" Workload="localhost-k8s-coredns--674b8bbfcf--mbnfj-eth0" Sep 12 00:37:30.248054 containerd[1615]: 2025-09-12 00:37:30.174 [INFO][4499] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6ae553f75d90ade6ddb4871133e01f7a9824aa38e6c80d427c8f06d4daaa9ac4" HandleID="k8s-pod-network.6ae553f75d90ade6ddb4871133e01f7a9824aa38e6c80d427c8f06d4daaa9ac4" Workload="localhost-k8s-coredns--674b8bbfcf--mbnfj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c5b40), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-674b8bbfcf-mbnfj", "timestamp":"2025-09-12 00:37:30.174117426 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 00:37:30.248054 containerd[1615]: 2025-09-12 00:37:30.174 [INFO][4499] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 00:37:30.248054 containerd[1615]: 2025-09-12 00:37:30.174 [INFO][4499] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 00:37:30.248054 containerd[1615]: 2025-09-12 00:37:30.174 [INFO][4499] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 00:37:30.248054 containerd[1615]: 2025-09-12 00:37:30.183 [INFO][4499] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6ae553f75d90ade6ddb4871133e01f7a9824aa38e6c80d427c8f06d4daaa9ac4" host="localhost" Sep 12 00:37:30.248054 containerd[1615]: 2025-09-12 00:37:30.187 [INFO][4499] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 00:37:30.248054 containerd[1615]: 2025-09-12 00:37:30.190 [INFO][4499] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 00:37:30.248054 containerd[1615]: 2025-09-12 00:37:30.192 [INFO][4499] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 00:37:30.248054 containerd[1615]: 2025-09-12 00:37:30.193 [INFO][4499] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 00:37:30.248054 containerd[1615]: 2025-09-12 00:37:30.193 [INFO][4499] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.6ae553f75d90ade6ddb4871133e01f7a9824aa38e6c80d427c8f06d4daaa9ac4" host="localhost" Sep 12 00:37:30.248721 containerd[1615]: 2025-09-12 00:37:30.194 [INFO][4499] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.6ae553f75d90ade6ddb4871133e01f7a9824aa38e6c80d427c8f06d4daaa9ac4 Sep 12 00:37:30.248721 containerd[1615]: 2025-09-12 00:37:30.203 [INFO][4499] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.6ae553f75d90ade6ddb4871133e01f7a9824aa38e6c80d427c8f06d4daaa9ac4" host="localhost" Sep 12 00:37:30.248721 containerd[1615]: 2025-09-12 00:37:30.217 [INFO][4499] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 
handle="k8s-pod-network.6ae553f75d90ade6ddb4871133e01f7a9824aa38e6c80d427c8f06d4daaa9ac4" host="localhost" Sep 12 00:37:30.248721 containerd[1615]: 2025-09-12 00:37:30.217 [INFO][4499] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.6ae553f75d90ade6ddb4871133e01f7a9824aa38e6c80d427c8f06d4daaa9ac4" host="localhost" Sep 12 00:37:30.248721 containerd[1615]: 2025-09-12 00:37:30.217 [INFO][4499] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 00:37:30.248721 containerd[1615]: 2025-09-12 00:37:30.217 [INFO][4499] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="6ae553f75d90ade6ddb4871133e01f7a9824aa38e6c80d427c8f06d4daaa9ac4" HandleID="k8s-pod-network.6ae553f75d90ade6ddb4871133e01f7a9824aa38e6c80d427c8f06d4daaa9ac4" Workload="localhost-k8s-coredns--674b8bbfcf--mbnfj-eth0" Sep 12 00:37:30.248890 containerd[1615]: 2025-09-12 00:37:30.220 [INFO][4477] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6ae553f75d90ade6ddb4871133e01f7a9824aa38e6c80d427c8f06d4daaa9ac4" Namespace="kube-system" Pod="coredns-674b8bbfcf-mbnfj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--mbnfj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--mbnfj-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"a27ee4d6-8a7c-4ee2-bfda-cc534bfa5d58", ResourceVersion:"829", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 36, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-674b8bbfcf-mbnfj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali88ac5a6c44c", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:37:30.252399 containerd[1615]: 2025-09-12 00:37:30.220 [INFO][4477] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="6ae553f75d90ade6ddb4871133e01f7a9824aa38e6c80d427c8f06d4daaa9ac4" Namespace="kube-system" Pod="coredns-674b8bbfcf-mbnfj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--mbnfj-eth0" Sep 12 00:37:30.252399 containerd[1615]: 2025-09-12 00:37:30.220 [INFO][4477] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali88ac5a6c44c ContainerID="6ae553f75d90ade6ddb4871133e01f7a9824aa38e6c80d427c8f06d4daaa9ac4" Namespace="kube-system" Pod="coredns-674b8bbfcf-mbnfj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--mbnfj-eth0" Sep 12 00:37:30.252399 containerd[1615]: 2025-09-12 00:37:30.228 [INFO][4477] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6ae553f75d90ade6ddb4871133e01f7a9824aa38e6c80d427c8f06d4daaa9ac4" Namespace="kube-system" Pod="coredns-674b8bbfcf-mbnfj" 
WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--mbnfj-eth0" Sep 12 00:37:30.252506 containerd[1615]: 2025-09-12 00:37:30.229 [INFO][4477] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6ae553f75d90ade6ddb4871133e01f7a9824aa38e6c80d427c8f06d4daaa9ac4" Namespace="kube-system" Pod="coredns-674b8bbfcf-mbnfj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--mbnfj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--mbnfj-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"a27ee4d6-8a7c-4ee2-bfda-cc534bfa5d58", ResourceVersion:"829", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 36, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"6ae553f75d90ade6ddb4871133e01f7a9824aa38e6c80d427c8f06d4daaa9ac4", Pod:"coredns-674b8bbfcf-mbnfj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali88ac5a6c44c", MAC:"3e:e7:6d:23:b3:0b", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:37:30.252506 containerd[1615]: 2025-09-12 00:37:30.244 [INFO][4477] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6ae553f75d90ade6ddb4871133e01f7a9824aa38e6c80d427c8f06d4daaa9ac4" Namespace="kube-system" Pod="coredns-674b8bbfcf-mbnfj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--mbnfj-eth0" Sep 12 00:37:30.267467 containerd[1615]: time="2025-09-12T00:37:30.267426585Z" level=info msg="connecting to shim 6ae553f75d90ade6ddb4871133e01f7a9824aa38e6c80d427c8f06d4daaa9ac4" address="unix:///run/containerd/s/a861762687c54a4ff9892e395715b806afb92ce5ebf5cbbfe43ffa302471eb77" namespace=k8s.io protocol=ttrpc version=3 Sep 12 00:37:30.287422 systemd[1]: Started cri-containerd-6ae553f75d90ade6ddb4871133e01f7a9824aa38e6c80d427c8f06d4daaa9ac4.scope - libcontainer container 6ae553f75d90ade6ddb4871133e01f7a9824aa38e6c80d427c8f06d4daaa9ac4. 
Sep 12 00:37:30.298462 systemd-resolved[1482]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 00:37:30.333617 containerd[1615]: time="2025-09-12T00:37:30.333583781Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-mbnfj,Uid:a27ee4d6-8a7c-4ee2-bfda-cc534bfa5d58,Namespace:kube-system,Attempt:0,} returns sandbox id \"6ae553f75d90ade6ddb4871133e01f7a9824aa38e6c80d427c8f06d4daaa9ac4\"" Sep 12 00:37:30.348672 containerd[1615]: time="2025-09-12T00:37:30.348276231Z" level=info msg="CreateContainer within sandbox \"6ae553f75d90ade6ddb4871133e01f7a9824aa38e6c80d427c8f06d4daaa9ac4\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 00:37:30.531758 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4018559930.mount: Deactivated successfully. Sep 12 00:37:30.534263 containerd[1615]: time="2025-09-12T00:37:30.534206120Z" level=info msg="Container 73d08a8b5a1aa226bd5479c5020bfb4e99976600be880c068161f7da7b4a3a73: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:37:30.535824 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1155545786.mount: Deactivated successfully. 
Sep 12 00:37:30.560140 containerd[1615]: time="2025-09-12T00:37:30.558934819Z" level=info msg="CreateContainer within sandbox \"6ae553f75d90ade6ddb4871133e01f7a9824aa38e6c80d427c8f06d4daaa9ac4\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"73d08a8b5a1aa226bd5479c5020bfb4e99976600be880c068161f7da7b4a3a73\"" Sep 12 00:37:30.560774 containerd[1615]: time="2025-09-12T00:37:30.560521447Z" level=info msg="StartContainer for \"73d08a8b5a1aa226bd5479c5020bfb4e99976600be880c068161f7da7b4a3a73\"" Sep 12 00:37:30.561702 containerd[1615]: time="2025-09-12T00:37:30.561670011Z" level=info msg="connecting to shim 73d08a8b5a1aa226bd5479c5020bfb4e99976600be880c068161f7da7b4a3a73" address="unix:///run/containerd/s/a861762687c54a4ff9892e395715b806afb92ce5ebf5cbbfe43ffa302471eb77" protocol=ttrpc version=3 Sep 12 00:37:30.581494 systemd[1]: Started cri-containerd-73d08a8b5a1aa226bd5479c5020bfb4e99976600be880c068161f7da7b4a3a73.scope - libcontainer container 73d08a8b5a1aa226bd5479c5020bfb4e99976600be880c068161f7da7b4a3a73. 
Sep 12 00:37:30.614642 containerd[1615]: time="2025-09-12T00:37:30.614585797Z" level=info msg="StartContainer for \"73d08a8b5a1aa226bd5479c5020bfb4e99976600be880c068161f7da7b4a3a73\" returns successfully" Sep 12 00:37:30.805409 systemd-networkd[1536]: cali557576bac89: Gained IPv6LL Sep 12 00:37:30.805603 systemd-networkd[1536]: calib9ec9209c8c: Gained IPv6LL Sep 12 00:37:31.070351 containerd[1615]: time="2025-09-12T00:37:31.070157951Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-4lfjf,Uid:3a3a008e-e1fb-47cc-bd0a-8ff12488e165,Namespace:calico-system,Attempt:0,}" Sep 12 00:37:31.071011 containerd[1615]: time="2025-09-12T00:37:31.070986863Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-797c9d4dc7-nwcb6,Uid:dc0e806a-7a27-438c-b07b-5454371c66d4,Namespace:calico-apiserver,Attempt:0,}" Sep 12 00:37:31.072135 containerd[1615]: time="2025-09-12T00:37:31.071123252Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8665c66cb-hjgv7,Uid:9e9d3bf8-503a-4ac7-8753-ceaf23928a49,Namespace:calico-apiserver,Attempt:0,}" Sep 12 00:37:31.241875 systemd-networkd[1536]: calid83a1f8a1af: Link UP Sep 12 00:37:31.243497 systemd-networkd[1536]: calid83a1f8a1af: Gained carrier Sep 12 00:37:31.273607 containerd[1615]: 2025-09-12 00:37:31.141 [INFO][4647] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--797c9d4dc7--nwcb6-eth0 calico-apiserver-797c9d4dc7- calico-apiserver dc0e806a-7a27-438c-b07b-5454371c66d4 834 0 2025-09-12 00:37:02 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:797c9d4dc7 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-797c9d4dc7-nwcb6 eth0 calico-apiserver [] [] [kns.calico-apiserver 
ksa.calico-apiserver.calico-apiserver] calid83a1f8a1af [] [] }} ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" Namespace="calico-apiserver" Pod="calico-apiserver-797c9d4dc7-nwcb6" WorkloadEndpoint="localhost-k8s-calico--apiserver--797c9d4dc7--nwcb6-" Sep 12 00:37:31.273607 containerd[1615]: 2025-09-12 00:37:31.141 [INFO][4647] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" Namespace="calico-apiserver" Pod="calico-apiserver-797c9d4dc7-nwcb6" WorkloadEndpoint="localhost-k8s-calico--apiserver--797c9d4dc7--nwcb6-eth0" Sep 12 00:37:31.273607 containerd[1615]: 2025-09-12 00:37:31.157 [INFO][4659] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" HandleID="k8s-pod-network.64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" Workload="localhost-k8s-calico--apiserver--797c9d4dc7--nwcb6-eth0" Sep 12 00:37:31.273607 containerd[1615]: 2025-09-12 00:37:31.157 [INFO][4659] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" HandleID="k8s-pod-network.64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" Workload="localhost-k8s-calico--apiserver--797c9d4dc7--nwcb6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f010), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-797c9d4dc7-nwcb6", "timestamp":"2025-09-12 00:37:31.157440711 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 00:37:31.273607 containerd[1615]: 2025-09-12 00:37:31.157 [INFO][4659] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM 
lock. Sep 12 00:37:31.273607 containerd[1615]: 2025-09-12 00:37:31.157 [INFO][4659] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 00:37:31.273607 containerd[1615]: 2025-09-12 00:37:31.157 [INFO][4659] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 00:37:31.273607 containerd[1615]: 2025-09-12 00:37:31.180 [INFO][4659] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" host="localhost" Sep 12 00:37:31.273607 containerd[1615]: 2025-09-12 00:37:31.183 [INFO][4659] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 00:37:31.273607 containerd[1615]: 2025-09-12 00:37:31.186 [INFO][4659] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 00:37:31.273607 containerd[1615]: 2025-09-12 00:37:31.187 [INFO][4659] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 00:37:31.273607 containerd[1615]: 2025-09-12 00:37:31.188 [INFO][4659] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 00:37:31.273607 containerd[1615]: 2025-09-12 00:37:31.188 [INFO][4659] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" host="localhost" Sep 12 00:37:31.273607 containerd[1615]: 2025-09-12 00:37:31.188 [INFO][4659] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f Sep 12 00:37:31.273607 containerd[1615]: 2025-09-12 00:37:31.197 [INFO][4659] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" host="localhost" Sep 12 00:37:31.273607 containerd[1615]: 2025-09-12 00:37:31.228 [INFO][4659] 
ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" host="localhost" Sep 12 00:37:31.273607 containerd[1615]: 2025-09-12 00:37:31.230 [INFO][4659] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" host="localhost" Sep 12 00:37:31.273607 containerd[1615]: 2025-09-12 00:37:31.230 [INFO][4659] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 00:37:31.273607 containerd[1615]: 2025-09-12 00:37:31.230 [INFO][4659] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" HandleID="k8s-pod-network.64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" Workload="localhost-k8s-calico--apiserver--797c9d4dc7--nwcb6-eth0" Sep 12 00:37:31.292938 containerd[1615]: 2025-09-12 00:37:31.235 [INFO][4647] cni-plugin/k8s.go 418: Populated endpoint ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" Namespace="calico-apiserver" Pod="calico-apiserver-797c9d4dc7-nwcb6" WorkloadEndpoint="localhost-k8s-calico--apiserver--797c9d4dc7--nwcb6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--797c9d4dc7--nwcb6-eth0", GenerateName:"calico-apiserver-797c9d4dc7-", Namespace:"calico-apiserver", SelfLink:"", UID:"dc0e806a-7a27-438c-b07b-5454371c66d4", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 37, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", 
"pod-template-hash":"797c9d4dc7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-797c9d4dc7-nwcb6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid83a1f8a1af", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:37:31.292938 containerd[1615]: 2025-09-12 00:37:31.235 [INFO][4647] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" Namespace="calico-apiserver" Pod="calico-apiserver-797c9d4dc7-nwcb6" WorkloadEndpoint="localhost-k8s-calico--apiserver--797c9d4dc7--nwcb6-eth0" Sep 12 00:37:31.292938 containerd[1615]: 2025-09-12 00:37:31.235 [INFO][4647] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid83a1f8a1af ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" Namespace="calico-apiserver" Pod="calico-apiserver-797c9d4dc7-nwcb6" WorkloadEndpoint="localhost-k8s-calico--apiserver--797c9d4dc7--nwcb6-eth0" Sep 12 00:37:31.292938 containerd[1615]: 2025-09-12 00:37:31.244 [INFO][4647] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" Namespace="calico-apiserver" Pod="calico-apiserver-797c9d4dc7-nwcb6" WorkloadEndpoint="localhost-k8s-calico--apiserver--797c9d4dc7--nwcb6-eth0" Sep 12 00:37:31.292938 
containerd[1615]: 2025-09-12 00:37:31.244 [INFO][4647] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" Namespace="calico-apiserver" Pod="calico-apiserver-797c9d4dc7-nwcb6" WorkloadEndpoint="localhost-k8s-calico--apiserver--797c9d4dc7--nwcb6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--797c9d4dc7--nwcb6-eth0", GenerateName:"calico-apiserver-797c9d4dc7-", Namespace:"calico-apiserver", SelfLink:"", UID:"dc0e806a-7a27-438c-b07b-5454371c66d4", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 37, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"797c9d4dc7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f", Pod:"calico-apiserver-797c9d4dc7-nwcb6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid83a1f8a1af", MAC:"ee:2a:33:a9:ee:0a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:37:31.292938 containerd[1615]: 2025-09-12 
00:37:31.269 [INFO][4647] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" Namespace="calico-apiserver" Pod="calico-apiserver-797c9d4dc7-nwcb6" WorkloadEndpoint="localhost-k8s-calico--apiserver--797c9d4dc7--nwcb6-eth0" Sep 12 00:37:31.317626 systemd-networkd[1536]: cali88ac5a6c44c: Gained IPv6LL Sep 12 00:37:31.331371 systemd-networkd[1536]: vxlan.calico: Link UP Sep 12 00:37:31.331378 systemd-networkd[1536]: vxlan.calico: Gained carrier Sep 12 00:37:31.367271 containerd[1615]: time="2025-09-12T00:37:31.366974877Z" level=info msg="connecting to shim 64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" address="unix:///run/containerd/s/8d6af139c11f98219a19e06b21a1e05d66a87e4a2df42f253b09304ddf215585" namespace=k8s.io protocol=ttrpc version=3 Sep 12 00:37:31.403599 systemd[1]: Started cri-containerd-64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f.scope - libcontainer container 64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f. 
Sep 12 00:37:31.423750 systemd-resolved[1482]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 00:37:31.444763 systemd-networkd[1536]: cali7ace0f85193: Link UP Sep 12 00:37:31.445427 systemd-networkd[1536]: cali7ace0f85193: Gained carrier Sep 12 00:37:31.468937 containerd[1615]: time="2025-09-12T00:37:31.468703772Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-797c9d4dc7-nwcb6,Uid:dc0e806a-7a27-438c-b07b-5454371c66d4,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f\"" Sep 12 00:37:31.488840 containerd[1615]: 2025-09-12 00:37:31.276 [INFO][4666] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--4lfjf-eth0 csi-node-driver- calico-system 3a3a008e-e1fb-47cc-bd0a-8ff12488e165 726 0 2025-09-12 00:37:05 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-4lfjf eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali7ace0f85193 [] [] }} ContainerID="9acd138f1a11de8d0f7216bf235626950eed03793d9635a8cb4329ac6364a6a9" Namespace="calico-system" Pod="csi-node-driver-4lfjf" WorkloadEndpoint="localhost-k8s-csi--node--driver--4lfjf-" Sep 12 00:37:31.488840 containerd[1615]: 2025-09-12 00:37:31.276 [INFO][4666] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9acd138f1a11de8d0f7216bf235626950eed03793d9635a8cb4329ac6364a6a9" Namespace="calico-system" Pod="csi-node-driver-4lfjf" WorkloadEndpoint="localhost-k8s-csi--node--driver--4lfjf-eth0" Sep 12 00:37:31.488840 containerd[1615]: 2025-09-12 00:37:31.394 
[INFO][4698] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9acd138f1a11de8d0f7216bf235626950eed03793d9635a8cb4329ac6364a6a9" HandleID="k8s-pod-network.9acd138f1a11de8d0f7216bf235626950eed03793d9635a8cb4329ac6364a6a9" Workload="localhost-k8s-csi--node--driver--4lfjf-eth0" Sep 12 00:37:31.488840 containerd[1615]: 2025-09-12 00:37:31.394 [INFO][4698] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9acd138f1a11de8d0f7216bf235626950eed03793d9635a8cb4329ac6364a6a9" HandleID="k8s-pod-network.9acd138f1a11de8d0f7216bf235626950eed03793d9635a8cb4329ac6364a6a9" Workload="localhost-k8s-csi--node--driver--4lfjf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f680), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-4lfjf", "timestamp":"2025-09-12 00:37:31.393573569 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 00:37:31.488840 containerd[1615]: 2025-09-12 00:37:31.394 [INFO][4698] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 00:37:31.488840 containerd[1615]: 2025-09-12 00:37:31.394 [INFO][4698] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 00:37:31.488840 containerd[1615]: 2025-09-12 00:37:31.394 [INFO][4698] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 00:37:31.488840 containerd[1615]: 2025-09-12 00:37:31.402 [INFO][4698] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9acd138f1a11de8d0f7216bf235626950eed03793d9635a8cb4329ac6364a6a9" host="localhost" Sep 12 00:37:31.488840 containerd[1615]: 2025-09-12 00:37:31.406 [INFO][4698] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 00:37:31.488840 containerd[1615]: 2025-09-12 00:37:31.411 [INFO][4698] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 00:37:31.488840 containerd[1615]: 2025-09-12 00:37:31.412 [INFO][4698] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 00:37:31.488840 containerd[1615]: 2025-09-12 00:37:31.414 [INFO][4698] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 00:37:31.488840 containerd[1615]: 2025-09-12 00:37:31.414 [INFO][4698] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.9acd138f1a11de8d0f7216bf235626950eed03793d9635a8cb4329ac6364a6a9" host="localhost" Sep 12 00:37:31.488840 containerd[1615]: 2025-09-12 00:37:31.415 [INFO][4698] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.9acd138f1a11de8d0f7216bf235626950eed03793d9635a8cb4329ac6364a6a9 Sep 12 00:37:31.488840 containerd[1615]: 2025-09-12 00:37:31.419 [INFO][4698] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.9acd138f1a11de8d0f7216bf235626950eed03793d9635a8cb4329ac6364a6a9" host="localhost" Sep 12 00:37:31.488840 containerd[1615]: 2025-09-12 00:37:31.438 [INFO][4698] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 
handle="k8s-pod-network.9acd138f1a11de8d0f7216bf235626950eed03793d9635a8cb4329ac6364a6a9" host="localhost" Sep 12 00:37:31.488840 containerd[1615]: 2025-09-12 00:37:31.438 [INFO][4698] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.9acd138f1a11de8d0f7216bf235626950eed03793d9635a8cb4329ac6364a6a9" host="localhost" Sep 12 00:37:31.488840 containerd[1615]: 2025-09-12 00:37:31.438 [INFO][4698] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 00:37:31.488840 containerd[1615]: 2025-09-12 00:37:31.438 [INFO][4698] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="9acd138f1a11de8d0f7216bf235626950eed03793d9635a8cb4329ac6364a6a9" HandleID="k8s-pod-network.9acd138f1a11de8d0f7216bf235626950eed03793d9635a8cb4329ac6364a6a9" Workload="localhost-k8s-csi--node--driver--4lfjf-eth0" Sep 12 00:37:31.489833 containerd[1615]: 2025-09-12 00:37:31.441 [INFO][4666] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9acd138f1a11de8d0f7216bf235626950eed03793d9635a8cb4329ac6364a6a9" Namespace="calico-system" Pod="csi-node-driver-4lfjf" WorkloadEndpoint="localhost-k8s-csi--node--driver--4lfjf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--4lfjf-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"3a3a008e-e1fb-47cc-bd0a-8ff12488e165", ResourceVersion:"726", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 37, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-4lfjf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali7ace0f85193", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:37:31.489833 containerd[1615]: 2025-09-12 00:37:31.441 [INFO][4666] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="9acd138f1a11de8d0f7216bf235626950eed03793d9635a8cb4329ac6364a6a9" Namespace="calico-system" Pod="csi-node-driver-4lfjf" WorkloadEndpoint="localhost-k8s-csi--node--driver--4lfjf-eth0" Sep 12 00:37:31.489833 containerd[1615]: 2025-09-12 00:37:31.441 [INFO][4666] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7ace0f85193 ContainerID="9acd138f1a11de8d0f7216bf235626950eed03793d9635a8cb4329ac6364a6a9" Namespace="calico-system" Pod="csi-node-driver-4lfjf" WorkloadEndpoint="localhost-k8s-csi--node--driver--4lfjf-eth0" Sep 12 00:37:31.489833 containerd[1615]: 2025-09-12 00:37:31.446 [INFO][4666] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9acd138f1a11de8d0f7216bf235626950eed03793d9635a8cb4329ac6364a6a9" Namespace="calico-system" Pod="csi-node-driver-4lfjf" WorkloadEndpoint="localhost-k8s-csi--node--driver--4lfjf-eth0" Sep 12 00:37:31.489833 containerd[1615]: 2025-09-12 00:37:31.447 [INFO][4666] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9acd138f1a11de8d0f7216bf235626950eed03793d9635a8cb4329ac6364a6a9" 
Namespace="calico-system" Pod="csi-node-driver-4lfjf" WorkloadEndpoint="localhost-k8s-csi--node--driver--4lfjf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--4lfjf-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"3a3a008e-e1fb-47cc-bd0a-8ff12488e165", ResourceVersion:"726", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 37, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"9acd138f1a11de8d0f7216bf235626950eed03793d9635a8cb4329ac6364a6a9", Pod:"csi-node-driver-4lfjf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali7ace0f85193", MAC:"da:36:82:26:63:8b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:37:31.489833 containerd[1615]: 2025-09-12 00:37:31.472 [INFO][4666] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9acd138f1a11de8d0f7216bf235626950eed03793d9635a8cb4329ac6364a6a9" Namespace="calico-system" Pod="csi-node-driver-4lfjf" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--4lfjf-eth0" Sep 12 00:37:31.552820 containerd[1615]: time="2025-09-12T00:37:31.550719947Z" level=info msg="connecting to shim 9acd138f1a11de8d0f7216bf235626950eed03793d9635a8cb4329ac6364a6a9" address="unix:///run/containerd/s/e7c9901e8ec75ccb9899fbb3670ca08ff2d1b6f900107d66bd656a79dac3406a" namespace=k8s.io protocol=ttrpc version=3 Sep 12 00:37:31.633397 systemd[1]: Started cri-containerd-9acd138f1a11de8d0f7216bf235626950eed03793d9635a8cb4329ac6364a6a9.scope - libcontainer container 9acd138f1a11de8d0f7216bf235626950eed03793d9635a8cb4329ac6364a6a9. Sep 12 00:37:31.645971 systemd-networkd[1536]: cali76799d4d6e7: Link UP Sep 12 00:37:31.646544 systemd-networkd[1536]: cali76799d4d6e7: Gained carrier Sep 12 00:37:31.661369 systemd-resolved[1482]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 00:37:31.683478 containerd[1615]: 2025-09-12 00:37:31.362 [INFO][4680] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--8665c66cb--hjgv7-eth0 calico-apiserver-8665c66cb- calico-apiserver 9e9d3bf8-503a-4ac7-8753-ceaf23928a49 835 0 2025-09-12 00:37:03 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:8665c66cb projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-8665c66cb-hjgv7 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali76799d4d6e7 [] [] }} ContainerID="29667b4db0c41370edbe8396c08434dccee44b4d634aef9494cf91766226134c" Namespace="calico-apiserver" Pod="calico-apiserver-8665c66cb-hjgv7" WorkloadEndpoint="localhost-k8s-calico--apiserver--8665c66cb--hjgv7-" Sep 12 00:37:31.683478 containerd[1615]: 2025-09-12 00:37:31.362 [INFO][4680] cni-plugin/k8s.go 74: 
Extracted identifiers for CmdAddK8s ContainerID="29667b4db0c41370edbe8396c08434dccee44b4d634aef9494cf91766226134c" Namespace="calico-apiserver" Pod="calico-apiserver-8665c66cb-hjgv7" WorkloadEndpoint="localhost-k8s-calico--apiserver--8665c66cb--hjgv7-eth0" Sep 12 00:37:31.683478 containerd[1615]: 2025-09-12 00:37:31.406 [INFO][4727] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="29667b4db0c41370edbe8396c08434dccee44b4d634aef9494cf91766226134c" HandleID="k8s-pod-network.29667b4db0c41370edbe8396c08434dccee44b4d634aef9494cf91766226134c" Workload="localhost-k8s-calico--apiserver--8665c66cb--hjgv7-eth0" Sep 12 00:37:31.683478 containerd[1615]: 2025-09-12 00:37:31.407 [INFO][4727] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="29667b4db0c41370edbe8396c08434dccee44b4d634aef9494cf91766226134c" HandleID="k8s-pod-network.29667b4db0c41370edbe8396c08434dccee44b4d634aef9494cf91766226134c" Workload="localhost-k8s-calico--apiserver--8665c66cb--hjgv7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5730), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-8665c66cb-hjgv7", "timestamp":"2025-09-12 00:37:31.406826134 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 00:37:31.683478 containerd[1615]: 2025-09-12 00:37:31.407 [INFO][4727] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 00:37:31.683478 containerd[1615]: 2025-09-12 00:37:31.438 [INFO][4727] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 00:37:31.683478 containerd[1615]: 2025-09-12 00:37:31.438 [INFO][4727] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 00:37:31.683478 containerd[1615]: 2025-09-12 00:37:31.507 [INFO][4727] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.29667b4db0c41370edbe8396c08434dccee44b4d634aef9494cf91766226134c" host="localhost" Sep 12 00:37:31.683478 containerd[1615]: 2025-09-12 00:37:31.526 [INFO][4727] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 00:37:31.683478 containerd[1615]: 2025-09-12 00:37:31.536 [INFO][4727] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 00:37:31.683478 containerd[1615]: 2025-09-12 00:37:31.543 [INFO][4727] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 00:37:31.683478 containerd[1615]: 2025-09-12 00:37:31.550 [INFO][4727] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 00:37:31.683478 containerd[1615]: 2025-09-12 00:37:31.552 [INFO][4727] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.29667b4db0c41370edbe8396c08434dccee44b4d634aef9494cf91766226134c" host="localhost" Sep 12 00:37:31.683478 containerd[1615]: 2025-09-12 00:37:31.558 [INFO][4727] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.29667b4db0c41370edbe8396c08434dccee44b4d634aef9494cf91766226134c Sep 12 00:37:31.683478 containerd[1615]: 2025-09-12 00:37:31.570 [INFO][4727] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.29667b4db0c41370edbe8396c08434dccee44b4d634aef9494cf91766226134c" host="localhost" Sep 12 00:37:31.683478 containerd[1615]: 2025-09-12 00:37:31.639 [INFO][4727] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 
handle="k8s-pod-network.29667b4db0c41370edbe8396c08434dccee44b4d634aef9494cf91766226134c" host="localhost" Sep 12 00:37:31.683478 containerd[1615]: 2025-09-12 00:37:31.639 [INFO][4727] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.29667b4db0c41370edbe8396c08434dccee44b4d634aef9494cf91766226134c" host="localhost" Sep 12 00:37:31.683478 containerd[1615]: 2025-09-12 00:37:31.639 [INFO][4727] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 00:37:31.683478 containerd[1615]: 2025-09-12 00:37:31.639 [INFO][4727] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="29667b4db0c41370edbe8396c08434dccee44b4d634aef9494cf91766226134c" HandleID="k8s-pod-network.29667b4db0c41370edbe8396c08434dccee44b4d634aef9494cf91766226134c" Workload="localhost-k8s-calico--apiserver--8665c66cb--hjgv7-eth0" Sep 12 00:37:31.713999 containerd[1615]: 2025-09-12 00:37:31.644 [INFO][4680] cni-plugin/k8s.go 418: Populated endpoint ContainerID="29667b4db0c41370edbe8396c08434dccee44b4d634aef9494cf91766226134c" Namespace="calico-apiserver" Pod="calico-apiserver-8665c66cb-hjgv7" WorkloadEndpoint="localhost-k8s-calico--apiserver--8665c66cb--hjgv7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--8665c66cb--hjgv7-eth0", GenerateName:"calico-apiserver-8665c66cb-", Namespace:"calico-apiserver", SelfLink:"", UID:"9e9d3bf8-503a-4ac7-8753-ceaf23928a49", ResourceVersion:"835", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 37, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8665c66cb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-8665c66cb-hjgv7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali76799d4d6e7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:37:31.713999 containerd[1615]: 2025-09-12 00:37:31.644 [INFO][4680] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="29667b4db0c41370edbe8396c08434dccee44b4d634aef9494cf91766226134c" Namespace="calico-apiserver" Pod="calico-apiserver-8665c66cb-hjgv7" WorkloadEndpoint="localhost-k8s-calico--apiserver--8665c66cb--hjgv7-eth0" Sep 12 00:37:31.713999 containerd[1615]: 2025-09-12 00:37:31.644 [INFO][4680] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali76799d4d6e7 ContainerID="29667b4db0c41370edbe8396c08434dccee44b4d634aef9494cf91766226134c" Namespace="calico-apiserver" Pod="calico-apiserver-8665c66cb-hjgv7" WorkloadEndpoint="localhost-k8s-calico--apiserver--8665c66cb--hjgv7-eth0" Sep 12 00:37:31.713999 containerd[1615]: 2025-09-12 00:37:31.647 [INFO][4680] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="29667b4db0c41370edbe8396c08434dccee44b4d634aef9494cf91766226134c" Namespace="calico-apiserver" Pod="calico-apiserver-8665c66cb-hjgv7" WorkloadEndpoint="localhost-k8s-calico--apiserver--8665c66cb--hjgv7-eth0" Sep 12 00:37:31.713999 containerd[1615]: 2025-09-12 00:37:31.647 [INFO][4680] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to 
endpoint ContainerID="29667b4db0c41370edbe8396c08434dccee44b4d634aef9494cf91766226134c" Namespace="calico-apiserver" Pod="calico-apiserver-8665c66cb-hjgv7" WorkloadEndpoint="localhost-k8s-calico--apiserver--8665c66cb--hjgv7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--8665c66cb--hjgv7-eth0", GenerateName:"calico-apiserver-8665c66cb-", Namespace:"calico-apiserver", SelfLink:"", UID:"9e9d3bf8-503a-4ac7-8753-ceaf23928a49", ResourceVersion:"835", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 37, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8665c66cb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"29667b4db0c41370edbe8396c08434dccee44b4d634aef9494cf91766226134c", Pod:"calico-apiserver-8665c66cb-hjgv7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali76799d4d6e7", MAC:"4a:16:53:57:59:23", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:37:31.713999 containerd[1615]: 2025-09-12 00:37:31.680 [INFO][4680] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="29667b4db0c41370edbe8396c08434dccee44b4d634aef9494cf91766226134c" Namespace="calico-apiserver" Pod="calico-apiserver-8665c66cb-hjgv7" WorkloadEndpoint="localhost-k8s-calico--apiserver--8665c66cb--hjgv7-eth0" Sep 12 00:37:31.928621 kubelet[2932]: I0912 00:37:31.928011 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-mbnfj" podStartSLOduration=39.927996031 podStartE2EDuration="39.927996031s" podCreationTimestamp="2025-09-12 00:36:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 00:37:31.927978561 +0000 UTC m=+46.959400780" watchObservedRunningTime="2025-09-12 00:37:31.927996031 +0000 UTC m=+46.959418251" Sep 12 00:37:32.070848 containerd[1615]: time="2025-09-12T00:37:32.070801826Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-797c9d4dc7-9jk4g,Uid:5cff39d8-e5be-4f52-9e94-587e9399478a,Namespace:calico-apiserver,Attempt:0,}" Sep 12 00:37:32.070955 containerd[1615]: time="2025-09-12T00:37:32.070943531Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-tkh46,Uid:72ef27f4-5e10-4386-b23d-7206e0e9d085,Namespace:kube-system,Attempt:0,}" Sep 12 00:37:32.522419 containerd[1615]: time="2025-09-12T00:37:32.521073073Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-4lfjf,Uid:3a3a008e-e1fb-47cc-bd0a-8ff12488e165,Namespace:calico-system,Attempt:0,} returns sandbox id \"9acd138f1a11de8d0f7216bf235626950eed03793d9635a8cb4329ac6364a6a9\"" Sep 12 00:37:32.580872 containerd[1615]: time="2025-09-12T00:37:32.580594121Z" level=info msg="connecting to shim 29667b4db0c41370edbe8396c08434dccee44b4d634aef9494cf91766226134c" address="unix:///run/containerd/s/9dd49c3a416949d09810d51f905974dbf27e9f9a391a1318fe1a954d6517a887" namespace=k8s.io protocol=ttrpc version=3 Sep 12 00:37:32.628425 systemd[1]: Started 
cri-containerd-29667b4db0c41370edbe8396c08434dccee44b4d634aef9494cf91766226134c.scope - libcontainer container 29667b4db0c41370edbe8396c08434dccee44b4d634aef9494cf91766226134c. Sep 12 00:37:32.647374 systemd-resolved[1482]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 00:37:32.699132 containerd[1615]: time="2025-09-12T00:37:32.698474779Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8665c66cb-hjgv7,Uid:9e9d3bf8-503a-4ac7-8753-ceaf23928a49,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"29667b4db0c41370edbe8396c08434dccee44b4d634aef9494cf91766226134c\"" Sep 12 00:37:32.726680 systemd-networkd[1536]: vxlan.calico: Gained IPv6LL Sep 12 00:37:32.776407 systemd-networkd[1536]: calib7063e8135c: Link UP Sep 12 00:37:32.776606 systemd-networkd[1536]: calib7063e8135c: Gained carrier Sep 12 00:37:32.789491 systemd-networkd[1536]: calid83a1f8a1af: Gained IPv6LL Sep 12 00:37:32.822907 containerd[1615]: 2025-09-12 00:37:32.635 [INFO][4885] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--674b8bbfcf--tkh46-eth0 coredns-674b8bbfcf- kube-system 72ef27f4-5e10-4386-b23d-7206e0e9d085 832 0 2025-09-12 00:36:52 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-674b8bbfcf-tkh46 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calib7063e8135c [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="aabdc3eec9789195ece72a0923d305dfb6058c78110b11845760bf7c55c7eb4b" Namespace="kube-system" Pod="coredns-674b8bbfcf-tkh46" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--tkh46-" Sep 12 00:37:32.822907 containerd[1615]: 2025-09-12 00:37:32.635 [INFO][4885] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="aabdc3eec9789195ece72a0923d305dfb6058c78110b11845760bf7c55c7eb4b" Namespace="kube-system" Pod="coredns-674b8bbfcf-tkh46" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--tkh46-eth0" Sep 12 00:37:32.822907 containerd[1615]: 2025-09-12 00:37:32.674 [INFO][4947] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="aabdc3eec9789195ece72a0923d305dfb6058c78110b11845760bf7c55c7eb4b" HandleID="k8s-pod-network.aabdc3eec9789195ece72a0923d305dfb6058c78110b11845760bf7c55c7eb4b" Workload="localhost-k8s-coredns--674b8bbfcf--tkh46-eth0" Sep 12 00:37:32.822907 containerd[1615]: 2025-09-12 00:37:32.674 [INFO][4947] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="aabdc3eec9789195ece72a0923d305dfb6058c78110b11845760bf7c55c7eb4b" HandleID="k8s-pod-network.aabdc3eec9789195ece72a0923d305dfb6058c78110b11845760bf7c55c7eb4b" Workload="localhost-k8s-coredns--674b8bbfcf--tkh46-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5030), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-674b8bbfcf-tkh46", "timestamp":"2025-09-12 00:37:32.673096678 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 00:37:32.822907 containerd[1615]: 2025-09-12 00:37:32.674 [INFO][4947] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 00:37:32.822907 containerd[1615]: 2025-09-12 00:37:32.674 [INFO][4947] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 00:37:32.822907 containerd[1615]: 2025-09-12 00:37:32.674 [INFO][4947] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 00:37:32.822907 containerd[1615]: 2025-09-12 00:37:32.701 [INFO][4947] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.aabdc3eec9789195ece72a0923d305dfb6058c78110b11845760bf7c55c7eb4b" host="localhost" Sep 12 00:37:32.822907 containerd[1615]: 2025-09-12 00:37:32.718 [INFO][4947] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 00:37:32.822907 containerd[1615]: 2025-09-12 00:37:32.731 [INFO][4947] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 00:37:32.822907 containerd[1615]: 2025-09-12 00:37:32.736 [INFO][4947] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 00:37:32.822907 containerd[1615]: 2025-09-12 00:37:32.741 [INFO][4947] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 00:37:32.822907 containerd[1615]: 2025-09-12 00:37:32.741 [INFO][4947] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.aabdc3eec9789195ece72a0923d305dfb6058c78110b11845760bf7c55c7eb4b" host="localhost" Sep 12 00:37:32.822907 containerd[1615]: 2025-09-12 00:37:32.742 [INFO][4947] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.aabdc3eec9789195ece72a0923d305dfb6058c78110b11845760bf7c55c7eb4b Sep 12 00:37:32.822907 containerd[1615]: 2025-09-12 00:37:32.748 [INFO][4947] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.aabdc3eec9789195ece72a0923d305dfb6058c78110b11845760bf7c55c7eb4b" host="localhost" Sep 12 00:37:32.822907 containerd[1615]: 2025-09-12 00:37:32.755 [INFO][4947] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 
handle="k8s-pod-network.aabdc3eec9789195ece72a0923d305dfb6058c78110b11845760bf7c55c7eb4b" host="localhost" Sep 12 00:37:32.822907 containerd[1615]: 2025-09-12 00:37:32.755 [INFO][4947] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.aabdc3eec9789195ece72a0923d305dfb6058c78110b11845760bf7c55c7eb4b" host="localhost" Sep 12 00:37:32.822907 containerd[1615]: 2025-09-12 00:37:32.756 [INFO][4947] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 00:37:32.822907 containerd[1615]: 2025-09-12 00:37:32.756 [INFO][4947] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="aabdc3eec9789195ece72a0923d305dfb6058c78110b11845760bf7c55c7eb4b" HandleID="k8s-pod-network.aabdc3eec9789195ece72a0923d305dfb6058c78110b11845760bf7c55c7eb4b" Workload="localhost-k8s-coredns--674b8bbfcf--tkh46-eth0" Sep 12 00:37:32.823875 containerd[1615]: 2025-09-12 00:37:32.761 [INFO][4885] cni-plugin/k8s.go 418: Populated endpoint ContainerID="aabdc3eec9789195ece72a0923d305dfb6058c78110b11845760bf7c55c7eb4b" Namespace="kube-system" Pod="coredns-674b8bbfcf-tkh46" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--tkh46-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--tkh46-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"72ef27f4-5e10-4386-b23d-7206e0e9d085", ResourceVersion:"832", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 36, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-674b8bbfcf-tkh46", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib7063e8135c", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:37:32.823875 containerd[1615]: 2025-09-12 00:37:32.762 [INFO][4885] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="aabdc3eec9789195ece72a0923d305dfb6058c78110b11845760bf7c55c7eb4b" Namespace="kube-system" Pod="coredns-674b8bbfcf-tkh46" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--tkh46-eth0" Sep 12 00:37:32.823875 containerd[1615]: 2025-09-12 00:37:32.762 [INFO][4885] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib7063e8135c ContainerID="aabdc3eec9789195ece72a0923d305dfb6058c78110b11845760bf7c55c7eb4b" Namespace="kube-system" Pod="coredns-674b8bbfcf-tkh46" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--tkh46-eth0" Sep 12 00:37:32.823875 containerd[1615]: 2025-09-12 00:37:32.777 [INFO][4885] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="aabdc3eec9789195ece72a0923d305dfb6058c78110b11845760bf7c55c7eb4b" Namespace="kube-system" Pod="coredns-674b8bbfcf-tkh46" 
WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--tkh46-eth0" Sep 12 00:37:32.823875 containerd[1615]: 2025-09-12 00:37:32.777 [INFO][4885] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="aabdc3eec9789195ece72a0923d305dfb6058c78110b11845760bf7c55c7eb4b" Namespace="kube-system" Pod="coredns-674b8bbfcf-tkh46" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--tkh46-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--tkh46-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"72ef27f4-5e10-4386-b23d-7206e0e9d085", ResourceVersion:"832", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 36, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"aabdc3eec9789195ece72a0923d305dfb6058c78110b11845760bf7c55c7eb4b", Pod:"coredns-674b8bbfcf-tkh46", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib7063e8135c", MAC:"d6:86:e0:fb:6c:4c", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:37:32.823875 containerd[1615]: 2025-09-12 00:37:32.808 [INFO][4885] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="aabdc3eec9789195ece72a0923d305dfb6058c78110b11845760bf7c55c7eb4b" Namespace="kube-system" Pod="coredns-674b8bbfcf-tkh46" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--tkh46-eth0" Sep 12 00:37:32.989403 systemd-networkd[1536]: calief4eafbfdc1: Link UP Sep 12 00:37:32.990051 systemd-networkd[1536]: calief4eafbfdc1: Gained carrier Sep 12 00:37:33.019850 containerd[1615]: 2025-09-12 00:37:32.635 [INFO][4896] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--797c9d4dc7--9jk4g-eth0 calico-apiserver-797c9d4dc7- calico-apiserver 5cff39d8-e5be-4f52-9e94-587e9399478a 837 0 2025-09-12 00:37:02 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:797c9d4dc7 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-797c9d4dc7-9jk4g eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calief4eafbfdc1 [] [] }} ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" Namespace="calico-apiserver" Pod="calico-apiserver-797c9d4dc7-9jk4g" WorkloadEndpoint="localhost-k8s-calico--apiserver--797c9d4dc7--9jk4g-" Sep 12 00:37:33.019850 containerd[1615]: 2025-09-12 00:37:32.635 [INFO][4896] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" 
Namespace="calico-apiserver" Pod="calico-apiserver-797c9d4dc7-9jk4g" WorkloadEndpoint="localhost-k8s-calico--apiserver--797c9d4dc7--9jk4g-eth0" Sep 12 00:37:33.019850 containerd[1615]: 2025-09-12 00:37:32.676 [INFO][4949] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" HandleID="k8s-pod-network.78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" Workload="localhost-k8s-calico--apiserver--797c9d4dc7--9jk4g-eth0" Sep 12 00:37:33.019850 containerd[1615]: 2025-09-12 00:37:32.676 [INFO][4949] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" HandleID="k8s-pod-network.78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" Workload="localhost-k8s-calico--apiserver--797c9d4dc7--9jk4g-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d4ff0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-797c9d4dc7-9jk4g", "timestamp":"2025-09-12 00:37:32.676812875 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 00:37:33.019850 containerd[1615]: 2025-09-12 00:37:32.677 [INFO][4949] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 00:37:33.019850 containerd[1615]: 2025-09-12 00:37:32.756 [INFO][4949] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 00:37:33.019850 containerd[1615]: 2025-09-12 00:37:32.756 [INFO][4949] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 00:37:33.019850 containerd[1615]: 2025-09-12 00:37:32.800 [INFO][4949] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" host="localhost" Sep 12 00:37:33.019850 containerd[1615]: 2025-09-12 00:37:32.841 [INFO][4949] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 00:37:33.019850 containerd[1615]: 2025-09-12 00:37:32.845 [INFO][4949] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 00:37:33.019850 containerd[1615]: 2025-09-12 00:37:32.848 [INFO][4949] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 00:37:33.019850 containerd[1615]: 2025-09-12 00:37:32.862 [INFO][4949] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 00:37:33.019850 containerd[1615]: 2025-09-12 00:37:32.862 [INFO][4949] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" host="localhost" Sep 12 00:37:33.019850 containerd[1615]: 2025-09-12 00:37:32.863 [INFO][4949] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14 Sep 12 00:37:33.019850 containerd[1615]: 2025-09-12 00:37:32.903 [INFO][4949] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" host="localhost" Sep 12 00:37:33.019850 containerd[1615]: 2025-09-12 00:37:32.982 [INFO][4949] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.137/26] block=192.168.88.128/26 
handle="k8s-pod-network.78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" host="localhost" Sep 12 00:37:33.019850 containerd[1615]: 2025-09-12 00:37:32.982 [INFO][4949] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.137/26] handle="k8s-pod-network.78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" host="localhost" Sep 12 00:37:33.019850 containerd[1615]: 2025-09-12 00:37:32.982 [INFO][4949] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 00:37:33.019850 containerd[1615]: 2025-09-12 00:37:32.982 [INFO][4949] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.137/26] IPv6=[] ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" HandleID="k8s-pod-network.78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" Workload="localhost-k8s-calico--apiserver--797c9d4dc7--9jk4g-eth0" Sep 12 00:37:33.036334 containerd[1615]: 2025-09-12 00:37:32.985 [INFO][4896] cni-plugin/k8s.go 418: Populated endpoint ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" Namespace="calico-apiserver" Pod="calico-apiserver-797c9d4dc7-9jk4g" WorkloadEndpoint="localhost-k8s-calico--apiserver--797c9d4dc7--9jk4g-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--797c9d4dc7--9jk4g-eth0", GenerateName:"calico-apiserver-797c9d4dc7-", Namespace:"calico-apiserver", SelfLink:"", UID:"5cff39d8-e5be-4f52-9e94-587e9399478a", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 37, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"797c9d4dc7", "projectcalico.org/namespace":"calico-apiserver", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-797c9d4dc7-9jk4g", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.137/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calief4eafbfdc1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:37:33.036334 containerd[1615]: 2025-09-12 00:37:32.985 [INFO][4896] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.137/32] ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" Namespace="calico-apiserver" Pod="calico-apiserver-797c9d4dc7-9jk4g" WorkloadEndpoint="localhost-k8s-calico--apiserver--797c9d4dc7--9jk4g-eth0" Sep 12 00:37:33.036334 containerd[1615]: 2025-09-12 00:37:32.985 [INFO][4896] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calief4eafbfdc1 ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" Namespace="calico-apiserver" Pod="calico-apiserver-797c9d4dc7-9jk4g" WorkloadEndpoint="localhost-k8s-calico--apiserver--797c9d4dc7--9jk4g-eth0" Sep 12 00:37:33.036334 containerd[1615]: 2025-09-12 00:37:32.990 [INFO][4896] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" Namespace="calico-apiserver" Pod="calico-apiserver-797c9d4dc7-9jk4g" WorkloadEndpoint="localhost-k8s-calico--apiserver--797c9d4dc7--9jk4g-eth0" Sep 12 00:37:33.036334 containerd[1615]: 2025-09-12 00:37:32.990 [INFO][4896] cni-plugin/k8s.go 446: Added 
Mac, interface name, and active container ID to endpoint ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" Namespace="calico-apiserver" Pod="calico-apiserver-797c9d4dc7-9jk4g" WorkloadEndpoint="localhost-k8s-calico--apiserver--797c9d4dc7--9jk4g-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--797c9d4dc7--9jk4g-eth0", GenerateName:"calico-apiserver-797c9d4dc7-", Namespace:"calico-apiserver", SelfLink:"", UID:"5cff39d8-e5be-4f52-9e94-587e9399478a", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 37, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"797c9d4dc7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14", Pod:"calico-apiserver-797c9d4dc7-9jk4g", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.137/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calief4eafbfdc1", MAC:"9e:9f:c1:88:01:7d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:37:33.036334 containerd[1615]: 2025-09-12 00:37:33.016 [INFO][4896] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" Namespace="calico-apiserver" Pod="calico-apiserver-797c9d4dc7-9jk4g" WorkloadEndpoint="localhost-k8s-calico--apiserver--797c9d4dc7--9jk4g-eth0" Sep 12 00:37:33.046339 systemd-networkd[1536]: cali76799d4d6e7: Gained IPv6LL Sep 12 00:37:33.109546 systemd-networkd[1536]: cali7ace0f85193: Gained IPv6LL Sep 12 00:37:33.601305 containerd[1615]: time="2025-09-12T00:37:33.600678259Z" level=info msg="connecting to shim aabdc3eec9789195ece72a0923d305dfb6058c78110b11845760bf7c55c7eb4b" address="unix:///run/containerd/s/ab3c18e91a418de0d97f5497440f403661b493c6f1818296b45b515c2920ce0c" namespace=k8s.io protocol=ttrpc version=3 Sep 12 00:37:33.631423 systemd[1]: Started cri-containerd-aabdc3eec9789195ece72a0923d305dfb6058c78110b11845760bf7c55c7eb4b.scope - libcontainer container aabdc3eec9789195ece72a0923d305dfb6058c78110b11845760bf7c55c7eb4b. Sep 12 00:37:33.649898 systemd-resolved[1482]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 00:37:33.696763 containerd[1615]: time="2025-09-12T00:37:33.696712921Z" level=info msg="connecting to shim 78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" address="unix:///run/containerd/s/c8603fd455de2a75bb65c2bb9d4992cd76c6db4d6d3196064434f2558efe3598" namespace=k8s.io protocol=ttrpc version=3 Sep 12 00:37:33.723709 systemd[1]: Started cri-containerd-78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14.scope - libcontainer container 78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14. 
Sep 12 00:37:33.735817 systemd-resolved[1482]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 00:37:33.749454 containerd[1615]: time="2025-09-12T00:37:33.749420548Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-tkh46,Uid:72ef27f4-5e10-4386-b23d-7206e0e9d085,Namespace:kube-system,Attempt:0,} returns sandbox id \"aabdc3eec9789195ece72a0923d305dfb6058c78110b11845760bf7c55c7eb4b\"" Sep 12 00:37:33.797692 containerd[1615]: time="2025-09-12T00:37:33.797608676Z" level=info msg="CreateContainer within sandbox \"aabdc3eec9789195ece72a0923d305dfb6058c78110b11845760bf7c55c7eb4b\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 00:37:33.798725 containerd[1615]: time="2025-09-12T00:37:33.798409605Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-797c9d4dc7-9jk4g,Uid:5cff39d8-e5be-4f52-9e94-587e9399478a,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14\"" Sep 12 00:37:33.830802 containerd[1615]: time="2025-09-12T00:37:33.830761224Z" level=info msg="Container 57b09de106df2b6e0a298066abc2e3664795b3d5941b69febd1ff9b6ba39c4a8: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:37:33.891068 containerd[1615]: time="2025-09-12T00:37:33.890997014Z" level=info msg="CreateContainer within sandbox \"aabdc3eec9789195ece72a0923d305dfb6058c78110b11845760bf7c55c7eb4b\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"57b09de106df2b6e0a298066abc2e3664795b3d5941b69febd1ff9b6ba39c4a8\"" Sep 12 00:37:33.892402 containerd[1615]: time="2025-09-12T00:37:33.892380409Z" level=info msg="StartContainer for \"57b09de106df2b6e0a298066abc2e3664795b3d5941b69febd1ff9b6ba39c4a8\"" Sep 12 00:37:33.894374 containerd[1615]: time="2025-09-12T00:37:33.894353330Z" level=info msg="connecting to shim 57b09de106df2b6e0a298066abc2e3664795b3d5941b69febd1ff9b6ba39c4a8" 
address="unix:///run/containerd/s/ab3c18e91a418de0d97f5497440f403661b493c6f1818296b45b515c2920ce0c" protocol=ttrpc version=3 Sep 12 00:37:33.927610 systemd[1]: Started cri-containerd-57b09de106df2b6e0a298066abc2e3664795b3d5941b69febd1ff9b6ba39c4a8.scope - libcontainer container 57b09de106df2b6e0a298066abc2e3664795b3d5941b69febd1ff9b6ba39c4a8. Sep 12 00:37:33.985671 containerd[1615]: time="2025-09-12T00:37:33.985481666Z" level=info msg="StartContainer for \"57b09de106df2b6e0a298066abc2e3664795b3d5941b69febd1ff9b6ba39c4a8\" returns successfully" Sep 12 00:37:34.069623 systemd-networkd[1536]: calib7063e8135c: Gained IPv6LL Sep 12 00:37:34.583911 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount422537655.mount: Deactivated successfully. Sep 12 00:37:34.755029 kubelet[2932]: I0912 00:37:34.753668 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-tkh46" podStartSLOduration=42.753561768 podStartE2EDuration="42.753561768s" podCreationTimestamp="2025-09-12 00:36:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 00:37:34.752801375 +0000 UTC m=+49.784223594" watchObservedRunningTime="2025-09-12 00:37:34.753561768 +0000 UTC m=+49.784983987" Sep 12 00:37:34.837393 systemd-networkd[1536]: calief4eafbfdc1: Gained IPv6LL Sep 12 00:37:36.759555 containerd[1615]: time="2025-09-12T00:37:36.759513085Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:36.802524 containerd[1615]: time="2025-09-12T00:37:36.802418633Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 12 00:37:36.957264 containerd[1615]: time="2025-09-12T00:37:36.956620669Z" level=info msg="ImageCreate event 
name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:37.113715 containerd[1615]: time="2025-09-12T00:37:37.113488999Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:37.116764 containerd[1615]: time="2025-09-12T00:37:37.116742955Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 7.054574285s" Sep 12 00:37:37.116882 containerd[1615]: time="2025-09-12T00:37:37.116870101Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 12 00:37:37.118174 containerd[1615]: time="2025-09-12T00:37:37.118161579Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 12 00:37:37.392773 containerd[1615]: time="2025-09-12T00:37:37.392517736Z" level=info msg="CreateContainer within sandbox \"2be6456b4f7afdace3193046273582d7fe4e96fa4f1a1fc2b760e37e2cae8332\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 12 00:37:37.874466 containerd[1615]: time="2025-09-12T00:37:37.874399832Z" level=info msg="Container c12256eb5c449b8dc6fcf684dc25db2d846b2223dc6059ecaa4cd1eb8a55ab43: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:37:37.906115 containerd[1615]: time="2025-09-12T00:37:37.905988789Z" level=info msg="CreateContainer within sandbox \"2be6456b4f7afdace3193046273582d7fe4e96fa4f1a1fc2b760e37e2cae8332\" for 
&ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"c12256eb5c449b8dc6fcf684dc25db2d846b2223dc6059ecaa4cd1eb8a55ab43\"" Sep 12 00:37:37.907033 containerd[1615]: time="2025-09-12T00:37:37.907004898Z" level=info msg="StartContainer for \"c12256eb5c449b8dc6fcf684dc25db2d846b2223dc6059ecaa4cd1eb8a55ab43\"" Sep 12 00:37:37.907903 containerd[1615]: time="2025-09-12T00:37:37.907858722Z" level=info msg="connecting to shim c12256eb5c449b8dc6fcf684dc25db2d846b2223dc6059ecaa4cd1eb8a55ab43" address="unix:///run/containerd/s/3d7c4987941d48c8e2801b8b5582ce845a1f5e6f8ff9baf52011b2362db71cd9" protocol=ttrpc version=3 Sep 12 00:37:38.032343 systemd[1]: Started cri-containerd-c12256eb5c449b8dc6fcf684dc25db2d846b2223dc6059ecaa4cd1eb8a55ab43.scope - libcontainer container c12256eb5c449b8dc6fcf684dc25db2d846b2223dc6059ecaa4cd1eb8a55ab43. Sep 12 00:37:38.095863 containerd[1615]: time="2025-09-12T00:37:38.095835640Z" level=info msg="StartContainer for \"c12256eb5c449b8dc6fcf684dc25db2d846b2223dc6059ecaa4cd1eb8a55ab43\" returns successfully" Sep 12 00:37:39.008710 containerd[1615]: time="2025-09-12T00:37:39.008680979Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c12256eb5c449b8dc6fcf684dc25db2d846b2223dc6059ecaa4cd1eb8a55ab43\" id:\"df4d1dd2d2c3b3c291c262ec4433efb3216ea78ec912e3c6320f953abc4abce0\" pid:5194 exit_status:1 exited_at:{seconds:1757637459 nanos:8161375}" Sep 12 00:37:39.914578 containerd[1615]: time="2025-09-12T00:37:39.914390159Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c12256eb5c449b8dc6fcf684dc25db2d846b2223dc6059ecaa4cd1eb8a55ab43\" id:\"195d947e47f4a340108df3502bf58500a483a6babea6ff75b868ce702ec9a250\" pid:5216 exit_status:1 exited_at:{seconds:1757637459 nanos:914196346}" Sep 12 00:37:40.185915 containerd[1615]: time="2025-09-12T00:37:40.185800850Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c12256eb5c449b8dc6fcf684dc25db2d846b2223dc6059ecaa4cd1eb8a55ab43\" 
id:\"8d41a07f882713e91e3e984a22b4482cc905df11ab3e0505df569675f81260e5\" pid:5238 exited_at:{seconds:1757637460 nanos:185513998}" Sep 12 00:37:40.906723 containerd[1615]: time="2025-09-12T00:37:40.906686360Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c12256eb5c449b8dc6fcf684dc25db2d846b2223dc6059ecaa4cd1eb8a55ab43\" id:\"4404eefc6bfc66e222054471f3cdacb68e05e8f91fb1b8b2d28d2cbb545f0e8d\" pid:5260 exit_status:1 exited_at:{seconds:1757637460 nanos:906340840}" Sep 12 00:37:43.270302 containerd[1615]: time="2025-09-12T00:37:43.270118265Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:43.323471 containerd[1615]: time="2025-09-12T00:37:43.323430897Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 12 00:37:43.371725 containerd[1615]: time="2025-09-12T00:37:43.371664852Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:43.480128 containerd[1615]: time="2025-09-12T00:37:43.480078431Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:43.481237 containerd[1615]: time="2025-09-12T00:37:43.481192910Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 6.362795443s" Sep 12 00:37:43.481481 containerd[1615]: 
time="2025-09-12T00:37:43.481226614Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 12 00:37:43.482363 containerd[1615]: time="2025-09-12T00:37:43.482347602Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 12 00:37:43.906960 containerd[1615]: time="2025-09-12T00:37:43.906930438Z" level=info msg="CreateContainer within sandbox \"1e745d82f384188e6271d51aae406bbe20f573218d351b8ea956787140f91fe8\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 12 00:37:43.976452 containerd[1615]: time="2025-09-12T00:37:43.976419099Z" level=info msg="Container f80a3a15970097a848a736b98be810bb36140785b4c23bdf867b1df30fa99b6f: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:37:43.980788 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1016396144.mount: Deactivated successfully. Sep 12 00:37:43.994057 containerd[1615]: time="2025-09-12T00:37:43.994024657Z" level=info msg="CreateContainer within sandbox \"1e745d82f384188e6271d51aae406bbe20f573218d351b8ea956787140f91fe8\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"f80a3a15970097a848a736b98be810bb36140785b4c23bdf867b1df30fa99b6f\"" Sep 12 00:37:43.995560 containerd[1615]: time="2025-09-12T00:37:43.995096612Z" level=info msg="StartContainer for \"f80a3a15970097a848a736b98be810bb36140785b4c23bdf867b1df30fa99b6f\"" Sep 12 00:37:43.995923 containerd[1615]: time="2025-09-12T00:37:43.995901256Z" level=info msg="connecting to shim f80a3a15970097a848a736b98be810bb36140785b4c23bdf867b1df30fa99b6f" address="unix:///run/containerd/s/831f4d9b4583844d36fd412ab2c598ef94618f8585238df504b3fa17f61bd13c" protocol=ttrpc version=3 Sep 12 00:37:44.021405 systemd[1]: Started cri-containerd-f80a3a15970097a848a736b98be810bb36140785b4c23bdf867b1df30fa99b6f.scope - libcontainer container 
f80a3a15970097a848a736b98be810bb36140785b4c23bdf867b1df30fa99b6f. Sep 12 00:37:44.077428 containerd[1615]: time="2025-09-12T00:37:44.077376811Z" level=info msg="StartContainer for \"f80a3a15970097a848a736b98be810bb36140785b4c23bdf867b1df30fa99b6f\" returns successfully" Sep 12 00:37:44.868622 kubelet[2932]: I0912 00:37:44.867903 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-bjqdk" podStartSLOduration=33.218271253 podStartE2EDuration="40.867886634s" podCreationTimestamp="2025-09-12 00:37:04 +0000 UTC" firstStartedPulling="2025-09-12 00:37:29.468447997 +0000 UTC m=+44.499870206" lastFinishedPulling="2025-09-12 00:37:37.118063379 +0000 UTC m=+52.149485587" observedRunningTime="2025-09-12 00:37:38.86126724 +0000 UTC m=+53.892689450" watchObservedRunningTime="2025-09-12 00:37:44.867886634 +0000 UTC m=+59.899308849" Sep 12 00:37:44.868622 kubelet[2932]: I0912 00:37:44.868007 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-7585d978b7-fk9q8" podStartSLOduration=25.965333433 podStartE2EDuration="39.868002578s" podCreationTimestamp="2025-09-12 00:37:05 +0000 UTC" firstStartedPulling="2025-09-12 00:37:29.579461373 +0000 UTC m=+44.610883583" lastFinishedPulling="2025-09-12 00:37:43.482130516 +0000 UTC m=+58.513552728" observedRunningTime="2025-09-12 00:37:44.866422492 +0000 UTC m=+59.897844739" watchObservedRunningTime="2025-09-12 00:37:44.868002578 +0000 UTC m=+59.899424799" Sep 12 00:37:44.905599 containerd[1615]: time="2025-09-12T00:37:44.905560134Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f80a3a15970097a848a736b98be810bb36140785b4c23bdf867b1df30fa99b6f\" id:\"2df061ff879a6f2058d2c08b66928675c82283f3e9be5e9193f540aa175416f7\" pid:5339 exited_at:{seconds:1757637464 nanos:905010298}" Sep 12 00:37:46.023736 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount441417454.mount: Deactivated successfully. 
Sep 12 00:37:46.088606 containerd[1615]: time="2025-09-12T00:37:46.088566342Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:46.089701 containerd[1615]: time="2025-09-12T00:37:46.089678880Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 12 00:37:46.092577 containerd[1615]: time="2025-09-12T00:37:46.090240085Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:46.093006 containerd[1615]: time="2025-09-12T00:37:46.092987679Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:46.093801 containerd[1615]: time="2025-09-12T00:37:46.093721448Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 2.611292754s" Sep 12 00:37:46.093801 containerd[1615]: time="2025-09-12T00:37:46.093741365Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 12 00:37:46.107939 containerd[1615]: time="2025-09-12T00:37:46.107904555Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 00:37:46.139312 containerd[1615]: time="2025-09-12T00:37:46.139239460Z" level=info msg="CreateContainer within sandbox 
\"d36afea7379dc7c98a6976352c998efe65ee2bde47f3baf8d7974f5c5eb05903\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 12 00:37:46.170911 containerd[1615]: time="2025-09-12T00:37:46.164632144Z" level=info msg="Container 30c1d833c43e108a761942144bb630176e6b7504c52e126ea65dd8bf916e9d6b: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:37:46.176991 containerd[1615]: time="2025-09-12T00:37:46.176844910Z" level=info msg="CreateContainer within sandbox \"d36afea7379dc7c98a6976352c998efe65ee2bde47f3baf8d7974f5c5eb05903\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"30c1d833c43e108a761942144bb630176e6b7504c52e126ea65dd8bf916e9d6b\"" Sep 12 00:37:46.178275 containerd[1615]: time="2025-09-12T00:37:46.177879531Z" level=info msg="StartContainer for \"30c1d833c43e108a761942144bb630176e6b7504c52e126ea65dd8bf916e9d6b\"" Sep 12 00:37:46.179319 containerd[1615]: time="2025-09-12T00:37:46.179065753Z" level=info msg="connecting to shim 30c1d833c43e108a761942144bb630176e6b7504c52e126ea65dd8bf916e9d6b" address="unix:///run/containerd/s/2529c0a702568fa20af64464287d9805524c4264acc82f81760cab23ae711272" protocol=ttrpc version=3 Sep 12 00:37:46.225419 systemd[1]: Started cri-containerd-30c1d833c43e108a761942144bb630176e6b7504c52e126ea65dd8bf916e9d6b.scope - libcontainer container 30c1d833c43e108a761942144bb630176e6b7504c52e126ea65dd8bf916e9d6b. 
Sep 12 00:37:46.315013 containerd[1615]: time="2025-09-12T00:37:46.314501443Z" level=info msg="StartContainer for \"30c1d833c43e108a761942144bb630176e6b7504c52e126ea65dd8bf916e9d6b\" returns successfully" Sep 12 00:37:49.798235 containerd[1615]: time="2025-09-12T00:37:49.798188595Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:49.803265 containerd[1615]: time="2025-09-12T00:37:49.802769104Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 12 00:37:49.815863 containerd[1615]: time="2025-09-12T00:37:49.815820552Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:49.829954 containerd[1615]: time="2025-09-12T00:37:49.829618733Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:49.829954 containerd[1615]: time="2025-09-12T00:37:49.829852197Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 3.721923391s" Sep 12 00:37:49.829954 containerd[1615]: time="2025-09-12T00:37:49.829870821Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 12 00:37:49.833495 containerd[1615]: time="2025-09-12T00:37:49.833479847Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 12 00:37:49.849556 containerd[1615]: time="2025-09-12T00:37:49.849532651Z" level=info msg="CreateContainer within sandbox \"64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 00:37:49.860210 containerd[1615]: time="2025-09-12T00:37:49.860187360Z" level=info msg="Container 77206b2264cf0e87f85536db5b7068f07d62b05ed823d29e40f4b84498c82052: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:37:49.866737 containerd[1615]: time="2025-09-12T00:37:49.866709927Z" level=info msg="CreateContainer within sandbox \"64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"77206b2264cf0e87f85536db5b7068f07d62b05ed823d29e40f4b84498c82052\"" Sep 12 00:37:49.867615 containerd[1615]: time="2025-09-12T00:37:49.867602815Z" level=info msg="StartContainer for \"77206b2264cf0e87f85536db5b7068f07d62b05ed823d29e40f4b84498c82052\"" Sep 12 00:37:49.868863 containerd[1615]: time="2025-09-12T00:37:49.868476618Z" level=info msg="connecting to shim 77206b2264cf0e87f85536db5b7068f07d62b05ed823d29e40f4b84498c82052" address="unix:///run/containerd/s/8d6af139c11f98219a19e06b21a1e05d66a87e4a2df42f253b09304ddf215585" protocol=ttrpc version=3 Sep 12 00:37:49.893390 systemd[1]: Started cri-containerd-77206b2264cf0e87f85536db5b7068f07d62b05ed823d29e40f4b84498c82052.scope - libcontainer container 77206b2264cf0e87f85536db5b7068f07d62b05ed823d29e40f4b84498c82052. 
Sep 12 00:37:49.962239 containerd[1615]: time="2025-09-12T00:37:49.962213191Z" level=info msg="StartContainer for \"77206b2264cf0e87f85536db5b7068f07d62b05ed823d29e40f4b84498c82052\" returns successfully" Sep 12 00:37:50.973730 kubelet[2932]: I0912 00:37:50.969121 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-797c9d4dc7-nwcb6" podStartSLOduration=30.594129117 podStartE2EDuration="48.953119028s" podCreationTimestamp="2025-09-12 00:37:02 +0000 UTC" firstStartedPulling="2025-09-12 00:37:31.471462614 +0000 UTC m=+46.502884827" lastFinishedPulling="2025-09-12 00:37:49.83045253 +0000 UTC m=+64.861874738" observedRunningTime="2025-09-12 00:37:50.946254591 +0000 UTC m=+65.977676801" watchObservedRunningTime="2025-09-12 00:37:50.953119028 +0000 UTC m=+65.984541246" Sep 12 00:37:50.973730 kubelet[2932]: I0912 00:37:50.973536 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-6d4d9c456b-k8224" podStartSLOduration=7.258202586 podStartE2EDuration="24.973522959s" podCreationTimestamp="2025-09-12 00:37:26 +0000 UTC" firstStartedPulling="2025-09-12 00:37:28.385924333 +0000 UTC m=+43.417346542" lastFinishedPulling="2025-09-12 00:37:46.101244704 +0000 UTC m=+61.132666915" observedRunningTime="2025-09-12 00:37:46.929844376 +0000 UTC m=+61.961266595" watchObservedRunningTime="2025-09-12 00:37:50.973522959 +0000 UTC m=+66.004945172" Sep 12 00:37:51.408429 containerd[1615]: time="2025-09-12T00:37:51.408398980Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:51.409775 containerd[1615]: time="2025-09-12T00:37:51.409758470Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 12 00:37:51.410230 containerd[1615]: time="2025-09-12T00:37:51.410212011Z" level=info msg="ImageCreate event 
name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:51.411187 containerd[1615]: time="2025-09-12T00:37:51.411170197Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:51.411747 containerd[1615]: time="2025-09-12T00:37:51.411726524Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 1.578231716s" Sep 12 00:37:51.411747 containerd[1615]: time="2025-09-12T00:37:51.411746267Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 12 00:37:51.412442 containerd[1615]: time="2025-09-12T00:37:51.412422833Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 00:37:51.451858 containerd[1615]: time="2025-09-12T00:37:51.451829573Z" level=info msg="CreateContainer within sandbox \"9acd138f1a11de8d0f7216bf235626950eed03793d9635a8cb4329ac6364a6a9\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 12 00:37:51.486466 containerd[1615]: time="2025-09-12T00:37:51.485473356Z" level=info msg="Container 4160a06f3e63f9314b709333420ce4867948ae99545eec03714a58fdd8a52d02: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:37:51.487583 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount713756347.mount: Deactivated successfully. 
Sep 12 00:37:51.506748 containerd[1615]: time="2025-09-12T00:37:51.506718678Z" level=info msg="CreateContainer within sandbox \"9acd138f1a11de8d0f7216bf235626950eed03793d9635a8cb4329ac6364a6a9\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"4160a06f3e63f9314b709333420ce4867948ae99545eec03714a58fdd8a52d02\"" Sep 12 00:37:51.507331 containerd[1615]: time="2025-09-12T00:37:51.507299578Z" level=info msg="StartContainer for \"4160a06f3e63f9314b709333420ce4867948ae99545eec03714a58fdd8a52d02\"" Sep 12 00:37:51.509736 containerd[1615]: time="2025-09-12T00:37:51.509617079Z" level=info msg="connecting to shim 4160a06f3e63f9314b709333420ce4867948ae99545eec03714a58fdd8a52d02" address="unix:///run/containerd/s/e7c9901e8ec75ccb9899fbb3670ca08ff2d1b6f900107d66bd656a79dac3406a" protocol=ttrpc version=3 Sep 12 00:37:51.534442 systemd[1]: Started cri-containerd-4160a06f3e63f9314b709333420ce4867948ae99545eec03714a58fdd8a52d02.scope - libcontainer container 4160a06f3e63f9314b709333420ce4867948ae99545eec03714a58fdd8a52d02. 
Sep 12 00:37:51.570006 containerd[1615]: time="2025-09-12T00:37:51.569884132Z" level=info msg="StartContainer for \"4160a06f3e63f9314b709333420ce4867948ae99545eec03714a58fdd8a52d02\" returns successfully" Sep 12 00:37:51.823082 containerd[1615]: time="2025-09-12T00:37:51.823042933Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:51.823583 containerd[1615]: time="2025-09-12T00:37:51.823559091Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 12 00:37:51.824952 containerd[1615]: time="2025-09-12T00:37:51.824937727Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 412.430753ms" Sep 12 00:37:51.825023 containerd[1615]: time="2025-09-12T00:37:51.825013535Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 12 00:37:51.826640 containerd[1615]: time="2025-09-12T00:37:51.826610727Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 00:37:51.829238 containerd[1615]: time="2025-09-12T00:37:51.829196997Z" level=info msg="CreateContainer within sandbox \"29667b4db0c41370edbe8396c08434dccee44b4d634aef9494cf91766226134c\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 00:37:51.841759 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4226460058.mount: Deactivated successfully. 
Sep 12 00:37:51.842413 containerd[1615]: time="2025-09-12T00:37:51.841950198Z" level=info msg="Container c386a5f8281a34f67a93d2950157ddbd09ae1ead9282fbfc4aecd4e5d7d55024: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:37:51.851893 containerd[1615]: time="2025-09-12T00:37:51.851871549Z" level=info msg="CreateContainer within sandbox \"29667b4db0c41370edbe8396c08434dccee44b4d634aef9494cf91766226134c\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"c386a5f8281a34f67a93d2950157ddbd09ae1ead9282fbfc4aecd4e5d7d55024\"" Sep 12 00:37:51.852765 containerd[1615]: time="2025-09-12T00:37:51.852728431Z" level=info msg="StartContainer for \"c386a5f8281a34f67a93d2950157ddbd09ae1ead9282fbfc4aecd4e5d7d55024\"" Sep 12 00:37:51.854029 containerd[1615]: time="2025-09-12T00:37:51.853924834Z" level=info msg="connecting to shim c386a5f8281a34f67a93d2950157ddbd09ae1ead9282fbfc4aecd4e5d7d55024" address="unix:///run/containerd/s/9dd49c3a416949d09810d51f905974dbf27e9f9a391a1318fe1a954d6517a887" protocol=ttrpc version=3 Sep 12 00:37:51.868390 systemd[1]: Started cri-containerd-c386a5f8281a34f67a93d2950157ddbd09ae1ead9282fbfc4aecd4e5d7d55024.scope - libcontainer container c386a5f8281a34f67a93d2950157ddbd09ae1ead9282fbfc4aecd4e5d7d55024. 
Sep 12 00:37:51.913359 containerd[1615]: time="2025-09-12T00:37:51.913294534Z" level=info msg="StartContainer for \"c386a5f8281a34f67a93d2950157ddbd09ae1ead9282fbfc4aecd4e5d7d55024\" returns successfully" Sep 12 00:37:52.002440 kubelet[2932]: I0912 00:37:52.002420 2932 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 00:37:52.213230 containerd[1615]: time="2025-09-12T00:37:52.213153313Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:52.214212 containerd[1615]: time="2025-09-12T00:37:52.213975058Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 12 00:37:52.217259 containerd[1615]: time="2025-09-12T00:37:52.216021544Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 389.383522ms" Sep 12 00:37:52.217323 containerd[1615]: time="2025-09-12T00:37:52.217278289Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 12 00:37:52.217910 containerd[1615]: time="2025-09-12T00:37:52.217892369Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 12 00:37:52.220861 containerd[1615]: time="2025-09-12T00:37:52.220835287Z" level=info msg="CreateContainer within sandbox \"78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 00:37:52.223903 containerd[1615]: time="2025-09-12T00:37:52.223878288Z" level=info msg="Container 
06d797922abbe8c3c5546a686256c97fbd540b2fc1629a2304e5e5cbd1b6725f: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:37:52.238219 containerd[1615]: time="2025-09-12T00:37:52.238183822Z" level=info msg="CreateContainer within sandbox \"78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"06d797922abbe8c3c5546a686256c97fbd540b2fc1629a2304e5e5cbd1b6725f\"" Sep 12 00:37:52.239561 containerd[1615]: time="2025-09-12T00:37:52.239548938Z" level=info msg="StartContainer for \"06d797922abbe8c3c5546a686256c97fbd540b2fc1629a2304e5e5cbd1b6725f\"" Sep 12 00:37:52.240321 containerd[1615]: time="2025-09-12T00:37:52.240281055Z" level=info msg="connecting to shim 06d797922abbe8c3c5546a686256c97fbd540b2fc1629a2304e5e5cbd1b6725f" address="unix:///run/containerd/s/c8603fd455de2a75bb65c2bb9d4992cd76c6db4d6d3196064434f2558efe3598" protocol=ttrpc version=3 Sep 12 00:37:52.260343 systemd[1]: Started cri-containerd-06d797922abbe8c3c5546a686256c97fbd540b2fc1629a2304e5e5cbd1b6725f.scope - libcontainer container 06d797922abbe8c3c5546a686256c97fbd540b2fc1629a2304e5e5cbd1b6725f. 
Sep 12 00:37:52.303259 containerd[1615]: time="2025-09-12T00:37:52.303199483Z" level=info msg="StartContainer for \"06d797922abbe8c3c5546a686256c97fbd540b2fc1629a2304e5e5cbd1b6725f\" returns successfully" Sep 12 00:37:53.061743 kubelet[2932]: I0912 00:37:53.061581 2932 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 00:37:53.077590 kubelet[2932]: I0912 00:37:53.077551 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-8665c66cb-hjgv7" podStartSLOduration=30.983833517 podStartE2EDuration="50.077536942s" podCreationTimestamp="2025-09-12 00:37:03 +0000 UTC" firstStartedPulling="2025-09-12 00:37:32.732085003 +0000 UTC m=+47.763507213" lastFinishedPulling="2025-09-12 00:37:51.825788428 +0000 UTC m=+66.857210638" observedRunningTime="2025-09-12 00:37:52.0293527 +0000 UTC m=+67.060774921" watchObservedRunningTime="2025-09-12 00:37:53.077536942 +0000 UTC m=+68.108959155" Sep 12 00:37:53.079583 kubelet[2932]: I0912 00:37:53.079262 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-797c9d4dc7-9jk4g" podStartSLOduration=32.667502404 podStartE2EDuration="51.079241331s" podCreationTimestamp="2025-09-12 00:37:02 +0000 UTC" firstStartedPulling="2025-09-12 00:37:33.806013404 +0000 UTC m=+48.837435615" lastFinishedPulling="2025-09-12 00:37:52.217752335 +0000 UTC m=+67.249174542" observedRunningTime="2025-09-12 00:37:53.078959635 +0000 UTC m=+68.110381858" watchObservedRunningTime="2025-09-12 00:37:53.079241331 +0000 UTC m=+68.110663549" Sep 12 00:37:54.176344 containerd[1615]: time="2025-09-12T00:37:54.176311338Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:54.180401 containerd[1615]: time="2025-09-12T00:37:54.176904934Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 12 00:37:54.180401 containerd[1615]: time="2025-09-12T00:37:54.177345567Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:54.180739 containerd[1615]: time="2025-09-12T00:37:54.180480717Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:37:54.182307 containerd[1615]: time="2025-09-12T00:37:54.180855996Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 1.962893954s" Sep 12 00:37:54.182307 containerd[1615]: time="2025-09-12T00:37:54.180870428Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 12 00:37:54.184611 containerd[1615]: time="2025-09-12T00:37:54.184594771Z" level=info msg="CreateContainer within sandbox \"9acd138f1a11de8d0f7216bf235626950eed03793d9635a8cb4329ac6364a6a9\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 12 00:37:54.192092 containerd[1615]: time="2025-09-12T00:37:54.192066216Z" level=info msg="Container 8130b114f25e46af04f631b3c7fef22f1fa601f0d3dd835fc78791f481b8cad4: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:37:54.198719 containerd[1615]: time="2025-09-12T00:37:54.198700534Z" level=info msg="CreateContainer 
within sandbox \"9acd138f1a11de8d0f7216bf235626950eed03793d9635a8cb4329ac6364a6a9\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"8130b114f25e46af04f631b3c7fef22f1fa601f0d3dd835fc78791f481b8cad4\"" Sep 12 00:37:54.199479 containerd[1615]: time="2025-09-12T00:37:54.199463515Z" level=info msg="StartContainer for \"8130b114f25e46af04f631b3c7fef22f1fa601f0d3dd835fc78791f481b8cad4\"" Sep 12 00:37:54.200545 containerd[1615]: time="2025-09-12T00:37:54.200524992Z" level=info msg="connecting to shim 8130b114f25e46af04f631b3c7fef22f1fa601f0d3dd835fc78791f481b8cad4" address="unix:///run/containerd/s/e7c9901e8ec75ccb9899fbb3670ca08ff2d1b6f900107d66bd656a79dac3406a" protocol=ttrpc version=3 Sep 12 00:37:54.242426 systemd[1]: Started cri-containerd-8130b114f25e46af04f631b3c7fef22f1fa601f0d3dd835fc78791f481b8cad4.scope - libcontainer container 8130b114f25e46af04f631b3c7fef22f1fa601f0d3dd835fc78791f481b8cad4. Sep 12 00:37:54.280370 containerd[1615]: time="2025-09-12T00:37:54.280297836Z" level=info msg="StartContainer for \"8130b114f25e46af04f631b3c7fef22f1fa601f0d3dd835fc78791f481b8cad4\" returns successfully" Sep 12 00:37:55.092224 kubelet[2932]: I0912 00:37:55.092017 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-4lfjf" podStartSLOduration=28.434199115 podStartE2EDuration="50.092003488s" podCreationTimestamp="2025-09-12 00:37:05 +0000 UTC" firstStartedPulling="2025-09-12 00:37:32.523553612 +0000 UTC m=+47.554975819" lastFinishedPulling="2025-09-12 00:37:54.181357985 +0000 UTC m=+69.212780192" observedRunningTime="2025-09-12 00:37:55.090931461 +0000 UTC m=+70.122353679" watchObservedRunningTime="2025-09-12 00:37:55.092003488 +0000 UTC m=+70.123425708" Sep 12 00:37:55.392973 kubelet[2932]: I0912 00:37:55.388292 2932 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock 
versions: 1.0.0 Sep 12 00:37:55.395343 kubelet[2932]: I0912 00:37:55.395291 2932 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 12 00:37:57.732739 containerd[1615]: time="2025-09-12T00:37:57.732644826Z" level=info msg="TaskExit event in podsandbox handler container_id:\"baeba706ef8610b1da4c71a68aaaa9e8bb19702c4daf7338523b9f0bded9d6ee\" id:\"8d4f26b30a4212523910b393ebcdd0c8f480c33639086766c7eaefbb06af115b\" pid:5607 exited_at:{seconds:1757637477 nanos:691093506}" Sep 12 00:38:06.739780 kubelet[2932]: I0912 00:38:06.712180 2932 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 00:38:12.098754 kubelet[2932]: I0912 00:38:12.098423 2932 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 00:38:12.683641 containerd[1615]: time="2025-09-12T00:38:12.682823194Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c12256eb5c449b8dc6fcf684dc25db2d846b2223dc6059ecaa4cd1eb8a55ab43\" id:\"c5bb4d5e2319728a47dd5390542bed6d112659ad9442eb8b7a0a2ecc5eeac707\" pid:5637 exited_at:{seconds:1757637492 nanos:667796528}" Sep 12 00:38:13.147395 kubelet[2932]: I0912 00:38:13.147260 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/226abc66-be39-4292-a3ac-389732e9b9f9-calico-apiserver-certs\") pod \"calico-apiserver-8665c66cb-gptfw\" (UID: \"226abc66-be39-4292-a3ac-389732e9b9f9\") " pod="calico-apiserver/calico-apiserver-8665c66cb-gptfw" Sep 12 00:38:13.147395 kubelet[2932]: I0912 00:38:13.147323 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbjz4\" (UniqueName: \"kubernetes.io/projected/226abc66-be39-4292-a3ac-389732e9b9f9-kube-api-access-cbjz4\") pod \"calico-apiserver-8665c66cb-gptfw\" (UID: \"226abc66-be39-4292-a3ac-389732e9b9f9\") " 
pod="calico-apiserver/calico-apiserver-8665c66cb-gptfw" Sep 12 00:38:13.149184 systemd[1]: Created slice kubepods-besteffort-pod226abc66_be39_4292_a3ac_389732e9b9f9.slice - libcontainer container kubepods-besteffort-pod226abc66_be39_4292_a3ac_389732e9b9f9.slice. Sep 12 00:38:13.173810 containerd[1615]: time="2025-09-12T00:38:13.173785449Z" level=info msg="StopContainer for \"77206b2264cf0e87f85536db5b7068f07d62b05ed823d29e40f4b84498c82052\" with timeout 30 (s)" Sep 12 00:38:13.222412 containerd[1615]: time="2025-09-12T00:38:13.222374872Z" level=info msg="Stop container \"77206b2264cf0e87f85536db5b7068f07d62b05ed823d29e40f4b84498c82052\" with signal terminated" Sep 12 00:38:13.336817 systemd[1]: cri-containerd-77206b2264cf0e87f85536db5b7068f07d62b05ed823d29e40f4b84498c82052.scope: Deactivated successfully. Sep 12 00:38:13.388059 containerd[1615]: time="2025-09-12T00:38:13.388031630Z" level=info msg="TaskExit event in podsandbox handler container_id:\"77206b2264cf0e87f85536db5b7068f07d62b05ed823d29e40f4b84498c82052\" id:\"77206b2264cf0e87f85536db5b7068f07d62b05ed823d29e40f4b84498c82052\" pid:5414 exit_status:1 exited_at:{seconds:1757637493 nanos:387106847}" Sep 12 00:38:13.436970 containerd[1615]: time="2025-09-12T00:38:13.436303576Z" level=info msg="received exit event container_id:\"77206b2264cf0e87f85536db5b7068f07d62b05ed823d29e40f4b84498c82052\" id:\"77206b2264cf0e87f85536db5b7068f07d62b05ed823d29e40f4b84498c82052\" pid:5414 exit_status:1 exited_at:{seconds:1757637493 nanos:387106847}" Sep 12 00:38:13.477092 containerd[1615]: time="2025-09-12T00:38:13.477063165Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8665c66cb-gptfw,Uid:226abc66-be39-4292-a3ac-389732e9b9f9,Namespace:calico-apiserver,Attempt:0,}" Sep 12 00:38:13.577367 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-77206b2264cf0e87f85536db5b7068f07d62b05ed823d29e40f4b84498c82052-rootfs.mount: Deactivated successfully. 
Sep 12 00:38:13.714237 containerd[1615]: time="2025-09-12T00:38:13.713980638Z" level=info msg="StopContainer for \"77206b2264cf0e87f85536db5b7068f07d62b05ed823d29e40f4b84498c82052\" returns successfully" Sep 12 00:38:13.719333 containerd[1615]: time="2025-09-12T00:38:13.719312430Z" level=info msg="StopPodSandbox for \"64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f\"" Sep 12 00:38:13.720125 containerd[1615]: time="2025-09-12T00:38:13.720112438Z" level=info msg="Container to stop \"77206b2264cf0e87f85536db5b7068f07d62b05ed823d29e40f4b84498c82052\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Sep 12 00:38:13.762945 systemd[1]: cri-containerd-64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f.scope: Deactivated successfully. Sep 12 00:38:13.768276 containerd[1615]: time="2025-09-12T00:38:13.768149562Z" level=info msg="TaskExit event in podsandbox handler container_id:\"64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f\" id:\"64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f\" pid:4748 exit_status:137 exited_at:{seconds:1757637493 nanos:766745222}" Sep 12 00:38:13.804138 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f-rootfs.mount: Deactivated successfully. 
Sep 12 00:38:13.824709 containerd[1615]: time="2025-09-12T00:38:13.824637064Z" level=info msg="shim disconnected" id=64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f namespace=k8s.io Sep 12 00:38:13.824709 containerd[1615]: time="2025-09-12T00:38:13.824660722Z" level=warning msg="cleaning up after shim disconnected" id=64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f namespace=k8s.io Sep 12 00:38:13.835125 containerd[1615]: time="2025-09-12T00:38:13.824665229Z" level=info msg="cleaning up dead shim" namespace=k8s.io Sep 12 00:38:13.975867 containerd[1615]: time="2025-09-12T00:38:13.975808269Z" level=info msg="received exit event sandbox_id:\"64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f\" exit_status:137 exited_at:{seconds:1757637493 nanos:766745222}" Sep 12 00:38:13.976778 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f-shm.mount: Deactivated successfully. Sep 12 00:38:14.315592 systemd-networkd[1536]: calid83a1f8a1af: Link DOWN Sep 12 00:38:14.315597 systemd-networkd[1536]: calid83a1f8a1af: Lost carrier Sep 12 00:38:14.857906 systemd-networkd[1536]: caliba3dd267444: Link UP Sep 12 00:38:14.858663 systemd-networkd[1536]: caliba3dd267444: Gained carrier Sep 12 00:38:14.902369 containerd[1615]: 2025-09-12 00:38:14.283 [INFO][5727] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--8665c66cb--gptfw-eth0 calico-apiserver-8665c66cb- calico-apiserver 226abc66-be39-4292-a3ac-389732e9b9f9 1179 0 2025-09-12 00:38:12 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:8665c66cb projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-8665c66cb-gptfw eth0 calico-apiserver [] [] 
[kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] caliba3dd267444 [] [] }} ContainerID="b9c15492d4b3a525c9fb00b9326d514e8a7de64f8dbffcb10ba02818dbbb7b33" Namespace="calico-apiserver" Pod="calico-apiserver-8665c66cb-gptfw" WorkloadEndpoint="localhost-k8s-calico--apiserver--8665c66cb--gptfw-" Sep 12 00:38:14.902369 containerd[1615]: 2025-09-12 00:38:14.291 [INFO][5727] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b9c15492d4b3a525c9fb00b9326d514e8a7de64f8dbffcb10ba02818dbbb7b33" Namespace="calico-apiserver" Pod="calico-apiserver-8665c66cb-gptfw" WorkloadEndpoint="localhost-k8s-calico--apiserver--8665c66cb--gptfw-eth0" Sep 12 00:38:14.902369 containerd[1615]: 2025-09-12 00:38:14.698 [INFO][5792] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b9c15492d4b3a525c9fb00b9326d514e8a7de64f8dbffcb10ba02818dbbb7b33" HandleID="k8s-pod-network.b9c15492d4b3a525c9fb00b9326d514e8a7de64f8dbffcb10ba02818dbbb7b33" Workload="localhost-k8s-calico--apiserver--8665c66cb--gptfw-eth0" Sep 12 00:38:14.902369 containerd[1615]: 2025-09-12 00:38:14.703 [INFO][5792] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b9c15492d4b3a525c9fb00b9326d514e8a7de64f8dbffcb10ba02818dbbb7b33" HandleID="k8s-pod-network.b9c15492d4b3a525c9fb00b9326d514e8a7de64f8dbffcb10ba02818dbbb7b33" Workload="localhost-k8s-calico--apiserver--8665c66cb--gptfw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004fe40), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-8665c66cb-gptfw", "timestamp":"2025-09-12 00:38:14.698179004 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 00:38:14.902369 containerd[1615]: 2025-09-12 00:38:14.703 [INFO][5792] ipam/ipam_plugin.go 353: About to acquire 
host-wide IPAM lock. Sep 12 00:38:14.902369 containerd[1615]: 2025-09-12 00:38:14.704 [INFO][5792] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 00:38:14.902369 containerd[1615]: 2025-09-12 00:38:14.705 [INFO][5792] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 00:38:14.902369 containerd[1615]: 2025-09-12 00:38:14.734 [INFO][5792] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b9c15492d4b3a525c9fb00b9326d514e8a7de64f8dbffcb10ba02818dbbb7b33" host="localhost" Sep 12 00:38:14.902369 containerd[1615]: 2025-09-12 00:38:14.767 [INFO][5792] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 00:38:14.902369 containerd[1615]: 2025-09-12 00:38:14.773 [INFO][5792] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 00:38:14.902369 containerd[1615]: 2025-09-12 00:38:14.774 [INFO][5792] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 00:38:14.902369 containerd[1615]: 2025-09-12 00:38:14.777 [INFO][5792] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 00:38:14.902369 containerd[1615]: 2025-09-12 00:38:14.777 [INFO][5792] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.b9c15492d4b3a525c9fb00b9326d514e8a7de64f8dbffcb10ba02818dbbb7b33" host="localhost" Sep 12 00:38:14.902369 containerd[1615]: 2025-09-12 00:38:14.779 [INFO][5792] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b9c15492d4b3a525c9fb00b9326d514e8a7de64f8dbffcb10ba02818dbbb7b33 Sep 12 00:38:14.902369 containerd[1615]: 2025-09-12 00:38:14.786 [INFO][5792] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.b9c15492d4b3a525c9fb00b9326d514e8a7de64f8dbffcb10ba02818dbbb7b33" host="localhost" Sep 12 00:38:14.902369 containerd[1615]: 2025-09-12 00:38:14.812 
[INFO][5792] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.138/26] block=192.168.88.128/26 handle="k8s-pod-network.b9c15492d4b3a525c9fb00b9326d514e8a7de64f8dbffcb10ba02818dbbb7b33" host="localhost" Sep 12 00:38:14.902369 containerd[1615]: 2025-09-12 00:38:14.812 [INFO][5792] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.138/26] handle="k8s-pod-network.b9c15492d4b3a525c9fb00b9326d514e8a7de64f8dbffcb10ba02818dbbb7b33" host="localhost" Sep 12 00:38:14.902369 containerd[1615]: 2025-09-12 00:38:14.812 [INFO][5792] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 00:38:14.902369 containerd[1615]: 2025-09-12 00:38:14.812 [INFO][5792] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.138/26] IPv6=[] ContainerID="b9c15492d4b3a525c9fb00b9326d514e8a7de64f8dbffcb10ba02818dbbb7b33" HandleID="k8s-pod-network.b9c15492d4b3a525c9fb00b9326d514e8a7de64f8dbffcb10ba02818dbbb7b33" Workload="localhost-k8s-calico--apiserver--8665c66cb--gptfw-eth0" Sep 12 00:38:14.913699 containerd[1615]: 2025-09-12 00:38:14.814 [INFO][5727] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b9c15492d4b3a525c9fb00b9326d514e8a7de64f8dbffcb10ba02818dbbb7b33" Namespace="calico-apiserver" Pod="calico-apiserver-8665c66cb-gptfw" WorkloadEndpoint="localhost-k8s-calico--apiserver--8665c66cb--gptfw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--8665c66cb--gptfw-eth0", GenerateName:"calico-apiserver-8665c66cb-", Namespace:"calico-apiserver", SelfLink:"", UID:"226abc66-be39-4292-a3ac-389732e9b9f9", ResourceVersion:"1179", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 38, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", 
"pod-template-hash":"8665c66cb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-8665c66cb-gptfw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.138/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliba3dd267444", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:38:14.913699 containerd[1615]: 2025-09-12 00:38:14.814 [INFO][5727] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.138/32] ContainerID="b9c15492d4b3a525c9fb00b9326d514e8a7de64f8dbffcb10ba02818dbbb7b33" Namespace="calico-apiserver" Pod="calico-apiserver-8665c66cb-gptfw" WorkloadEndpoint="localhost-k8s-calico--apiserver--8665c66cb--gptfw-eth0" Sep 12 00:38:14.913699 containerd[1615]: 2025-09-12 00:38:14.814 [INFO][5727] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliba3dd267444 ContainerID="b9c15492d4b3a525c9fb00b9326d514e8a7de64f8dbffcb10ba02818dbbb7b33" Namespace="calico-apiserver" Pod="calico-apiserver-8665c66cb-gptfw" WorkloadEndpoint="localhost-k8s-calico--apiserver--8665c66cb--gptfw-eth0" Sep 12 00:38:14.913699 containerd[1615]: 2025-09-12 00:38:14.881 [INFO][5727] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b9c15492d4b3a525c9fb00b9326d514e8a7de64f8dbffcb10ba02818dbbb7b33" Namespace="calico-apiserver" Pod="calico-apiserver-8665c66cb-gptfw" WorkloadEndpoint="localhost-k8s-calico--apiserver--8665c66cb--gptfw-eth0" Sep 12 00:38:14.913699 
containerd[1615]: 2025-09-12 00:38:14.882 [INFO][5727] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b9c15492d4b3a525c9fb00b9326d514e8a7de64f8dbffcb10ba02818dbbb7b33" Namespace="calico-apiserver" Pod="calico-apiserver-8665c66cb-gptfw" WorkloadEndpoint="localhost-k8s-calico--apiserver--8665c66cb--gptfw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--8665c66cb--gptfw-eth0", GenerateName:"calico-apiserver-8665c66cb-", Namespace:"calico-apiserver", SelfLink:"", UID:"226abc66-be39-4292-a3ac-389732e9b9f9", ResourceVersion:"1179", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 38, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8665c66cb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"b9c15492d4b3a525c9fb00b9326d514e8a7de64f8dbffcb10ba02818dbbb7b33", Pod:"calico-apiserver-8665c66cb-gptfw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.138/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliba3dd267444", MAC:"4e:22:45:6c:e3:a4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:38:14.913699 containerd[1615]: 2025-09-12 00:38:14.896 
[INFO][5727] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b9c15492d4b3a525c9fb00b9326d514e8a7de64f8dbffcb10ba02818dbbb7b33" Namespace="calico-apiserver" Pod="calico-apiserver-8665c66cb-gptfw" WorkloadEndpoint="localhost-k8s-calico--apiserver--8665c66cb--gptfw-eth0" Sep 12 00:38:14.946348 containerd[1615]: 2025-09-12 00:38:14.293 [INFO][5782] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" Sep 12 00:38:14.946348 containerd[1615]: 2025-09-12 00:38:14.293 [INFO][5782] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" iface="eth0" netns="/var/run/netns/cni-9a46cf56-8bb8-dcd2-161c-34725374f08b" Sep 12 00:38:14.946348 containerd[1615]: 2025-09-12 00:38:14.295 [INFO][5782] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" iface="eth0" netns="/var/run/netns/cni-9a46cf56-8bb8-dcd2-161c-34725374f08b" Sep 12 00:38:14.946348 containerd[1615]: 2025-09-12 00:38:14.305 [INFO][5782] cni-plugin/dataplane_linux.go 604: Deleted device in netns. 
ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" after=11.599807ms iface="eth0" netns="/var/run/netns/cni-9a46cf56-8bb8-dcd2-161c-34725374f08b" Sep 12 00:38:14.946348 containerd[1615]: 2025-09-12 00:38:14.305 [INFO][5782] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" Sep 12 00:38:14.946348 containerd[1615]: 2025-09-12 00:38:14.305 [INFO][5782] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" Sep 12 00:38:14.946348 containerd[1615]: 2025-09-12 00:38:14.698 [INFO][5794] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" HandleID="k8s-pod-network.64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" Workload="localhost-k8s-calico--apiserver--797c9d4dc7--nwcb6-eth0" Sep 12 00:38:14.946348 containerd[1615]: 2025-09-12 00:38:14.703 [INFO][5794] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 00:38:14.946348 containerd[1615]: 2025-09-12 00:38:14.812 [INFO][5794] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 00:38:14.946348 containerd[1615]: 2025-09-12 00:38:14.922 [INFO][5794] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" HandleID="k8s-pod-network.64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" Workload="localhost-k8s-calico--apiserver--797c9d4dc7--nwcb6-eth0" Sep 12 00:38:14.946348 containerd[1615]: 2025-09-12 00:38:14.922 [INFO][5794] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" HandleID="k8s-pod-network.64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" Workload="localhost-k8s-calico--apiserver--797c9d4dc7--nwcb6-eth0" Sep 12 00:38:14.946348 containerd[1615]: 2025-09-12 00:38:14.925 [INFO][5794] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 00:38:14.946348 containerd[1615]: 2025-09-12 00:38:14.932 [INFO][5782] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" Sep 12 00:38:14.951995 containerd[1615]: time="2025-09-12T00:38:14.947567085Z" level=info msg="TearDown network for sandbox \"64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f\" successfully" Sep 12 00:38:14.951995 containerd[1615]: time="2025-09-12T00:38:14.947591726Z" level=info msg="StopPodSandbox for \"64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f\" returns successfully" Sep 12 00:38:14.947001 systemd[1]: run-netns-cni\x2d9a46cf56\x2d8bb8\x2ddcd2\x2d161c\x2d34725374f08b.mount: Deactivated successfully. 
Sep 12 00:38:15.617231 containerd[1615]: time="2025-09-12T00:38:15.617201778Z" level=info msg="connecting to shim b9c15492d4b3a525c9fb00b9326d514e8a7de64f8dbffcb10ba02818dbbb7b33" address="unix:///run/containerd/s/41de45d08fa1e8aa803e4d1d8e724cc563960397ac3d5adff0fce300c5b46bac" namespace=k8s.io protocol=ttrpc version=3 Sep 12 00:38:15.656267 containerd[1615]: time="2025-09-12T00:38:15.656225447Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f80a3a15970097a848a736b98be810bb36140785b4c23bdf867b1df30fa99b6f\" id:\"f15e36585f83891217511203d7783190b865b5a5e7fdc10031b4f2eef9f4bc29\" pid:5859 exited_at:{seconds:1757637495 nanos:655586521}" Sep 12 00:38:15.662370 containerd[1615]: time="2025-09-12T00:38:15.662344949Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f80a3a15970097a848a736b98be810bb36140785b4c23bdf867b1df30fa99b6f\" id:\"a54f0ac216deea0dee1cd2e0cb57721a3d03e19af894da27681a8fc7d439599c\" pid:5848 exited_at:{seconds:1757637495 nanos:660595124}" Sep 12 00:38:15.686764 kubelet[2932]: I0912 00:38:15.647206 2932 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" Sep 12 00:38:15.688396 systemd[1]: Started cri-containerd-b9c15492d4b3a525c9fb00b9326d514e8a7de64f8dbffcb10ba02818dbbb7b33.scope - libcontainer container b9c15492d4b3a525c9fb00b9326d514e8a7de64f8dbffcb10ba02818dbbb7b33. 
Sep 12 00:38:15.719952 systemd-resolved[1482]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 00:38:15.936215 containerd[1615]: time="2025-09-12T00:38:15.936070028Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8665c66cb-gptfw,Uid:226abc66-be39-4292-a3ac-389732e9b9f9,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"b9c15492d4b3a525c9fb00b9326d514e8a7de64f8dbffcb10ba02818dbbb7b33\"" Sep 12 00:38:16.053289 kubelet[2932]: I0912 00:38:16.053110 2932 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/dc0e806a-7a27-438c-b07b-5454371c66d4-calico-apiserver-certs\") pod \"dc0e806a-7a27-438c-b07b-5454371c66d4\" (UID: \"dc0e806a-7a27-438c-b07b-5454371c66d4\") " Sep 12 00:38:16.053289 kubelet[2932]: I0912 00:38:16.053172 2932 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czglw\" (UniqueName: \"kubernetes.io/projected/dc0e806a-7a27-438c-b07b-5454371c66d4-kube-api-access-czglw\") pod \"dc0e806a-7a27-438c-b07b-5454371c66d4\" (UID: \"dc0e806a-7a27-438c-b07b-5454371c66d4\") " Sep 12 00:38:16.053504 systemd-networkd[1536]: caliba3dd267444: Gained IPv6LL Sep 12 00:38:16.146753 systemd[1]: var-lib-kubelet-pods-dc0e806a\x2d7a27\x2d438c\x2db07b\x2d5454371c66d4-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. Sep 12 00:38:16.152370 systemd[1]: var-lib-kubelet-pods-dc0e806a\x2d7a27\x2d438c\x2db07b\x2d5454371c66d4-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dczglw.mount: Deactivated successfully. 
Sep 12 00:38:16.186860 kubelet[2932]: I0912 00:38:16.160485 2932 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc0e806a-7a27-438c-b07b-5454371c66d4-kube-api-access-czglw" (OuterVolumeSpecName: "kube-api-access-czglw") pod "dc0e806a-7a27-438c-b07b-5454371c66d4" (UID: "dc0e806a-7a27-438c-b07b-5454371c66d4"). InnerVolumeSpecName "kube-api-access-czglw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 12 00:38:16.188712 kubelet[2932]: I0912 00:38:16.188688 2932 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc0e806a-7a27-438c-b07b-5454371c66d4-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "dc0e806a-7a27-438c-b07b-5454371c66d4" (UID: "dc0e806a-7a27-438c-b07b-5454371c66d4"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 12 00:38:16.247135 kubelet[2932]: I0912 00:38:16.246651 2932 reconciler_common.go:299] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/dc0e806a-7a27-438c-b07b-5454371c66d4-calico-apiserver-certs\") on node \"localhost\" DevicePath \"\"" Sep 12 00:38:16.247135 kubelet[2932]: I0912 00:38:16.246678 2932 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-czglw\" (UniqueName: \"kubernetes.io/projected/dc0e806a-7a27-438c-b07b-5454371c66d4-kube-api-access-czglw\") on node \"localhost\" DevicePath \"\"" Sep 12 00:38:16.315872 containerd[1615]: time="2025-09-12T00:38:16.314566847Z" level=info msg="CreateContainer within sandbox \"b9c15492d4b3a525c9fb00b9326d514e8a7de64f8dbffcb10ba02818dbbb7b33\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 00:38:16.351606 systemd[1]: Removed slice kubepods-besteffort-poddc0e806a_7a27_438c_b07b_5454371c66d4.slice - libcontainer container kubepods-besteffort-poddc0e806a_7a27_438c_b07b_5454371c66d4.slice. 
Sep 12 00:38:16.463010 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4026383779.mount: Deactivated successfully. Sep 12 00:38:16.469507 containerd[1615]: time="2025-09-12T00:38:16.469407460Z" level=info msg="Container 30e23b7d7c14c1cdcdd73ac2b7764f15d8c48a73afc7ff502d5d812470553879: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:38:16.524867 containerd[1615]: time="2025-09-12T00:38:16.524816055Z" level=info msg="CreateContainer within sandbox \"b9c15492d4b3a525c9fb00b9326d514e8a7de64f8dbffcb10ba02818dbbb7b33\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"30e23b7d7c14c1cdcdd73ac2b7764f15d8c48a73afc7ff502d5d812470553879\"" Sep 12 00:38:16.525607 containerd[1615]: time="2025-09-12T00:38:16.525494236Z" level=info msg="StartContainer for \"30e23b7d7c14c1cdcdd73ac2b7764f15d8c48a73afc7ff502d5d812470553879\"" Sep 12 00:38:16.534775 containerd[1615]: time="2025-09-12T00:38:16.534668538Z" level=info msg="connecting to shim 30e23b7d7c14c1cdcdd73ac2b7764f15d8c48a73afc7ff502d5d812470553879" address="unix:///run/containerd/s/41de45d08fa1e8aa803e4d1d8e724cc563960397ac3d5adff0fce300c5b46bac" protocol=ttrpc version=3 Sep 12 00:38:16.573601 systemd[1]: Started cri-containerd-30e23b7d7c14c1cdcdd73ac2b7764f15d8c48a73afc7ff502d5d812470553879.scope - libcontainer container 30e23b7d7c14c1cdcdd73ac2b7764f15d8c48a73afc7ff502d5d812470553879. 
Sep 12 00:38:16.626895 containerd[1615]: time="2025-09-12T00:38:16.626865402Z" level=info msg="StartContainer for \"30e23b7d7c14c1cdcdd73ac2b7764f15d8c48a73afc7ff502d5d812470553879\" returns successfully" Sep 12 00:38:17.172965 kubelet[2932]: I0912 00:38:17.160673 2932 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc0e806a-7a27-438c-b07b-5454371c66d4" path="/var/lib/kubelet/pods/dc0e806a-7a27-438c-b07b-5454371c66d4/volumes" Sep 12 00:38:19.609596 systemd[1]: Started sshd@7-139.178.70.108:22-139.178.68.195:51966.service - OpenSSH per-connection server daemon (139.178.68.195:51966). Sep 12 00:38:19.828329 sshd[5987]: Accepted publickey for core from 139.178.68.195 port 51966 ssh2: RSA SHA256:cc7+gY4pedxGskzWUQnkx6HLMvEKLPoZPuQgwUGPYHk Sep 12 00:38:19.832630 sshd-session[5987]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 00:38:19.844367 systemd-logind[1587]: New session 10 of user core. Sep 12 00:38:19.848743 systemd[1]: Started session-10.scope - Session 10 of User core. 
Sep 12 00:38:20.076142 kubelet[2932]: I0912 00:38:19.891608 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-8665c66cb-gptfw" podStartSLOduration=7.87060357 podStartE2EDuration="7.87060357s" podCreationTimestamp="2025-09-12 00:38:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 00:38:17.365353415 +0000 UTC m=+92.396775635" watchObservedRunningTime="2025-09-12 00:38:19.87060357 +0000 UTC m=+94.902025784" Sep 12 00:38:20.219213 containerd[1615]: time="2025-09-12T00:38:20.219187762Z" level=info msg="StopContainer for \"06d797922abbe8c3c5546a686256c97fbd540b2fc1629a2304e5e5cbd1b6725f\" with timeout 30 (s)" Sep 12 00:38:20.371366 containerd[1615]: time="2025-09-12T00:38:20.371228645Z" level=info msg="Stop container \"06d797922abbe8c3c5546a686256c97fbd540b2fc1629a2304e5e5cbd1b6725f\" with signal terminated" Sep 12 00:38:20.491614 systemd[1]: cri-containerd-06d797922abbe8c3c5546a686256c97fbd540b2fc1629a2304e5e5cbd1b6725f.scope: Deactivated successfully. Sep 12 00:38:20.491846 systemd[1]: cri-containerd-06d797922abbe8c3c5546a686256c97fbd540b2fc1629a2304e5e5cbd1b6725f.scope: Consumed 602ms CPU time, 59.6M memory peak, 2.1M read from disk. 
Sep 12 00:38:20.651646 containerd[1615]: time="2025-09-12T00:38:20.651434398Z" level=info msg="received exit event container_id:\"06d797922abbe8c3c5546a686256c97fbd540b2fc1629a2304e5e5cbd1b6725f\" id:\"06d797922abbe8c3c5546a686256c97fbd540b2fc1629a2304e5e5cbd1b6725f\" pid:5524 exit_status:1 exited_at:{seconds:1757637500 nanos:522808502}" Sep 12 00:38:20.656892 containerd[1615]: time="2025-09-12T00:38:20.656862503Z" level=info msg="TaskExit event in podsandbox handler container_id:\"06d797922abbe8c3c5546a686256c97fbd540b2fc1629a2304e5e5cbd1b6725f\" id:\"06d797922abbe8c3c5546a686256c97fbd540b2fc1629a2304e5e5cbd1b6725f\" pid:5524 exit_status:1 exited_at:{seconds:1757637500 nanos:522808502}" Sep 12 00:38:20.775590 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-06d797922abbe8c3c5546a686256c97fbd540b2fc1629a2304e5e5cbd1b6725f-rootfs.mount: Deactivated successfully. Sep 12 00:38:20.796913 containerd[1615]: time="2025-09-12T00:38:20.796726055Z" level=info msg="StopContainer for \"06d797922abbe8c3c5546a686256c97fbd540b2fc1629a2304e5e5cbd1b6725f\" returns successfully" Sep 12 00:38:20.806356 containerd[1615]: time="2025-09-12T00:38:20.806307114Z" level=info msg="StopPodSandbox for \"78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14\"" Sep 12 00:38:20.828025 containerd[1615]: time="2025-09-12T00:38:20.827988124Z" level=info msg="Container to stop \"06d797922abbe8c3c5546a686256c97fbd540b2fc1629a2304e5e5cbd1b6725f\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Sep 12 00:38:20.851163 sshd[5989]: Connection closed by 139.178.68.195 port 51966 Sep 12 00:38:20.852044 sshd-session[5987]: pam_unix(sshd:session): session closed for user core Sep 12 00:38:20.852702 systemd[1]: cri-containerd-78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14.scope: Deactivated successfully. 
Sep 12 00:38:20.861822 containerd[1615]: time="2025-09-12T00:38:20.861691235Z" level=info msg="TaskExit event in podsandbox handler container_id:\"78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14\" id:\"78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14\" pid:5075 exit_status:137 exited_at:{seconds:1757637500 nanos:861435575}" Sep 12 00:38:20.862976 systemd[1]: sshd@7-139.178.70.108:22-139.178.68.195:51966.service: Deactivated successfully. Sep 12 00:38:20.864137 systemd[1]: session-10.scope: Deactivated successfully. Sep 12 00:38:20.865486 systemd[1]: session-10.scope: Consumed 236ms CPU time, 64.7M memory peak. Sep 12 00:38:20.866382 systemd-logind[1587]: Session 10 logged out. Waiting for processes to exit. Sep 12 00:38:20.868797 systemd-logind[1587]: Removed session 10. Sep 12 00:38:20.894661 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14-rootfs.mount: Deactivated successfully. 
Sep 12 00:38:20.901371 containerd[1615]: time="2025-09-12T00:38:20.901280661Z" level=info msg="shim disconnected" id=78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14 namespace=k8s.io Sep 12 00:38:20.901371 containerd[1615]: time="2025-09-12T00:38:20.901314239Z" level=warning msg="cleaning up after shim disconnected" id=78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14 namespace=k8s.io Sep 12 00:38:20.911121 containerd[1615]: time="2025-09-12T00:38:20.901319720Z" level=info msg="cleaning up dead shim" namespace=k8s.io Sep 12 00:38:20.980845 containerd[1615]: time="2025-09-12T00:38:20.980215849Z" level=info msg="received exit event sandbox_id:\"78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14\" exit_status:137 exited_at:{seconds:1757637500 nanos:861435575}" Sep 12 00:38:20.991206 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14-shm.mount: Deactivated successfully. Sep 12 00:38:21.532753 systemd-networkd[1536]: calief4eafbfdc1: Link DOWN Sep 12 00:38:21.532851 systemd-networkd[1536]: calief4eafbfdc1: Lost carrier Sep 12 00:38:22.142609 kubelet[2932]: I0912 00:38:22.131728 2932 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" Sep 12 00:38:22.147277 containerd[1615]: 2025-09-12 00:38:21.488 [INFO][6076] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" Sep 12 00:38:22.147277 containerd[1615]: 2025-09-12 00:38:21.496 [INFO][6076] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" iface="eth0" netns="/var/run/netns/cni-f63b2262-b6a1-6748-0b61-a4795080a3f0" Sep 12 00:38:22.147277 containerd[1615]: 2025-09-12 00:38:21.496 [INFO][6076] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" iface="eth0" netns="/var/run/netns/cni-f63b2262-b6a1-6748-0b61-a4795080a3f0" Sep 12 00:38:22.147277 containerd[1615]: 2025-09-12 00:38:21.523 [INFO][6076] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" after=27.366346ms iface="eth0" netns="/var/run/netns/cni-f63b2262-b6a1-6748-0b61-a4795080a3f0" Sep 12 00:38:22.147277 containerd[1615]: 2025-09-12 00:38:21.523 [INFO][6076] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" Sep 12 00:38:22.147277 containerd[1615]: 2025-09-12 00:38:21.523 [INFO][6076] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" Sep 12 00:38:22.147277 containerd[1615]: 2025-09-12 00:38:21.966 [INFO][6083] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" HandleID="k8s-pod-network.78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" Workload="localhost-k8s-calico--apiserver--797c9d4dc7--9jk4g-eth0" Sep 12 00:38:22.147277 containerd[1615]: 2025-09-12 00:38:21.972 [INFO][6083] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 00:38:22.147277 containerd[1615]: 2025-09-12 00:38:21.974 [INFO][6083] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 00:38:22.147277 containerd[1615]: 2025-09-12 00:38:22.140 [INFO][6083] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" HandleID="k8s-pod-network.78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" Workload="localhost-k8s-calico--apiserver--797c9d4dc7--9jk4g-eth0" Sep 12 00:38:22.147277 containerd[1615]: 2025-09-12 00:38:22.140 [INFO][6083] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" HandleID="k8s-pod-network.78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" Workload="localhost-k8s-calico--apiserver--797c9d4dc7--9jk4g-eth0" Sep 12 00:38:22.147277 containerd[1615]: 2025-09-12 00:38:22.143 [INFO][6083] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 00:38:22.147277 containerd[1615]: 2025-09-12 00:38:22.145 [INFO][6076] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" Sep 12 00:38:22.155653 systemd[1]: run-netns-cni\x2df63b2262\x2db6a1\x2d6748\x2d0b61\x2da4795080a3f0.mount: Deactivated successfully. 
Sep 12 00:38:22.157947 containerd[1615]: time="2025-09-12T00:38:22.149724478Z" level=info msg="TearDown network for sandbox \"78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14\" successfully" Sep 12 00:38:22.158007 containerd[1615]: time="2025-09-12T00:38:22.157947818Z" level=info msg="StopPodSandbox for \"78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14\" returns successfully" Sep 12 00:38:22.425317 kubelet[2932]: I0912 00:38:22.424840 2932 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdx29\" (UniqueName: \"kubernetes.io/projected/5cff39d8-e5be-4f52-9e94-587e9399478a-kube-api-access-zdx29\") pod \"5cff39d8-e5be-4f52-9e94-587e9399478a\" (UID: \"5cff39d8-e5be-4f52-9e94-587e9399478a\") " Sep 12 00:38:22.425317 kubelet[2932]: I0912 00:38:22.424912 2932 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/5cff39d8-e5be-4f52-9e94-587e9399478a-calico-apiserver-certs\") pod \"5cff39d8-e5be-4f52-9e94-587e9399478a\" (UID: \"5cff39d8-e5be-4f52-9e94-587e9399478a\") " Sep 12 00:38:22.652323 systemd[1]: var-lib-kubelet-pods-5cff39d8\x2de5be\x2d4f52\x2d9e94\x2d587e9399478a-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dzdx29.mount: Deactivated successfully. Sep 12 00:38:22.654337 systemd[1]: var-lib-kubelet-pods-5cff39d8\x2de5be\x2d4f52\x2d9e94\x2d587e9399478a-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. Sep 12 00:38:22.762644 kubelet[2932]: I0912 00:38:22.713206 2932 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cff39d8-e5be-4f52-9e94-587e9399478a-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "5cff39d8-e5be-4f52-9e94-587e9399478a" (UID: "5cff39d8-e5be-4f52-9e94-587e9399478a"). InnerVolumeSpecName "calico-apiserver-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 12 00:38:22.762644 kubelet[2932]: I0912 00:38:22.759507 2932 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cff39d8-e5be-4f52-9e94-587e9399478a-kube-api-access-zdx29" (OuterVolumeSpecName: "kube-api-access-zdx29") pod "5cff39d8-e5be-4f52-9e94-587e9399478a" (UID: "5cff39d8-e5be-4f52-9e94-587e9399478a"). InnerVolumeSpecName "kube-api-access-zdx29". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 12 00:38:22.864170 kubelet[2932]: I0912 00:38:22.864123 2932 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zdx29\" (UniqueName: \"kubernetes.io/projected/5cff39d8-e5be-4f52-9e94-587e9399478a-kube-api-access-zdx29\") on node \"localhost\" DevicePath \"\"" Sep 12 00:38:22.864170 kubelet[2932]: I0912 00:38:22.864154 2932 reconciler_common.go:299] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/5cff39d8-e5be-4f52-9e94-587e9399478a-calico-apiserver-certs\") on node \"localhost\" DevicePath \"\"" Sep 12 00:38:23.071855 systemd[1]: Removed slice kubepods-besteffort-pod5cff39d8_e5be_4f52_9e94_587e9399478a.slice - libcontainer container kubepods-besteffort-pod5cff39d8_e5be_4f52_9e94_587e9399478a.slice. Sep 12 00:38:23.071925 systemd[1]: kubepods-besteffort-pod5cff39d8_e5be_4f52_9e94_587e9399478a.slice: Consumed 628ms CPU time, 60.2M memory peak, 2.1M read from disk. Sep 12 00:38:25.545262 kubelet[2932]: I0912 00:38:25.425559 2932 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cff39d8-e5be-4f52-9e94-587e9399478a" path="/var/lib/kubelet/pods/5cff39d8-e5be-4f52-9e94-587e9399478a/volumes" Sep 12 00:38:25.944551 systemd[1]: Started sshd@8-139.178.70.108:22-139.178.68.195:59656.service - OpenSSH per-connection server daemon (139.178.68.195:59656). 
Sep 12 00:38:26.277155 sshd[6104]: Accepted publickey for core from 139.178.68.195 port 59656 ssh2: RSA SHA256:cc7+gY4pedxGskzWUQnkx6HLMvEKLPoZPuQgwUGPYHk Sep 12 00:38:26.282697 sshd-session[6104]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 00:38:26.290523 systemd-logind[1587]: New session 11 of user core. Sep 12 00:38:26.296400 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 12 00:38:27.864487 containerd[1615]: time="2025-09-12T00:38:27.864454990Z" level=info msg="TaskExit event in podsandbox handler container_id:\"baeba706ef8610b1da4c71a68aaaa9e8bb19702c4daf7338523b9f0bded9d6ee\" id:\"0b26bc18890f8a5061fd48c4ab6c9042ddc88a8c918363f534ef73da7ecb6879\" pid:6125 exited_at:{seconds:1757637507 nanos:863460200}" Sep 12 00:38:28.608131 sshd[6106]: Connection closed by 139.178.68.195 port 59656 Sep 12 00:38:28.607762 sshd-session[6104]: pam_unix(sshd:session): session closed for user core Sep 12 00:38:28.624016 systemd[1]: sshd@8-139.178.70.108:22-139.178.68.195:59656.service: Deactivated successfully. Sep 12 00:38:28.627296 systemd[1]: session-11.scope: Deactivated successfully. Sep 12 00:38:28.631722 systemd-logind[1587]: Session 11 logged out. Waiting for processes to exit. Sep 12 00:38:28.633959 systemd-logind[1587]: Removed session 11. Sep 12 00:38:33.632646 systemd[1]: Started sshd@9-139.178.70.108:22-139.178.68.195:34562.service - OpenSSH per-connection server daemon (139.178.68.195:34562). Sep 12 00:38:34.087546 sshd[6148]: Accepted publickey for core from 139.178.68.195 port 34562 ssh2: RSA SHA256:cc7+gY4pedxGskzWUQnkx6HLMvEKLPoZPuQgwUGPYHk Sep 12 00:38:34.121169 sshd-session[6148]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 00:38:34.131274 systemd-logind[1587]: New session 12 of user core. Sep 12 00:38:34.137337 systemd[1]: Started session-12.scope - Session 12 of User core. 
Sep 12 00:38:34.897270 sshd[6150]: Connection closed by 139.178.68.195 port 34562 Sep 12 00:38:34.898323 sshd-session[6148]: pam_unix(sshd:session): session closed for user core Sep 12 00:38:34.906725 systemd[1]: sshd@9-139.178.70.108:22-139.178.68.195:34562.service: Deactivated successfully. Sep 12 00:38:34.907959 systemd[1]: session-12.scope: Deactivated successfully. Sep 12 00:38:34.912550 systemd-logind[1587]: Session 12 logged out. Waiting for processes to exit. Sep 12 00:38:34.913600 systemd-logind[1587]: Removed session 12. Sep 12 00:38:34.915834 systemd[1]: Started sshd@10-139.178.70.108:22-139.178.68.195:34566.service - OpenSSH per-connection server daemon (139.178.68.195:34566). Sep 12 00:38:34.977373 sshd[6163]: Accepted publickey for core from 139.178.68.195 port 34566 ssh2: RSA SHA256:cc7+gY4pedxGskzWUQnkx6HLMvEKLPoZPuQgwUGPYHk Sep 12 00:38:34.978265 sshd-session[6163]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 00:38:34.982126 systemd-logind[1587]: New session 13 of user core. Sep 12 00:38:34.987547 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 12 00:38:35.377899 sshd[6165]: Connection closed by 139.178.68.195 port 34566 Sep 12 00:38:35.378399 sshd-session[6163]: pam_unix(sshd:session): session closed for user core Sep 12 00:38:35.386098 systemd[1]: sshd@10-139.178.70.108:22-139.178.68.195:34566.service: Deactivated successfully. Sep 12 00:38:35.388226 systemd[1]: session-13.scope: Deactivated successfully. Sep 12 00:38:35.388974 systemd-logind[1587]: Session 13 logged out. Waiting for processes to exit. Sep 12 00:38:35.391912 systemd[1]: Started sshd@11-139.178.70.108:22-139.178.68.195:34580.service - OpenSSH per-connection server daemon (139.178.68.195:34580). Sep 12 00:38:35.392863 systemd-logind[1587]: Removed session 13. 
Sep 12 00:38:35.497732 sshd[6174]: Accepted publickey for core from 139.178.68.195 port 34580 ssh2: RSA SHA256:cc7+gY4pedxGskzWUQnkx6HLMvEKLPoZPuQgwUGPYHk Sep 12 00:38:35.498656 sshd-session[6174]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 00:38:35.501447 systemd-logind[1587]: New session 14 of user core. Sep 12 00:38:35.506569 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 12 00:38:35.654200 sshd[6176]: Connection closed by 139.178.68.195 port 34580 Sep 12 00:38:35.654603 sshd-session[6174]: pam_unix(sshd:session): session closed for user core Sep 12 00:38:35.658009 systemd[1]: sshd@11-139.178.70.108:22-139.178.68.195:34580.service: Deactivated successfully. Sep 12 00:38:35.659552 systemd[1]: session-14.scope: Deactivated successfully. Sep 12 00:38:35.660183 systemd-logind[1587]: Session 14 logged out. Waiting for processes to exit. Sep 12 00:38:35.661628 systemd-logind[1587]: Removed session 14. Sep 12 00:38:40.666867 systemd[1]: Started sshd@12-139.178.70.108:22-139.178.68.195:53612.service - OpenSSH per-connection server daemon (139.178.68.195:53612). Sep 12 00:38:41.231297 sshd[6214]: Accepted publickey for core from 139.178.68.195 port 53612 ssh2: RSA SHA256:cc7+gY4pedxGskzWUQnkx6HLMvEKLPoZPuQgwUGPYHk Sep 12 00:38:41.234367 sshd-session[6214]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 00:38:41.239225 systemd-logind[1587]: New session 15 of user core. Sep 12 00:38:41.244372 systemd[1]: Started session-15.scope - Session 15 of User core. 
Sep 12 00:38:41.545503 containerd[1615]: time="2025-09-12T00:38:41.545465917Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c12256eb5c449b8dc6fcf684dc25db2d846b2223dc6059ecaa4cd1eb8a55ab43\" id:\"0fd7c70f0fcd0a6678b56872831bf15fad6fe980f29c2babe456b1e8eb2b5cd6\" pid:6204 exited_at:{seconds:1757637521 nanos:491429793}" Sep 12 00:38:41.557157 containerd[1615]: time="2025-09-12T00:38:41.545514757Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c12256eb5c449b8dc6fcf684dc25db2d846b2223dc6059ecaa4cd1eb8a55ab43\" id:\"22589adca629b3b1f6fea3f413105b73bb91a336bc3d9c19deb4eca77dd03160\" pid:6227 exited_at:{seconds:1757637521 nanos:498696212}" Sep 12 00:38:41.857816 sshd[6238]: Connection closed by 139.178.68.195 port 53612 Sep 12 00:38:41.858496 sshd-session[6214]: pam_unix(sshd:session): session closed for user core Sep 12 00:38:41.862665 systemd[1]: sshd@12-139.178.70.108:22-139.178.68.195:53612.service: Deactivated successfully. Sep 12 00:38:41.864622 systemd[1]: session-15.scope: Deactivated successfully. Sep 12 00:38:41.865512 systemd-logind[1587]: Session 15 logged out. Waiting for processes to exit. Sep 12 00:38:41.867087 systemd-logind[1587]: Removed session 15. 
Sep 12 00:38:45.052201 containerd[1615]: time="2025-09-12T00:38:45.051625233Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f80a3a15970097a848a736b98be810bb36140785b4c23bdf867b1df30fa99b6f\" id:\"3701c2350a54ce3cb2484811f4de728f29db85a978ffd409139236593ed2bba4\" pid:6267 exited_at:{seconds:1757637525 nanos:51267974}" Sep 12 00:38:45.474602 kubelet[2932]: I0912 00:38:45.474497 2932 scope.go:117] "RemoveContainer" containerID="06d797922abbe8c3c5546a686256c97fbd540b2fc1629a2304e5e5cbd1b6725f" Sep 12 00:38:45.674911 containerd[1615]: time="2025-09-12T00:38:45.674868793Z" level=info msg="RemoveContainer for \"06d797922abbe8c3c5546a686256c97fbd540b2fc1629a2304e5e5cbd1b6725f\"" Sep 12 00:38:45.819180 containerd[1615]: time="2025-09-12T00:38:45.819148628Z" level=info msg="RemoveContainer for \"06d797922abbe8c3c5546a686256c97fbd540b2fc1629a2304e5e5cbd1b6725f\" returns successfully" Sep 12 00:38:45.846716 kubelet[2932]: I0912 00:38:45.846691 2932 scope.go:117] "RemoveContainer" containerID="77206b2264cf0e87f85536db5b7068f07d62b05ed823d29e40f4b84498c82052" Sep 12 00:38:45.871058 containerd[1615]: time="2025-09-12T00:38:45.871021270Z" level=info msg="RemoveContainer for \"77206b2264cf0e87f85536db5b7068f07d62b05ed823d29e40f4b84498c82052\"" Sep 12 00:38:45.880054 containerd[1615]: time="2025-09-12T00:38:45.880029028Z" level=info msg="RemoveContainer for \"77206b2264cf0e87f85536db5b7068f07d62b05ed823d29e40f4b84498c82052\" returns successfully" Sep 12 00:38:45.884162 containerd[1615]: time="2025-09-12T00:38:45.884140852Z" level=info msg="StopPodSandbox for \"64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f\"" Sep 12 00:38:46.925017 systemd[1]: Started sshd@13-139.178.70.108:22-139.178.68.195:53620.service - OpenSSH per-connection server daemon (139.178.68.195:53620). 
Sep 12 00:38:47.223983 sshd[6301]: Accepted publickey for core from 139.178.68.195 port 53620 ssh2: RSA SHA256:cc7+gY4pedxGskzWUQnkx6HLMvEKLPoZPuQgwUGPYHk Sep 12 00:38:47.234071 sshd-session[6301]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 00:38:47.247934 systemd-logind[1587]: New session 16 of user core. Sep 12 00:38:47.252429 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 12 00:38:47.311292 containerd[1615]: 2025-09-12 00:38:46.651 [WARNING][6287] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" WorkloadEndpoint="localhost-k8s-calico--apiserver--797c9d4dc7--nwcb6-eth0" Sep 12 00:38:47.311292 containerd[1615]: 2025-09-12 00:38:46.654 [INFO][6287] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" Sep 12 00:38:47.311292 containerd[1615]: 2025-09-12 00:38:46.654 [INFO][6287] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" iface="eth0" netns="" Sep 12 00:38:47.311292 containerd[1615]: 2025-09-12 00:38:46.654 [INFO][6287] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" Sep 12 00:38:47.311292 containerd[1615]: 2025-09-12 00:38:46.654 [INFO][6287] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" Sep 12 00:38:47.311292 containerd[1615]: 2025-09-12 00:38:47.255 [INFO][6294] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" HandleID="k8s-pod-network.64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" Workload="localhost-k8s-calico--apiserver--797c9d4dc7--nwcb6-eth0" Sep 12 00:38:47.311292 containerd[1615]: 2025-09-12 00:38:47.262 [INFO][6294] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 00:38:47.311292 containerd[1615]: 2025-09-12 00:38:47.262 [INFO][6294] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 00:38:47.311292 containerd[1615]: 2025-09-12 00:38:47.279 [WARNING][6294] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" HandleID="k8s-pod-network.64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" Workload="localhost-k8s-calico--apiserver--797c9d4dc7--nwcb6-eth0" Sep 12 00:38:47.311292 containerd[1615]: 2025-09-12 00:38:47.279 [INFO][6294] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" HandleID="k8s-pod-network.64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" Workload="localhost-k8s-calico--apiserver--797c9d4dc7--nwcb6-eth0" Sep 12 00:38:47.311292 containerd[1615]: 2025-09-12 00:38:47.281 [INFO][6294] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 00:38:47.311292 containerd[1615]: 2025-09-12 00:38:47.299 [INFO][6287] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" Sep 12 00:38:47.322382 containerd[1615]: time="2025-09-12T00:38:47.322352054Z" level=info msg="TearDown network for sandbox \"64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f\" successfully" Sep 12 00:38:47.322382 containerd[1615]: time="2025-09-12T00:38:47.322380003Z" level=info msg="StopPodSandbox for \"64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f\" returns successfully" Sep 12 00:38:47.399029 containerd[1615]: time="2025-09-12T00:38:47.398546866Z" level=info msg="RemovePodSandbox for \"64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f\"" Sep 12 00:38:47.399029 containerd[1615]: time="2025-09-12T00:38:47.398598625Z" level=info msg="Forcibly stopping sandbox \"64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f\"" Sep 12 00:38:47.844967 containerd[1615]: 2025-09-12 00:38:47.631 [WARNING][6317] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" WorkloadEndpoint="localhost-k8s-calico--apiserver--797c9d4dc7--nwcb6-eth0" Sep 12 00:38:47.844967 containerd[1615]: 2025-09-12 00:38:47.631 [INFO][6317] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" Sep 12 00:38:47.844967 containerd[1615]: 2025-09-12 00:38:47.631 [INFO][6317] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" iface="eth0" netns="" Sep 12 00:38:47.844967 containerd[1615]: 2025-09-12 00:38:47.631 [INFO][6317] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" Sep 12 00:38:47.844967 containerd[1615]: 2025-09-12 00:38:47.631 [INFO][6317] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" Sep 12 00:38:47.844967 containerd[1615]: 2025-09-12 00:38:47.796 [INFO][6325] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" HandleID="k8s-pod-network.64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" Workload="localhost-k8s-calico--apiserver--797c9d4dc7--nwcb6-eth0" Sep 12 00:38:47.844967 containerd[1615]: 2025-09-12 00:38:47.797 [INFO][6325] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 00:38:47.844967 containerd[1615]: 2025-09-12 00:38:47.798 [INFO][6325] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 00:38:47.844967 containerd[1615]: 2025-09-12 00:38:47.840 [WARNING][6325] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" HandleID="k8s-pod-network.64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" Workload="localhost-k8s-calico--apiserver--797c9d4dc7--nwcb6-eth0"
Sep 12 00:38:47.844967 containerd[1615]: 2025-09-12 00:38:47.840 [INFO][6325] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" HandleID="k8s-pod-network.64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f" Workload="localhost-k8s-calico--apiserver--797c9d4dc7--nwcb6-eth0"
Sep 12 00:38:47.844967 containerd[1615]: 2025-09-12 00:38:47.841 [INFO][6325] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 12 00:38:47.844967 containerd[1615]: 2025-09-12 00:38:47.843 [INFO][6317] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f"
Sep 12 00:38:47.850097 containerd[1615]: time="2025-09-12T00:38:47.845043173Z" level=info msg="TearDown network for sandbox \"64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f\" successfully"
Sep 12 00:38:47.860267 containerd[1615]: time="2025-09-12T00:38:47.860224229Z" level=info msg="Ensure that sandbox 64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f in task-service has been cleanup successfully"
Sep 12 00:38:47.869084 containerd[1615]: time="2025-09-12T00:38:47.868983844Z" level=info msg="RemovePodSandbox \"64df168a678218789682ec9e6e85d1f1f8851222e88cb61544e01dad8f37ed4f\" returns successfully"
Sep 12 00:38:47.880152 containerd[1615]: time="2025-09-12T00:38:47.880121998Z" level=info msg="StopPodSandbox for \"78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14\""
Sep 12 00:38:47.940070 containerd[1615]: 2025-09-12 00:38:47.914 [WARNING][6339] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" WorkloadEndpoint="localhost-k8s-calico--apiserver--797c9d4dc7--9jk4g-eth0"
Sep 12 00:38:47.940070 containerd[1615]: 2025-09-12 00:38:47.914 [INFO][6339] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14"
Sep 12 00:38:47.940070 containerd[1615]: 2025-09-12 00:38:47.914 [INFO][6339] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" iface="eth0" netns=""
Sep 12 00:38:47.940070 containerd[1615]: 2025-09-12 00:38:47.914 [INFO][6339] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14"
Sep 12 00:38:47.940070 containerd[1615]: 2025-09-12 00:38:47.914 [INFO][6339] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14"
Sep 12 00:38:47.940070 containerd[1615]: 2025-09-12 00:38:47.930 [INFO][6347] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" HandleID="k8s-pod-network.78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" Workload="localhost-k8s-calico--apiserver--797c9d4dc7--9jk4g-eth0"
Sep 12 00:38:47.940070 containerd[1615]: 2025-09-12 00:38:47.930 [INFO][6347] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 12 00:38:47.940070 containerd[1615]: 2025-09-12 00:38:47.930 [INFO][6347] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 12 00:38:47.940070 containerd[1615]: 2025-09-12 00:38:47.935 [WARNING][6347] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" HandleID="k8s-pod-network.78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" Workload="localhost-k8s-calico--apiserver--797c9d4dc7--9jk4g-eth0"
Sep 12 00:38:47.940070 containerd[1615]: 2025-09-12 00:38:47.935 [INFO][6347] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" HandleID="k8s-pod-network.78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" Workload="localhost-k8s-calico--apiserver--797c9d4dc7--9jk4g-eth0"
Sep 12 00:38:47.940070 containerd[1615]: 2025-09-12 00:38:47.936 [INFO][6347] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 12 00:38:47.940070 containerd[1615]: 2025-09-12 00:38:47.938 [INFO][6339] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14"
Sep 12 00:38:47.960600 containerd[1615]: time="2025-09-12T00:38:47.940165873Z" level=info msg="TearDown network for sandbox \"78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14\" successfully"
Sep 12 00:38:47.960600 containerd[1615]: time="2025-09-12T00:38:47.940183825Z" level=info msg="StopPodSandbox for \"78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14\" returns successfully"
Sep 12 00:38:47.960600 containerd[1615]: time="2025-09-12T00:38:47.940639706Z" level=info msg="RemovePodSandbox for \"78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14\""
Sep 12 00:38:47.960600 containerd[1615]: time="2025-09-12T00:38:47.940659564Z" level=info msg="Forcibly stopping sandbox \"78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14\""
Sep 12 00:38:48.063337 containerd[1615]: 2025-09-12 00:38:48.022 [WARNING][6363] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" WorkloadEndpoint="localhost-k8s-calico--apiserver--797c9d4dc7--9jk4g-eth0"
Sep 12 00:38:48.063337 containerd[1615]: 2025-09-12 00:38:48.022 [INFO][6363] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14"
Sep 12 00:38:48.063337 containerd[1615]: 2025-09-12 00:38:48.022 [INFO][6363] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" iface="eth0" netns=""
Sep 12 00:38:48.063337 containerd[1615]: 2025-09-12 00:38:48.022 [INFO][6363] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14"
Sep 12 00:38:48.063337 containerd[1615]: 2025-09-12 00:38:48.022 [INFO][6363] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14"
Sep 12 00:38:48.063337 containerd[1615]: 2025-09-12 00:38:48.037 [INFO][6370] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" HandleID="k8s-pod-network.78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" Workload="localhost-k8s-calico--apiserver--797c9d4dc7--9jk4g-eth0"
Sep 12 00:38:48.063337 containerd[1615]: 2025-09-12 00:38:48.038 [INFO][6370] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 12 00:38:48.063337 containerd[1615]: 2025-09-12 00:38:48.038 [INFO][6370] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 12 00:38:48.063337 containerd[1615]: 2025-09-12 00:38:48.057 [WARNING][6370] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" HandleID="k8s-pod-network.78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" Workload="localhost-k8s-calico--apiserver--797c9d4dc7--9jk4g-eth0"
Sep 12 00:38:48.063337 containerd[1615]: 2025-09-12 00:38:48.057 [INFO][6370] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" HandleID="k8s-pod-network.78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14" Workload="localhost-k8s-calico--apiserver--797c9d4dc7--9jk4g-eth0"
Sep 12 00:38:48.063337 containerd[1615]: 2025-09-12 00:38:48.059 [INFO][6370] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 12 00:38:48.063337 containerd[1615]: 2025-09-12 00:38:48.061 [INFO][6363] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14"
Sep 12 00:38:48.065836 containerd[1615]: time="2025-09-12T00:38:48.064116465Z" level=info msg="TearDown network for sandbox \"78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14\" successfully"
Sep 12 00:38:48.068278 containerd[1615]: time="2025-09-12T00:38:48.068121665Z" level=info msg="Ensure that sandbox 78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14 in task-service has been cleanup successfully"
Sep 12 00:38:48.073267 containerd[1615]: time="2025-09-12T00:38:48.073218256Z" level=info msg="RemovePodSandbox \"78143af7d9d6deda2544acf1fbcad5ba14fef251f923a18e2d1512bc8171ca14\" returns successfully"
Sep 12 00:38:48.994513 sshd[6304]: Connection closed by 139.178.68.195 port 53620
Sep 12 00:38:48.994363 sshd-session[6301]: pam_unix(sshd:session): session closed for user core
Sep 12 00:38:49.001886 systemd[1]: sshd@13-139.178.70.108:22-139.178.68.195:53620.service: Deactivated successfully.
Sep 12 00:38:49.003891 systemd[1]: session-16.scope: Deactivated successfully.
Sep 12 00:38:49.004905 systemd-logind[1587]: Session 16 logged out. Waiting for processes to exit.
Sep 12 00:38:49.006730 systemd[1]: Started sshd@14-139.178.70.108:22-139.178.68.195:53630.service - OpenSSH per-connection server daemon (139.178.68.195:53630).
Sep 12 00:38:49.007762 systemd-logind[1587]: Removed session 16.
Sep 12 00:38:49.121011 sshd[6381]: Accepted publickey for core from 139.178.68.195 port 53630 ssh2: RSA SHA256:cc7+gY4pedxGskzWUQnkx6HLMvEKLPoZPuQgwUGPYHk
Sep 12 00:38:49.121997 sshd-session[6381]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 00:38:49.124923 systemd-logind[1587]: New session 17 of user core.
Sep 12 00:38:49.135444 systemd[1]: Started session-17.scope - Session 17 of User core.
Sep 12 00:38:50.047949 sshd[6383]: Connection closed by 139.178.68.195 port 53630
Sep 12 00:38:50.104509 sshd-session[6381]: pam_unix(sshd:session): session closed for user core
Sep 12 00:38:50.109825 systemd[1]: Started sshd@15-139.178.70.108:22-139.178.68.195:40362.service - OpenSSH per-connection server daemon (139.178.68.195:40362).
Sep 12 00:38:50.130584 systemd[1]: sshd@14-139.178.70.108:22-139.178.68.195:53630.service: Deactivated successfully.
Sep 12 00:38:50.132523 systemd[1]: session-17.scope: Deactivated successfully.
Sep 12 00:38:50.136593 systemd-logind[1587]: Session 17 logged out. Waiting for processes to exit.
Sep 12 00:38:50.138594 systemd-logind[1587]: Removed session 17.
Sep 12 00:38:50.260145 sshd[6390]: Accepted publickey for core from 139.178.68.195 port 40362 ssh2: RSA SHA256:cc7+gY4pedxGskzWUQnkx6HLMvEKLPoZPuQgwUGPYHk
Sep 12 00:38:50.262702 sshd-session[6390]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 00:38:50.268460 systemd-logind[1587]: New session 18 of user core.
Sep 12 00:38:50.274382 systemd[1]: Started session-18.scope - Session 18 of User core.
Sep 12 00:38:51.311427 sshd[6395]: Connection closed by 139.178.68.195 port 40362
Sep 12 00:38:51.312465 sshd-session[6390]: pam_unix(sshd:session): session closed for user core
Sep 12 00:38:51.322471 systemd[1]: sshd@15-139.178.70.108:22-139.178.68.195:40362.service: Deactivated successfully.
Sep 12 00:38:51.324483 systemd[1]: session-18.scope: Deactivated successfully.
Sep 12 00:38:51.325401 systemd-logind[1587]: Session 18 logged out. Waiting for processes to exit.
Sep 12 00:38:51.328804 systemd[1]: Started sshd@16-139.178.70.108:22-139.178.68.195:40364.service - OpenSSH per-connection server daemon (139.178.68.195:40364).
Sep 12 00:38:51.329547 systemd-logind[1587]: Removed session 18.
Sep 12 00:38:51.469645 sshd[6410]: Accepted publickey for core from 139.178.68.195 port 40364 ssh2: RSA SHA256:cc7+gY4pedxGskzWUQnkx6HLMvEKLPoZPuQgwUGPYHk
Sep 12 00:38:51.473428 sshd-session[6410]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 00:38:51.480288 systemd-logind[1587]: New session 19 of user core.
Sep 12 00:38:51.483377 systemd[1]: Started session-19.scope - Session 19 of User core.
Sep 12 00:38:52.665452 sshd[6413]: Connection closed by 139.178.68.195 port 40364
Sep 12 00:38:52.665870 sshd-session[6410]: pam_unix(sshd:session): session closed for user core
Sep 12 00:38:52.695111 systemd[1]: sshd@16-139.178.70.108:22-139.178.68.195:40364.service: Deactivated successfully.
Sep 12 00:38:52.698214 systemd[1]: session-19.scope: Deactivated successfully.
Sep 12 00:38:52.699410 systemd[1]: session-19.scope: Consumed 479ms CPU time, 68.5M memory peak.
Sep 12 00:38:52.702115 systemd-logind[1587]: Session 19 logged out. Waiting for processes to exit.
Sep 12 00:38:52.707338 systemd[1]: Started sshd@17-139.178.70.108:22-139.178.68.195:40366.service - OpenSSH per-connection server daemon (139.178.68.195:40366).
Sep 12 00:38:52.709873 systemd-logind[1587]: Removed session 19.
Sep 12 00:38:52.842267 sshd[6432]: Accepted publickey for core from 139.178.68.195 port 40366 ssh2: RSA SHA256:cc7+gY4pedxGskzWUQnkx6HLMvEKLPoZPuQgwUGPYHk
Sep 12 00:38:52.843948 sshd-session[6432]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 00:38:52.847714 systemd-logind[1587]: New session 20 of user core.
Sep 12 00:38:52.854431 systemd[1]: Started session-20.scope - Session 20 of User core.
Sep 12 00:38:53.096120 sshd[6434]: Connection closed by 139.178.68.195 port 40366
Sep 12 00:38:53.134500 sshd-session[6432]: pam_unix(sshd:session): session closed for user core
Sep 12 00:38:53.204701 systemd[1]: sshd@17-139.178.70.108:22-139.178.68.195:40366.service: Deactivated successfully.
Sep 12 00:38:53.206891 systemd[1]: session-20.scope: Deactivated successfully.
Sep 12 00:38:53.208206 systemd-logind[1587]: Session 20 logged out. Waiting for processes to exit.
Sep 12 00:38:53.208944 systemd-logind[1587]: Removed session 20.
Sep 12 00:38:57.172765 containerd[1615]: time="2025-09-12T00:38:57.172728583Z" level=info msg="TaskExit event in podsandbox handler container_id:\"baeba706ef8610b1da4c71a68aaaa9e8bb19702c4daf7338523b9f0bded9d6ee\" id:\"d971a6bbdc08040e221c9f83f9f6df72a1ccfdf8e70194b5f80fb0db513d8ea2\" pid:6463 exited_at:{seconds:1757637537 nanos:146047534}"
Sep 12 00:38:58.127777 systemd[1]: Started sshd@18-139.178.70.108:22-139.178.68.195:40376.service - OpenSSH per-connection server daemon (139.178.68.195:40376).
Sep 12 00:38:58.248785 sshd[6475]: Accepted publickey for core from 139.178.68.195 port 40376 ssh2: RSA SHA256:cc7+gY4pedxGskzWUQnkx6HLMvEKLPoZPuQgwUGPYHk
Sep 12 00:38:58.251663 sshd-session[6475]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 00:38:58.258005 systemd-logind[1587]: New session 21 of user core.
Sep 12 00:38:58.262345 systemd[1]: Started session-21.scope - Session 21 of User core.
Sep 12 00:38:58.895658 sshd[6477]: Connection closed by 139.178.68.195 port 40376
Sep 12 00:38:58.895295 sshd-session[6475]: pam_unix(sshd:session): session closed for user core
Sep 12 00:38:58.897678 systemd[1]: sshd@18-139.178.70.108:22-139.178.68.195:40376.service: Deactivated successfully.
Sep 12 00:38:58.899032 systemd[1]: session-21.scope: Deactivated successfully.
Sep 12 00:38:58.902091 systemd-logind[1587]: Session 21 logged out. Waiting for processes to exit.
Sep 12 00:38:58.902917 systemd-logind[1587]: Removed session 21.
Sep 12 00:39:03.909423 systemd[1]: Started sshd@19-139.178.70.108:22-139.178.68.195:46814.service - OpenSSH per-connection server daemon (139.178.68.195:46814).
Sep 12 00:39:04.062442 sshd[6505]: Accepted publickey for core from 139.178.68.195 port 46814 ssh2: RSA SHA256:cc7+gY4pedxGskzWUQnkx6HLMvEKLPoZPuQgwUGPYHk
Sep 12 00:39:04.063939 sshd-session[6505]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 00:39:04.068295 systemd-logind[1587]: New session 22 of user core.
Sep 12 00:39:04.074553 systemd[1]: Started session-22.scope - Session 22 of User core.
Sep 12 00:39:04.496537 sshd[6507]: Connection closed by 139.178.68.195 port 46814
Sep 12 00:39:04.497413 sshd-session[6505]: pam_unix(sshd:session): session closed for user core
Sep 12 00:39:04.501017 systemd[1]: sshd@19-139.178.70.108:22-139.178.68.195:46814.service: Deactivated successfully.
Sep 12 00:39:04.503239 systemd[1]: session-22.scope: Deactivated successfully.
Sep 12 00:39:04.504002 systemd-logind[1587]: Session 22 logged out. Waiting for processes to exit.
Sep 12 00:39:04.505781 systemd-logind[1587]: Removed session 22.