Aug 13 00:31:30.700997 kernel: Linux version 6.12.40-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue Aug 12 21:42:48 -00 2025 Aug 13 00:31:30.701014 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=215bdedb8de38f6b96ec4f9db80853e25015f60454b867e319fdcb9244320a21 Aug 13 00:31:30.701020 kernel: Disabled fast string operations Aug 13 00:31:30.701025 kernel: BIOS-provided physical RAM map: Aug 13 00:31:30.701029 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable Aug 13 00:31:30.701033 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved Aug 13 00:31:30.701039 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved Aug 13 00:31:30.701045 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable Aug 13 00:31:30.701053 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data Aug 13 00:31:30.701060 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS Aug 13 00:31:30.701064 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable Aug 13 00:31:30.701069 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved Aug 13 00:31:30.701073 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved Aug 13 00:31:30.701077 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved Aug 13 00:31:30.701084 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved Aug 13 00:31:30.701089 kernel: NX (Execute Disable) protection: active Aug 13 00:31:30.701094 kernel: APIC: Static calls initialized Aug 13 00:31:30.701099 kernel: 
SMBIOS 2.7 present. Aug 13 00:31:30.701104 kernel: DMI: VMware, Inc. VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020 Aug 13 00:31:30.701109 kernel: DMI: Memory slots populated: 1/128 Aug 13 00:31:30.701115 kernel: vmware: hypercall mode: 0x00 Aug 13 00:31:30.701120 kernel: Hypervisor detected: VMware Aug 13 00:31:30.701125 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz Aug 13 00:31:30.701130 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz Aug 13 00:31:30.701135 kernel: vmware: using clock offset of 3264835896 ns Aug 13 00:31:30.701140 kernel: tsc: Detected 3408.000 MHz processor Aug 13 00:31:30.701156 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Aug 13 00:31:30.701163 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Aug 13 00:31:30.701168 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000 Aug 13 00:31:30.701173 kernel: total RAM covered: 3072M Aug 13 00:31:30.701180 kernel: Found optimal setting for mtrr clean up Aug 13 00:31:30.701186 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G Aug 13 00:31:30.701191 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs Aug 13 00:31:30.701197 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Aug 13 00:31:30.701202 kernel: Using GB pages for direct mapping Aug 13 00:31:30.701207 kernel: ACPI: Early table checksum verification disabled Aug 13 00:31:30.701212 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD ) Aug 13 00:31:30.701217 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272) Aug 13 00:31:30.701222 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240) Aug 13 00:31:30.701228 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001) Aug 13 00:31:30.701235 kernel: ACPI: FACS 0x000000007FEFFFC0 000040 Aug 13 00:31:30.701240 kernel: ACPI: FACS 
0x000000007FEFFFC0 000040 Aug 13 00:31:30.701245 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001) Aug 13 00:31:30.701250 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? APIC 06040000 LTP 00000000) Aug 13 00:31:30.701260 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001) Aug 13 00:31:30.701267 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001) Aug 13 00:31:30.701272 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001) Aug 13 00:31:30.701278 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001) Aug 13 00:31:30.701283 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66] Aug 13 00:31:30.701288 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72] Aug 13 00:31:30.701294 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff] Aug 13 00:31:30.701299 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff] Aug 13 00:31:30.701304 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54] Aug 13 00:31:30.701309 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c] Aug 13 00:31:30.701316 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea] Aug 13 00:31:30.701321 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe] Aug 13 00:31:30.701326 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756] Aug 13 00:31:30.701331 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e] Aug 13 00:31:30.701336 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff] Aug 13 00:31:30.701342 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff] Aug 13 00:31:30.701347 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug Aug 13 00:31:30.701353 kernel: NUMA: Node 0 [mem 0x00001000-0x0009ffff] + [mem 
0x00100000-0x7fffffff] -> [mem 0x00001000-0x7fffffff] Aug 13 00:31:30.701358 kernel: NODE_DATA(0) allocated [mem 0x7fff8dc0-0x7fffffff] Aug 13 00:31:30.701364 kernel: Zone ranges: Aug 13 00:31:30.701369 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Aug 13 00:31:30.701375 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff] Aug 13 00:31:30.701380 kernel: Normal empty Aug 13 00:31:30.701385 kernel: Device empty Aug 13 00:31:30.701393 kernel: Movable zone start for each node Aug 13 00:31:30.701402 kernel: Early memory node ranges Aug 13 00:31:30.701411 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff] Aug 13 00:31:30.701419 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff] Aug 13 00:31:30.701429 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff] Aug 13 00:31:30.701435 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff] Aug 13 00:31:30.701440 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Aug 13 00:31:30.701445 kernel: On node 0, zone DMA: 98 pages in unavailable ranges Aug 13 00:31:30.701453 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges Aug 13 00:31:30.701461 kernel: ACPI: PM-Timer IO Port: 0x1008 Aug 13 00:31:30.701470 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1]) Aug 13 00:31:30.701479 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1]) Aug 13 00:31:30.701487 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1]) Aug 13 00:31:30.701496 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1]) Aug 13 00:31:30.701506 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1]) Aug 13 00:31:30.701513 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1]) Aug 13 00:31:30.701520 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1]) Aug 13 00:31:30.701528 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1]) Aug 13 00:31:30.701537 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1]) Aug 13 00:31:30.701545 kernel: ACPI: LAPIC_NMI 
(acpi_id[0x09] high edge lint[0x1]) Aug 13 00:31:30.701553 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1]) Aug 13 00:31:30.701562 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1]) Aug 13 00:31:30.701568 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1]) Aug 13 00:31:30.701575 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1]) Aug 13 00:31:30.701581 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1]) Aug 13 00:31:30.701589 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1]) Aug 13 00:31:30.701598 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1]) Aug 13 00:31:30.701606 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1]) Aug 13 00:31:30.701613 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1]) Aug 13 00:31:30.701618 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1]) Aug 13 00:31:30.701624 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1]) Aug 13 00:31:30.701629 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1]) Aug 13 00:31:30.701634 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1]) Aug 13 00:31:30.701641 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1]) Aug 13 00:31:30.701646 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1]) Aug 13 00:31:30.701653 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1]) Aug 13 00:31:30.701658 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1]) Aug 13 00:31:30.701663 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1]) Aug 13 00:31:30.701669 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1]) Aug 13 00:31:30.701674 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1]) Aug 13 00:31:30.701679 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1]) Aug 13 00:31:30.701684 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1]) Aug 13 00:31:30.701689 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1]) Aug 13 00:31:30.701696 kernel: ACPI: LAPIC_NMI 
(acpi_id[0x21] high edge lint[0x1]) Aug 13 00:31:30.701701 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1]) Aug 13 00:31:30.701706 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1]) Aug 13 00:31:30.701711 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1]) Aug 13 00:31:30.701716 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1]) Aug 13 00:31:30.701721 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1]) Aug 13 00:31:30.701728 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1]) Aug 13 00:31:30.701737 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1]) Aug 13 00:31:30.701742 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1]) Aug 13 00:31:30.701747 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1]) Aug 13 00:31:30.701754 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1]) Aug 13 00:31:30.701760 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1]) Aug 13 00:31:30.701766 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1]) Aug 13 00:31:30.701771 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1]) Aug 13 00:31:30.701777 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1]) Aug 13 00:31:30.701782 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1]) Aug 13 00:31:30.701788 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1]) Aug 13 00:31:30.701794 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1]) Aug 13 00:31:30.701801 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1]) Aug 13 00:31:30.701808 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1]) Aug 13 00:31:30.701814 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1]) Aug 13 00:31:30.701819 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1]) Aug 13 00:31:30.701825 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1]) Aug 13 00:31:30.701833 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1]) Aug 13 00:31:30.701842 kernel: ACPI: LAPIC_NMI 
(acpi_id[0x39] high edge lint[0x1]) Aug 13 00:31:30.701852 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1]) Aug 13 00:31:30.701861 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1]) Aug 13 00:31:30.701870 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1]) Aug 13 00:31:30.701881 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1]) Aug 13 00:31:30.701891 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1]) Aug 13 00:31:30.701896 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1]) Aug 13 00:31:30.701902 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1]) Aug 13 00:31:30.701907 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1]) Aug 13 00:31:30.701913 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1]) Aug 13 00:31:30.701918 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1]) Aug 13 00:31:30.704178 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1]) Aug 13 00:31:30.704185 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1]) Aug 13 00:31:30.704194 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1]) Aug 13 00:31:30.704199 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1]) Aug 13 00:31:30.704205 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1]) Aug 13 00:31:30.704210 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1]) Aug 13 00:31:30.704216 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1]) Aug 13 00:31:30.704221 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1]) Aug 13 00:31:30.704227 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1]) Aug 13 00:31:30.704232 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1]) Aug 13 00:31:30.704238 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1]) Aug 13 00:31:30.704243 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1]) Aug 13 00:31:30.704250 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1]) Aug 13 00:31:30.704256 kernel: ACPI: LAPIC_NMI 
(acpi_id[0x51] high edge lint[0x1]) Aug 13 00:31:30.704261 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1]) Aug 13 00:31:30.704267 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1]) Aug 13 00:31:30.704273 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1]) Aug 13 00:31:30.704278 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1]) Aug 13 00:31:30.704284 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1]) Aug 13 00:31:30.704289 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1]) Aug 13 00:31:30.704295 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1]) Aug 13 00:31:30.704300 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1]) Aug 13 00:31:30.704307 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1]) Aug 13 00:31:30.704313 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1]) Aug 13 00:31:30.704318 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1]) Aug 13 00:31:30.704324 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1]) Aug 13 00:31:30.704329 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1]) Aug 13 00:31:30.704334 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1]) Aug 13 00:31:30.704340 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1]) Aug 13 00:31:30.704346 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1]) Aug 13 00:31:30.704352 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1]) Aug 13 00:31:30.704357 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1]) Aug 13 00:31:30.704364 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1]) Aug 13 00:31:30.704369 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1]) Aug 13 00:31:30.704376 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1]) Aug 13 00:31:30.704381 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1]) Aug 13 00:31:30.704386 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1]) Aug 13 00:31:30.704392 kernel: ACPI: LAPIC_NMI 
(acpi_id[0x69] high edge lint[0x1]) Aug 13 00:31:30.704397 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1]) Aug 13 00:31:30.704403 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1]) Aug 13 00:31:30.704408 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1]) Aug 13 00:31:30.704414 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1]) Aug 13 00:31:30.704421 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1]) Aug 13 00:31:30.704426 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1]) Aug 13 00:31:30.704432 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1]) Aug 13 00:31:30.704442 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1]) Aug 13 00:31:30.704448 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1]) Aug 13 00:31:30.704453 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1]) Aug 13 00:31:30.704459 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1]) Aug 13 00:31:30.704464 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1]) Aug 13 00:31:30.704470 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1]) Aug 13 00:31:30.704475 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1]) Aug 13 00:31:30.704487 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1]) Aug 13 00:31:30.704494 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1]) Aug 13 00:31:30.704500 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1]) Aug 13 00:31:30.704505 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1]) Aug 13 00:31:30.704511 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1]) Aug 13 00:31:30.704521 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1]) Aug 13 00:31:30.704528 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1]) Aug 13 00:31:30.704533 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1]) Aug 13 00:31:30.704539 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23 Aug 13 00:31:30.704547 kernel: ACPI: 
INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge) Aug 13 00:31:30.704556 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Aug 13 00:31:30.704562 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000 Aug 13 00:31:30.704568 kernel: TSC deadline timer available Aug 13 00:31:30.704574 kernel: CPU topo: Max. logical packages: 128 Aug 13 00:31:30.704579 kernel: CPU topo: Max. logical dies: 128 Aug 13 00:31:30.704588 kernel: CPU topo: Max. dies per package: 1 Aug 13 00:31:30.704594 kernel: CPU topo: Max. threads per core: 1 Aug 13 00:31:30.704599 kernel: CPU topo: Num. cores per package: 1 Aug 13 00:31:30.704605 kernel: CPU topo: Num. threads per package: 1 Aug 13 00:31:30.704612 kernel: CPU topo: Allowing 2 present CPUs plus 126 hotplug CPUs Aug 13 00:31:30.704623 kernel: [mem 0x80000000-0xefffffff] available for PCI devices Aug 13 00:31:30.704629 kernel: Booting paravirtualized kernel on VMware hypervisor Aug 13 00:31:30.704636 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Aug 13 00:31:30.704642 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1 Aug 13 00:31:30.704652 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144 Aug 13 00:31:30.704659 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152 Aug 13 00:31:30.704665 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007 Aug 13 00:31:30.704670 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015 Aug 13 00:31:30.704682 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023 Aug 13 00:31:30.704689 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031 Aug 13 00:31:30.704695 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039 Aug 13 00:31:30.704700 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047 Aug 13 00:31:30.704706 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055 Aug 13 00:31:30.704711 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063 Aug 13 
00:31:30.704717 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071 Aug 13 00:31:30.704722 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079 Aug 13 00:31:30.704728 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087 Aug 13 00:31:30.704735 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095 Aug 13 00:31:30.704741 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103 Aug 13 00:31:30.704747 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111 Aug 13 00:31:30.704752 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119 Aug 13 00:31:30.704758 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127 Aug 13 00:31:30.704765 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=215bdedb8de38f6b96ec4f9db80853e25015f60454b867e319fdcb9244320a21 Aug 13 00:31:30.704771 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Aug 13 00:31:30.704778 kernel: random: crng init done Aug 13 00:31:30.704783 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes Aug 13 00:31:30.704789 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes Aug 13 00:31:30.704795 kernel: printk: log_buf_len min size: 262144 bytes Aug 13 00:31:30.704801 kernel: printk: log_buf_len: 1048576 bytes Aug 13 00:31:30.704806 kernel: printk: early log buf free: 245576(93%) Aug 13 00:31:30.704812 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Aug 13 00:31:30.704818 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear) Aug 13 00:31:30.704824 kernel: Fallback order for Node 0: 0 Aug 13 00:31:30.704830 kernel: Built 1 zonelists, mobility grouping on. 
Total pages: 524157 Aug 13 00:31:30.704837 kernel: Policy zone: DMA32 Aug 13 00:31:30.704842 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Aug 13 00:31:30.704848 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1 Aug 13 00:31:30.704854 kernel: ftrace: allocating 40098 entries in 157 pages Aug 13 00:31:30.704860 kernel: ftrace: allocated 157 pages with 5 groups Aug 13 00:31:30.704865 kernel: Dynamic Preempt: voluntary Aug 13 00:31:30.704871 kernel: rcu: Preemptible hierarchical RCU implementation. Aug 13 00:31:30.704877 kernel: rcu: RCU event tracing is enabled. Aug 13 00:31:30.704883 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128. Aug 13 00:31:30.704890 kernel: Trampoline variant of Tasks RCU enabled. Aug 13 00:31:30.704896 kernel: Rude variant of Tasks RCU enabled. Aug 13 00:31:30.704901 kernel: Tracing variant of Tasks RCU enabled. Aug 13 00:31:30.704907 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Aug 13 00:31:30.704913 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128 Aug 13 00:31:30.704918 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Aug 13 00:31:30.704924 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Aug 13 00:31:30.704930 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Aug 13 00:31:30.704935 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16 Aug 13 00:31:30.704942 kernel: rcu: srcu_init: Setting srcu_struct sizes to big. 
Aug 13 00:31:30.704948 kernel: Console: colour VGA+ 80x25 Aug 13 00:31:30.704954 kernel: printk: legacy console [tty0] enabled Aug 13 00:31:30.704959 kernel: printk: legacy console [ttyS0] enabled Aug 13 00:31:30.704965 kernel: ACPI: Core revision 20240827 Aug 13 00:31:30.704971 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns Aug 13 00:31:30.704976 kernel: APIC: Switch to symmetric I/O mode setup Aug 13 00:31:30.704982 kernel: x2apic enabled Aug 13 00:31:30.704988 kernel: APIC: Switched APIC routing to: physical x2apic Aug 13 00:31:30.704995 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Aug 13 00:31:30.705001 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Aug 13 00:31:30.705006 kernel: Calibrating delay loop (skipped) preset value.. 6816.00 BogoMIPS (lpj=3408000) Aug 13 00:31:30.705012 kernel: Disabled fast string operations Aug 13 00:31:30.705018 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8 Aug 13 00:31:30.705023 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4 Aug 13 00:31:30.705029 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Aug 13 00:31:30.705035 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit Aug 13 00:31:30.705041 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Aug 13 00:31:30.705047 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT Aug 13 00:31:30.705053 kernel: RETBleed: Mitigation: Enhanced IBRS Aug 13 00:31:30.705059 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Aug 13 00:31:30.705065 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Aug 13 00:31:30.705071 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode Aug 13 00:31:30.705076 kernel: SRBDS: Unknown: Dependent on hypervisor 
status Aug 13 00:31:30.705082 kernel: GDS: Unknown: Dependent on hypervisor status Aug 13 00:31:30.705088 kernel: ITS: Mitigation: Aligned branch/return thunks Aug 13 00:31:30.705094 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Aug 13 00:31:30.705101 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Aug 13 00:31:30.705115 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Aug 13 00:31:30.705123 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Aug 13 00:31:30.705129 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format. Aug 13 00:31:30.705135 kernel: Freeing SMP alternatives memory: 32K Aug 13 00:31:30.705141 kernel: pid_max: default: 131072 minimum: 1024 Aug 13 00:31:30.705156 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Aug 13 00:31:30.705163 kernel: landlock: Up and running. Aug 13 00:31:30.705169 kernel: SELinux: Initializing. Aug 13 00:31:30.705177 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Aug 13 00:31:30.705183 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Aug 13 00:31:30.705188 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd) Aug 13 00:31:30.705194 kernel: Performance Events: Skylake events, core PMU driver. 
Aug 13 00:31:30.705207 kernel: core: CPUID marked event: 'cpu cycles' unavailable Aug 13 00:31:30.705214 kernel: core: CPUID marked event: 'instructions' unavailable Aug 13 00:31:30.705219 kernel: core: CPUID marked event: 'bus cycles' unavailable Aug 13 00:31:30.705242 kernel: core: CPUID marked event: 'cache references' unavailable Aug 13 00:31:30.705250 kernel: core: CPUID marked event: 'cache misses' unavailable Aug 13 00:31:30.705256 kernel: core: CPUID marked event: 'branch instructions' unavailable Aug 13 00:31:30.705261 kernel: core: CPUID marked event: 'branch misses' unavailable Aug 13 00:31:30.705267 kernel: ... version: 1 Aug 13 00:31:30.705273 kernel: ... bit width: 48 Aug 13 00:31:30.705279 kernel: ... generic registers: 4 Aug 13 00:31:30.705284 kernel: ... value mask: 0000ffffffffffff Aug 13 00:31:30.705290 kernel: ... max period: 000000007fffffff Aug 13 00:31:30.705296 kernel: ... fixed-purpose events: 0 Aug 13 00:31:30.705303 kernel: ... event mask: 000000000000000f Aug 13 00:31:30.705309 kernel: signal: max sigframe size: 1776 Aug 13 00:31:30.705315 kernel: rcu: Hierarchical SRCU implementation. Aug 13 00:31:30.705321 kernel: rcu: Max phase no-delay instances is 400. Aug 13 00:31:30.705326 kernel: Timer migration: 3 hierarchy levels; 8 children per group; 3 crossnode level Aug 13 00:31:30.705332 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Aug 13 00:31:30.705338 kernel: smp: Bringing up secondary CPUs ... Aug 13 00:31:30.705344 kernel: smpboot: x86: Booting SMP configuration: Aug 13 00:31:30.705349 kernel: .... 
node #0, CPUs: #1 Aug 13 00:31:30.705356 kernel: Disabled fast string operations Aug 13 00:31:30.705362 kernel: smp: Brought up 1 node, 2 CPUs Aug 13 00:31:30.705367 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS) Aug 13 00:31:30.705374 kernel: Memory: 1924260K/2096628K available (14336K kernel code, 2430K rwdata, 9960K rodata, 54444K init, 2524K bss, 160984K reserved, 0K cma-reserved) Aug 13 00:31:30.705379 kernel: devtmpfs: initialized Aug 13 00:31:30.705385 kernel: x86/mm: Memory block size: 128MB Aug 13 00:31:30.705391 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes) Aug 13 00:31:30.705396 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Aug 13 00:31:30.705402 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear) Aug 13 00:31:30.705409 kernel: pinctrl core: initialized pinctrl subsystem Aug 13 00:31:30.705415 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Aug 13 00:31:30.705420 kernel: audit: initializing netlink subsys (disabled) Aug 13 00:31:30.705426 kernel: audit: type=2000 audit(1755045087.274:1): state=initialized audit_enabled=0 res=1 Aug 13 00:31:30.705432 kernel: thermal_sys: Registered thermal governor 'step_wise' Aug 13 00:31:30.705438 kernel: thermal_sys: Registered thermal governor 'user_space' Aug 13 00:31:30.705444 kernel: cpuidle: using governor menu Aug 13 00:31:30.705449 kernel: Simple Boot Flag at 0x36 set to 0x80 Aug 13 00:31:30.705455 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Aug 13 00:31:30.705460 kernel: dca service started, version 1.12.1 Aug 13 00:31:30.705470 kernel: PCI: ECAM [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) for domain 0000 [bus 00-7f] Aug 13 00:31:30.705489 kernel: PCI: Using configuration type 1 for base access Aug 13 00:31:30.705500 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
Aug 13 00:31:30.705508 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Aug 13 00:31:30.705514 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Aug 13 00:31:30.705520 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Aug 13 00:31:30.705526 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Aug 13 00:31:30.705532 kernel: ACPI: Added _OSI(Module Device) Aug 13 00:31:30.705540 kernel: ACPI: Added _OSI(Processor Device) Aug 13 00:31:30.705546 kernel: ACPI: Added _OSI(Processor Aggregator Device) Aug 13 00:31:30.705552 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Aug 13 00:31:30.705558 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored Aug 13 00:31:30.705564 kernel: ACPI: Interpreter enabled Aug 13 00:31:30.705570 kernel: ACPI: PM: (supports S0 S1 S5) Aug 13 00:31:30.705576 kernel: ACPI: Using IOAPIC for interrupt routing Aug 13 00:31:30.705582 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Aug 13 00:31:30.705588 kernel: PCI: Using E820 reservations for host bridge windows Aug 13 00:31:30.705595 kernel: ACPI: Enabled 4 GPEs in block 00 to 0F Aug 13 00:31:30.705601 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f]) Aug 13 00:31:30.705701 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Aug 13 00:31:30.705769 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR] Aug 13 00:31:30.705820 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] Aug 13 00:31:30.705828 kernel: PCI host bridge to bus 0000:00 Aug 13 00:31:30.705879 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Aug 13 00:31:30.705927 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window] Aug 13 00:31:30.705971 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Aug 13 00:31:30.706014 
kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Aug 13 00:31:30.706056 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window]
Aug 13 00:31:30.706099 kernel: pci_bus 0000:00: root bus resource [bus 00-7f]
Aug 13 00:31:30.706203 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 conventional PCI endpoint
Aug 13 00:31:30.706273 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 conventional PCI bridge
Aug 13 00:31:30.706332 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Aug 13 00:31:30.706389 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 conventional PCI endpoint
Aug 13 00:31:30.706443 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a conventional PCI endpoint
Aug 13 00:31:30.706496 kernel: pci 0000:00:07.1: BAR 4 [io 0x1060-0x106f]
Aug 13 00:31:30.706545 kernel: pci 0000:00:07.1: BAR 0 [io 0x01f0-0x01f7]: legacy IDE quirk
Aug 13 00:31:30.706594 kernel: pci 0000:00:07.1: BAR 1 [io 0x03f6]: legacy IDE quirk
Aug 13 00:31:30.706642 kernel: pci 0000:00:07.1: BAR 2 [io 0x0170-0x0177]: legacy IDE quirk
Aug 13 00:31:30.706691 kernel: pci 0000:00:07.1: BAR 3 [io 0x0376]: legacy IDE quirk
Aug 13 00:31:30.706743 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Aug 13 00:31:30.706793 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed by PIIX4 ACPI
Aug 13 00:31:30.706848 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB
Aug 13 00:31:30.707352 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000 conventional PCI endpoint
Aug 13 00:31:30.708234 kernel: pci 0000:00:07.7: BAR 0 [io 0x1080-0x10bf]
Aug 13 00:31:30.708299 kernel: pci 0000:00:07.7: BAR 1 [mem 0xfebfe000-0xfebfffff 64bit]
Aug 13 00:31:30.708359 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 conventional PCI endpoint
Aug 13 00:31:30.708411 kernel: pci 0000:00:0f.0: BAR 0 [io 0x1070-0x107f]
Aug 13 00:31:30.708464 kernel: pci 0000:00:0f.0: BAR 1 [mem
0xe8000000-0xefffffff pref]
Aug 13 00:31:30.708514 kernel: pci 0000:00:0f.0: BAR 2 [mem 0xfe000000-0xfe7fffff]
Aug 13 00:31:30.708563 kernel: pci 0000:00:0f.0: ROM [mem 0x00000000-0x00007fff pref]
Aug 13 00:31:30.708611 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Aug 13 00:31:30.708665 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 conventional PCI bridge
Aug 13 00:31:30.708714 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode)
Aug 13 00:31:30.708762 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff]
Aug 13 00:31:30.708813 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff]
Aug 13 00:31:30.708862 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref]
Aug 13 00:31:30.708915 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Aug 13 00:31:30.708965 kernel: pci 0000:00:15.0: PCI bridge to [bus 03]
Aug 13 00:31:30.709014 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff]
Aug 13 00:31:30.709063 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff]
Aug 13 00:31:30.709112 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold
Aug 13 00:31:30.709183 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Aug 13 00:31:30.709236 kernel: pci 0000:00:15.1: PCI bridge to [bus 04]
Aug 13 00:31:30.709287 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff]
Aug 13 00:31:30.709336 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff]
Aug 13 00:31:30.709385 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref]
Aug 13 00:31:30.709434 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold
Aug 13 00:31:30.709507 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Aug 13 00:31:30.709565 kernel: pci 0000:00:15.2: PCI bridge to [bus 05]
Aug 13 00:31:30.709615 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff]
Aug 13 00:31:30.709664 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff]
Aug 13 00:31:30.709713 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref]
Aug 13 00:31:30.709762 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold
Aug 13 00:31:30.709816 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Aug 13 00:31:30.709869 kernel: pci 0000:00:15.3: PCI bridge to [bus 06]
Aug 13 00:31:30.709920 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff]
Aug 13 00:31:30.709970 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref]
Aug 13 00:31:30.710019 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold
Aug 13 00:31:30.710072 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Aug 13 00:31:30.710122 kernel: pci 0000:00:15.4: PCI bridge to [bus 07]
Aug 13 00:31:30.710864 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff]
Aug 13 00:31:30.710930 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref]
Aug 13 00:31:30.710984 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold
Aug 13 00:31:30.711048 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Aug 13 00:31:30.711111 kernel: pci 0000:00:15.5: PCI bridge to [bus 08]
Aug 13 00:31:30.711201 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff]
Aug 13 00:31:30.711266 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref]
Aug 13 00:31:30.711319 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot D3cold
Aug 13 00:31:30.711384 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Aug 13 00:31:30.711457 kernel: pci 0000:00:15.6: PCI bridge to [bus 09]
Aug 13 00:31:30.711512 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff]
Aug 13 00:31:30.711562 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref]
Aug 13
00:31:30.711612 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold
Aug 13 00:31:30.711666 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Aug 13 00:31:30.711716 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a]
Aug 13 00:31:30.711765 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff]
Aug 13 00:31:30.711817 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref]
Aug 13 00:31:30.711865 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold
Aug 13 00:31:30.711919 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Aug 13 00:31:30.711974 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b]
Aug 13 00:31:30.712027 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff]
Aug 13 00:31:30.712077 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff]
Aug 13 00:31:30.713507 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold
Aug 13 00:31:30.713576 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Aug 13 00:31:30.713630 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c]
Aug 13 00:31:30.713681 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff]
Aug 13 00:31:30.713731 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff]
Aug 13 00:31:30.713780 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref]
Aug 13 00:31:30.713830 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold
Aug 13 00:31:30.713886 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Aug 13 00:31:30.713940 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d]
Aug 13 00:31:30.713991 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff]
Aug 13 00:31:30.714041 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff]
Aug 13 00:31:30.714090 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref]
Aug 13 00:31:30.714139 kernel: pci 0000:00:16.2: PME# supported
from D0 D3hot D3cold
Aug 13 00:31:30.714206 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Aug 13 00:31:30.714274 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e]
Aug 13 00:31:30.714335 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff]
Aug 13 00:31:30.714386 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref]
Aug 13 00:31:30.714437 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold
Aug 13 00:31:30.714497 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Aug 13 00:31:30.714713 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f]
Aug 13 00:31:30.715215 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff]
Aug 13 00:31:30.715270 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref]
Aug 13 00:31:30.715324 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold
Aug 13 00:31:30.715378 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Aug 13 00:31:30.715429 kernel: pci 0000:00:16.5: PCI bridge to [bus 10]
Aug 13 00:31:30.715480 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff]
Aug 13 00:31:30.715529 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref]
Aug 13 00:31:30.715579 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold
Aug 13 00:31:30.715636 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Aug 13 00:31:30.715689 kernel: pci 0000:00:16.6: PCI bridge to [bus 11]
Aug 13 00:31:30.715739 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff]
Aug 13 00:31:30.715789 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref]
Aug 13 00:31:30.715840 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold
Aug 13 00:31:30.715894 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Aug 13 00:31:30.715944 kernel: pci 0000:00:16.7: PCI bridge to [bus 12]
Aug 13
00:31:30.715994 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff]
Aug 13 00:31:30.716044 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref]
Aug 13 00:31:30.716096 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold
Aug 13 00:31:30.718161 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Aug 13 00:31:30.718224 kernel: pci 0000:00:17.0: PCI bridge to [bus 13]
Aug 13 00:31:30.718276 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff]
Aug 13 00:31:30.718326 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff]
Aug 13 00:31:30.718376 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref]
Aug 13 00:31:30.718426 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold
Aug 13 00:31:30.718485 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Aug 13 00:31:30.718536 kernel: pci 0000:00:17.1: PCI bridge to [bus 14]
Aug 13 00:31:30.718586 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff]
Aug 13 00:31:30.718635 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff]
Aug 13 00:31:30.718687 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref]
Aug 13 00:31:30.718736 kernel: pci 0000:00:17.1: PME# supported from D0 D3hot D3cold
Aug 13 00:31:30.718793 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Aug 13 00:31:30.718860 kernel: pci 0000:00:17.2: PCI bridge to [bus 15]
Aug 13 00:31:30.718910 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff]
Aug 13 00:31:30.718959 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff]
Aug 13 00:31:30.719008 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref]
Aug 13 00:31:30.719060 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold
Aug 13 00:31:30.719115 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Aug 13 00:31:30.719183 kernel: pci
0000:00:17.3: PCI bridge to [bus 16]
Aug 13 00:31:30.719234 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff]
Aug 13 00:31:30.719283 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref]
Aug 13 00:31:30.719332 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold
Aug 13 00:31:30.719385 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Aug 13 00:31:30.719437 kernel: pci 0000:00:17.4: PCI bridge to [bus 17]
Aug 13 00:31:30.719485 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff]
Aug 13 00:31:30.719534 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref]
Aug 13 00:31:30.719583 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold
Aug 13 00:31:30.719636 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Aug 13 00:31:30.719685 kernel: pci 0000:00:17.5: PCI bridge to [bus 18]
Aug 13 00:31:30.719734 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff]
Aug 13 00:31:30.719785 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref]
Aug 13 00:31:30.719834 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold
Aug 13 00:31:30.719888 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Aug 13 00:31:30.719938 kernel: pci 0000:00:17.6: PCI bridge to [bus 19]
Aug 13 00:31:30.719987 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff]
Aug 13 00:31:30.720036 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref]
Aug 13 00:31:30.720084 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold
Aug 13 00:31:30.720141 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Aug 13 00:31:30.721225 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a]
Aug 13 00:31:30.721278 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff]
Aug 13 00:31:30.721329 kernel: pci 0000:00:17.7: bridge window [mem
0xe5e00000-0xe5efffff 64bit pref]
Aug 13 00:31:30.721379 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold
Aug 13 00:31:30.721434 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Aug 13 00:31:30.721486 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b]
Aug 13 00:31:30.721540 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff]
Aug 13 00:31:30.721591 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff]
Aug 13 00:31:30.721642 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref]
Aug 13 00:31:30.721692 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold
Aug 13 00:31:30.721746 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Aug 13 00:31:30.721796 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c]
Aug 13 00:31:30.721847 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff]
Aug 13 00:31:30.721899 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff]
Aug 13 00:31:30.721949 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref]
Aug 13 00:31:30.721998 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold
Aug 13 00:31:30.722053 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Aug 13 00:31:30.722104 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d]
Aug 13 00:31:30.723187 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff]
Aug 13 00:31:30.723243 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref]
Aug 13 00:31:30.723297 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold
Aug 13 00:31:30.723351 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Aug 13 00:31:30.723402 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e]
Aug 13 00:31:30.723452 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff]
Aug 13 00:31:30.723502 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref]
Aug
13 00:31:30.723552 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold
Aug 13 00:31:30.723606 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Aug 13 00:31:30.723659 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f]
Aug 13 00:31:30.723709 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff]
Aug 13 00:31:30.723759 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref]
Aug 13 00:31:30.723809 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold
Aug 13 00:31:30.723866 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Aug 13 00:31:30.723917 kernel: pci 0000:00:18.5: PCI bridge to [bus 20]
Aug 13 00:31:30.723966 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff]
Aug 13 00:31:30.724018 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref]
Aug 13 00:31:30.724068 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold
Aug 13 00:31:30.724121 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Aug 13 00:31:30.726295 kernel: pci 0000:00:18.6: PCI bridge to [bus 21]
Aug 13 00:31:30.726367 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff]
Aug 13 00:31:30.726430 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref]
Aug 13 00:31:30.726500 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold
Aug 13 00:31:30.726575 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Aug 13 00:31:30.726641 kernel: pci 0000:00:18.7: PCI bridge to [bus 22]
Aug 13 00:31:30.726708 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff]
Aug 13 00:31:30.726776 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref]
Aug 13 00:31:30.726829 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold
Aug 13 00:31:30.726883 kernel: pci_bus 0000:01: extended config space not accessible
Aug 13 00:31:30.726935 kernel: pci
0000:00:01.0: PCI bridge to [bus 01]
Aug 13 00:31:30.726988 kernel: pci_bus 0000:02: extended config space not accessible
Aug 13 00:31:30.727000 kernel: acpiphp: Slot [32] registered
Aug 13 00:31:30.727007 kernel: acpiphp: Slot [33] registered
Aug 13 00:31:30.727013 kernel: acpiphp: Slot [34] registered
Aug 13 00:31:30.727019 kernel: acpiphp: Slot [35] registered
Aug 13 00:31:30.727025 kernel: acpiphp: Slot [36] registered
Aug 13 00:31:30.727031 kernel: acpiphp: Slot [37] registered
Aug 13 00:31:30.727037 kernel: acpiphp: Slot [38] registered
Aug 13 00:31:30.727044 kernel: acpiphp: Slot [39] registered
Aug 13 00:31:30.727049 kernel: acpiphp: Slot [40] registered
Aug 13 00:31:30.727057 kernel: acpiphp: Slot [41] registered
Aug 13 00:31:30.727063 kernel: acpiphp: Slot [42] registered
Aug 13 00:31:30.727069 kernel: acpiphp: Slot [43] registered
Aug 13 00:31:30.727075 kernel: acpiphp: Slot [44] registered
Aug 13 00:31:30.727081 kernel: acpiphp: Slot [45] registered
Aug 13 00:31:30.727087 kernel: acpiphp: Slot [46] registered
Aug 13 00:31:30.727093 kernel: acpiphp: Slot [47] registered
Aug 13 00:31:30.727099 kernel: acpiphp: Slot [48] registered
Aug 13 00:31:30.727105 kernel: acpiphp: Slot [49] registered
Aug 13 00:31:30.727111 kernel: acpiphp: Slot [50] registered
Aug 13 00:31:30.727118 kernel: acpiphp: Slot [51] registered
Aug 13 00:31:30.727125 kernel: acpiphp: Slot [52] registered
Aug 13 00:31:30.727131 kernel: acpiphp: Slot [53] registered
Aug 13 00:31:30.727137 kernel: acpiphp: Slot [54] registered
Aug 13 00:31:30.727143 kernel: acpiphp: Slot [55] registered
Aug 13 00:31:30.727155 kernel: acpiphp: Slot [56] registered
Aug 13 00:31:30.727161 kernel: acpiphp: Slot [57] registered
Aug 13 00:31:30.727168 kernel: acpiphp: Slot [58] registered
Aug 13 00:31:30.727173 kernel: acpiphp: Slot [59] registered
Aug 13 00:31:30.727181 kernel: acpiphp: Slot [60] registered
Aug 13 00:31:30.727187 kernel: acpiphp: Slot [61] registered
Aug 13 00:31:30.727193 kernel: acpiphp: Slot
[62] registered
Aug 13 00:31:30.727199 kernel: acpiphp: Slot [63] registered
Aug 13 00:31:30.727251 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode)
Aug 13 00:31:30.727302 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode)
Aug 13 00:31:30.727352 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode)
Aug 13 00:31:30.727402 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode)
Aug 13 00:31:30.727453 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode)
Aug 13 00:31:30.727503 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode)
Aug 13 00:31:30.727559 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 PCIe Endpoint
Aug 13 00:31:30.727615 kernel: pci 0000:03:00.0: BAR 0 [io 0x4000-0x4007]
Aug 13 00:31:30.727670 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfd5f8000-0xfd5fffff 64bit]
Aug 13 00:31:30.727722 kernel: pci 0000:03:00.0: ROM [mem 0x00000000-0x0000ffff pref]
Aug 13 00:31:30.727785 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold
Aug 13 00:31:30.727849 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device.
You can enable it with 'pcie_aspm=force'
Aug 13 00:31:30.727904 kernel: pci 0000:00:15.0: PCI bridge to [bus 03]
Aug 13 00:31:30.727956 kernel: pci 0000:00:15.1: PCI bridge to [bus 04]
Aug 13 00:31:30.728008 kernel: pci 0000:00:15.2: PCI bridge to [bus 05]
Aug 13 00:31:30.728060 kernel: pci 0000:00:15.3: PCI bridge to [bus 06]
Aug 13 00:31:30.728111 kernel: pci 0000:00:15.4: PCI bridge to [bus 07]
Aug 13 00:31:30.730289 kernel: pci 0000:00:15.5: PCI bridge to [bus 08]
Aug 13 00:31:30.730416 kernel: pci 0000:00:15.6: PCI bridge to [bus 09]
Aug 13 00:31:30.730490 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a]
Aug 13 00:31:30.730551 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 PCIe Endpoint
Aug 13 00:31:30.730605 kernel: pci 0000:0b:00.0: BAR 0 [mem 0xfd4fc000-0xfd4fcfff]
Aug 13 00:31:30.730657 kernel: pci 0000:0b:00.0: BAR 1 [mem 0xfd4fd000-0xfd4fdfff]
Aug 13 00:31:30.730709 kernel: pci 0000:0b:00.0: BAR 2 [mem 0xfd4fe000-0xfd4fffff]
Aug 13 00:31:30.730760 kernel: pci 0000:0b:00.0: BAR 3 [io 0x5000-0x500f]
Aug 13 00:31:30.730810 kernel: pci 0000:0b:00.0: ROM [mem 0x00000000-0x0000ffff pref]
Aug 13 00:31:30.730864 kernel: pci 0000:0b:00.0: supports D1 D2
Aug 13 00:31:30.730915 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold
Aug 13 00:31:30.730966 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device.
You can enable it with 'pcie_aspm=force'
Aug 13 00:31:30.731017 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b]
Aug 13 00:31:30.731069 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c]
Aug 13 00:31:30.731120 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d]
Aug 13 00:31:30.731354 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e]
Aug 13 00:31:30.731412 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f]
Aug 13 00:31:30.731467 kernel: pci 0000:00:16.5: PCI bridge to [bus 10]
Aug 13 00:31:30.731520 kernel: pci 0000:00:16.6: PCI bridge to [bus 11]
Aug 13 00:31:30.731573 kernel: pci 0000:00:16.7: PCI bridge to [bus 12]
Aug 13 00:31:30.731625 kernel: pci 0000:00:17.0: PCI bridge to [bus 13]
Aug 13 00:31:30.731675 kernel: pci 0000:00:17.1: PCI bridge to [bus 14]
Aug 13 00:31:30.731726 kernel: pci 0000:00:17.2: PCI bridge to [bus 15]
Aug 13 00:31:30.731776 kernel: pci 0000:00:17.3: PCI bridge to [bus 16]
Aug 13 00:31:30.731830 kernel: pci 0000:00:17.4: PCI bridge to [bus 17]
Aug 13 00:31:30.731881 kernel: pci 0000:00:17.5: PCI bridge to [bus 18]
Aug 13 00:31:30.731932 kernel: pci 0000:00:17.6: PCI bridge to [bus 19]
Aug 13 00:31:30.731983 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a]
Aug 13 00:31:30.732033 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b]
Aug 13 00:31:30.732085 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c]
Aug 13 00:31:30.732135 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d]
Aug 13 00:31:30.733564 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e]
Aug 13 00:31:30.733624 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f]
Aug 13 00:31:30.733676 kernel: pci 0000:00:18.5: PCI bridge to [bus 20]
Aug 13 00:31:30.733728 kernel: pci 0000:00:18.6: PCI bridge to [bus 21]
Aug 13 00:31:30.733780 kernel: pci 0000:00:18.7: PCI bridge to [bus 22]
Aug 13 00:31:30.733789 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9
Aug 13 00:31:30.733796 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0
Aug 13 00:31:30.733803 kernel: ACPI: PCI: Interrupt link LNKB
disabled
Aug 13 00:31:30.733811 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Aug 13 00:31:30.733817 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10
Aug 13 00:31:30.733823 kernel: iommu: Default domain type: Translated
Aug 13 00:31:30.733829 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Aug 13 00:31:30.733835 kernel: PCI: Using ACPI for IRQ routing
Aug 13 00:31:30.733842 kernel: PCI: pci_cache_line_size set to 64 bytes
Aug 13 00:31:30.733848 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff]
Aug 13 00:31:30.733854 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff]
Aug 13 00:31:30.733904 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device
Aug 13 00:31:30.733958 kernel: pci 0000:00:0f.0: vgaarb: bridge control possible
Aug 13 00:31:30.734007 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Aug 13 00:31:30.734016 kernel: vgaarb: loaded
Aug 13 00:31:30.734023 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0
Aug 13 00:31:30.734029 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter
Aug 13 00:31:30.734035 kernel: clocksource: Switched to clocksource tsc-early
Aug 13 00:31:30.734042 kernel: VFS: Disk quotas dquot_6.6.0
Aug 13 00:31:30.734048 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Aug 13 00:31:30.734054 kernel: pnp: PnP ACPI init
Aug 13 00:31:30.734109 kernel: system 00:00: [io 0x1000-0x103f] has been reserved
Aug 13 00:31:30.734174 kernel: system 00:00: [io 0x1040-0x104f] has been reserved
Aug 13 00:31:30.734223 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved
Aug 13 00:31:30.734275 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved
Aug 13 00:31:30.734324 kernel: pnp 00:06: [dma 2]
Aug 13 00:31:30.734374 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved
Aug 13 00:31:30.734424 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved
Aug 13
00:31:30.734469 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved
Aug 13 00:31:30.734477 kernel: pnp: PnP ACPI: found 8 devices
Aug 13 00:31:30.734484 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Aug 13 00:31:30.734491 kernel: NET: Registered PF_INET protocol family
Aug 13 00:31:30.734498 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Aug 13 00:31:30.734504 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Aug 13 00:31:30.734510 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Aug 13 00:31:30.734518 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Aug 13 00:31:30.734524 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Aug 13 00:31:30.734531 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Aug 13 00:31:30.734537 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Aug 13 00:31:30.734543 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Aug 13 00:31:30.734549 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Aug 13 00:31:30.734556 kernel: NET: Registered PF_XDP protocol family
Aug 13 00:31:30.734607 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000
Aug 13 00:31:30.734670 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Aug 13 00:31:30.734742 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Aug 13 00:31:30.734797 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Aug 13 00:31:30.734856 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Aug 13 00:31:30.734909 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000
Aug 13 00:31:30.734968
kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000
Aug 13 00:31:30.735063 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000
Aug 13 00:31:30.735123 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000
Aug 13 00:31:30.735188 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000
Aug 13 00:31:30.735242 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000
Aug 13 00:31:30.735296 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000
Aug 13 00:31:30.735350 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000
Aug 13 00:31:30.735403 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000
Aug 13 00:31:30.735456 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000
Aug 13 00:31:30.735509 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000
Aug 13 00:31:30.735561 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000
Aug 13 00:31:30.735617 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000
Aug 13 00:31:30.735669 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000
Aug 13 00:31:30.735722 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000
Aug 13 00:31:30.735774 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000
Aug 13 00:31:30.735827 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000
Aug 13 00:31:30.735879 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000
Aug 13 00:31:30.735932 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref]: assigned
Aug 13 00:31:30.737279 kernel: pci
0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref]: assigned
Aug 13 00:31:30.737344 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space
Aug 13 00:31:30.737400 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign
Aug 13 00:31:30.737452 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space
Aug 13 00:31:30.737504 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign
Aug 13 00:31:30.737555 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space
Aug 13 00:31:30.737606 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign
Aug 13 00:31:30.737657 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: can't assign; no space
Aug 13 00:31:30.737707 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign
Aug 13 00:31:30.737761 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space
Aug 13 00:31:30.737812 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign
Aug 13 00:31:30.737863 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space
Aug 13 00:31:30.737912 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign
Aug 13 00:31:30.737963 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space
Aug 13 00:31:30.738013 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign
Aug 13 00:31:30.738064 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space
Aug 13 00:31:30.738114 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign
Aug 13 00:31:30.739208 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: can't assign; no space
Aug 13 00:31:30.739272 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign
Aug 13 00:31:30.739326 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space
Aug 13 00:31:30.739378 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign Aug 13 00:31:30.739431 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space Aug 13 00:31:30.739481 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign Aug 13 00:31:30.739531 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: can't assign; no space Aug 13 00:31:30.739585 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign Aug 13 00:31:30.739636 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: can't assign; no space Aug 13 00:31:30.739686 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: failed to assign Aug 13 00:31:30.739736 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: can't assign; no space Aug 13 00:31:30.739787 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign Aug 13 00:31:30.739838 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space Aug 13 00:31:30.739889 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign Aug 13 00:31:30.739938 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space Aug 13 00:31:30.739991 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign Aug 13 00:31:30.740042 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space Aug 13 00:31:30.740092 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign Aug 13 00:31:30.740143 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space Aug 13 00:31:30.740207 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign Aug 13 00:31:30.740275 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: can't assign; no space Aug 13 00:31:30.740328 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign Aug 13 00:31:30.740378 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: 
can't assign; no space Aug 13 00:31:30.740431 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign Aug 13 00:31:30.740480 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space Aug 13 00:31:30.740531 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign Aug 13 00:31:30.740581 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space Aug 13 00:31:30.740630 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign Aug 13 00:31:30.740680 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: can't assign; no space Aug 13 00:31:30.740731 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign Aug 13 00:31:30.740780 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: can't assign; no space Aug 13 00:31:30.740829 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign Aug 13 00:31:30.740882 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space Aug 13 00:31:30.740932 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign Aug 13 00:31:30.740981 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space Aug 13 00:31:30.741029 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign Aug 13 00:31:30.741079 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space Aug 13 00:31:30.741128 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign Aug 13 00:31:30.741484 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space Aug 13 00:31:30.741540 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign Aug 13 00:31:30.741593 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: can't assign; no space Aug 13 00:31:30.741649 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign Aug 13 00:31:30.741701 kernel: pci 0000:00:17.5: bridge 
window [io size 0x1000]: can't assign; no space Aug 13 00:31:30.741751 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: failed to assign Aug 13 00:31:30.741801 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: can't assign; no space Aug 13 00:31:30.741851 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign Aug 13 00:31:30.741901 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space Aug 13 00:31:30.741950 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign Aug 13 00:31:30.742000 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space Aug 13 00:31:30.742050 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign Aug 13 00:31:30.742100 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: can't assign; no space Aug 13 00:31:30.742162 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign Aug 13 00:31:30.742214 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space Aug 13 00:31:30.742263 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign Aug 13 00:31:30.742313 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space Aug 13 00:31:30.742363 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign Aug 13 00:31:30.742413 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space Aug 13 00:31:30.742463 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign Aug 13 00:31:30.742515 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space Aug 13 00:31:30.742565 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign Aug 13 00:31:30.742615 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: can't assign; no space Aug 13 00:31:30.742665 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign Aug 13 00:31:30.742716 kernel: pci 
0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space Aug 13 00:31:30.742766 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign Aug 13 00:31:30.742816 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space Aug 13 00:31:30.742865 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign Aug 13 00:31:30.742918 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space Aug 13 00:31:30.742968 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign Aug 13 00:31:30.743019 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Aug 13 00:31:30.743071 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] Aug 13 00:31:30.743121 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Aug 13 00:31:30.745203 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Aug 13 00:31:30.745267 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Aug 13 00:31:30.745326 kernel: pci 0000:03:00.0: ROM [mem 0xfd500000-0xfd50ffff pref]: assigned Aug 13 00:31:30.745383 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Aug 13 00:31:30.745435 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Aug 13 00:31:30.745486 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Aug 13 00:31:30.745537 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref] Aug 13 00:31:30.745589 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Aug 13 00:31:30.745640 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Aug 13 00:31:30.745691 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Aug 13 00:31:30.745741 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Aug 13 00:31:30.745794 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Aug 13 00:31:30.745844 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Aug 13 00:31:30.745898 kernel: pci 0000:00:15.2: bridge window 
[mem 0xfcd00000-0xfcdfffff] Aug 13 00:31:30.745948 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Aug 13 00:31:30.745999 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Aug 13 00:31:30.746050 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Aug 13 00:31:30.746100 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Aug 13 00:31:30.747134 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Aug 13 00:31:30.747223 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Aug 13 00:31:30.747280 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Aug 13 00:31:30.747339 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Aug 13 00:31:30.747392 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Aug 13 00:31:30.747444 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Aug 13 00:31:30.747502 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Aug 13 00:31:30.747554 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Aug 13 00:31:30.747604 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Aug 13 00:31:30.747658 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Aug 13 00:31:30.747711 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Aug 13 00:31:30.747761 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Aug 13 00:31:30.747816 kernel: pci 0000:0b:00.0: ROM [mem 0xfd400000-0xfd40ffff pref]: assigned Aug 13 00:31:30.747870 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Aug 13 00:31:30.747920 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Aug 13 00:31:30.747969 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Aug 13 00:31:30.748019 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref] Aug 13 00:31:30.748070 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Aug 13 00:31:30.748121 
kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Aug 13 00:31:30.749220 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Aug 13 00:31:30.749282 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Aug 13 00:31:30.749338 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Aug 13 00:31:30.749391 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Aug 13 00:31:30.749442 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Aug 13 00:31:30.749494 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Aug 13 00:31:30.749546 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Aug 13 00:31:30.749597 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Aug 13 00:31:30.749647 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Aug 13 00:31:30.749704 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Aug 13 00:31:30.749755 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Aug 13 00:31:30.749805 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Aug 13 00:31:30.749858 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Aug 13 00:31:30.749909 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Aug 13 00:31:30.749959 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Aug 13 00:31:30.750011 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Aug 13 00:31:30.750066 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Aug 13 00:31:30.750116 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Aug 13 00:31:30.750345 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Aug 13 00:31:30.750402 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Aug 13 00:31:30.750454 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Aug 13 00:31:30.750508 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Aug 
13 00:31:30.750559 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Aug 13 00:31:30.750609 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Aug 13 00:31:30.750663 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Aug 13 00:31:30.750716 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Aug 13 00:31:30.750767 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Aug 13 00:31:30.750817 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Aug 13 00:31:30.750868 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Aug 13 00:31:30.750920 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Aug 13 00:31:30.750969 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Aug 13 00:31:30.751019 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Aug 13 00:31:30.751070 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Aug 13 00:31:30.751125 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Aug 13 00:31:30.751201 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Aug 13 00:31:30.751254 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Aug 13 00:31:30.751306 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Aug 13 00:31:30.751355 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Aug 13 00:31:30.751608 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Aug 13 00:31:30.751665 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Aug 13 00:31:30.751717 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Aug 13 00:31:30.751770 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Aug 13 00:31:30.751822 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Aug 13 00:31:30.751872 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Aug 13 00:31:30.751921 kernel: pci 0000:00:17.6: bridge window [mem 
0xe6200000-0xe62fffff 64bit pref] Aug 13 00:31:30.751973 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Aug 13 00:31:30.752023 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Aug 13 00:31:30.752075 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Aug 13 00:31:30.752129 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Aug 13 00:31:30.752190 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Aug 13 00:31:30.752241 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Aug 13 00:31:30.752291 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Aug 13 00:31:30.752342 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Aug 13 00:31:30.752392 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Aug 13 00:31:30.752442 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Aug 13 00:31:30.752490 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Aug 13 00:31:30.752541 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Aug 13 00:31:30.752593 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Aug 13 00:31:30.752642 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Aug 13 00:31:30.752693 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Aug 13 00:31:30.752743 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Aug 13 00:31:30.752792 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Aug 13 00:31:30.752844 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Aug 13 00:31:30.752894 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Aug 13 00:31:30.752943 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Aug 13 00:31:30.752997 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Aug 13 00:31:30.753047 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Aug 13 00:31:30.753096 kernel: pci 0000:00:18.5: 
bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Aug 13 00:31:30.753654 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Aug 13 00:31:30.753715 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Aug 13 00:31:30.753769 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Aug 13 00:31:30.753823 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Aug 13 00:31:30.753878 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Aug 13 00:31:30.753929 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Aug 13 00:31:30.753979 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] Aug 13 00:31:30.754024 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] Aug 13 00:31:30.754067 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] Aug 13 00:31:30.754110 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] Aug 13 00:31:30.754172 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] Aug 13 00:31:30.754226 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] Aug 13 00:31:30.754272 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] Aug 13 00:31:30.754317 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] Aug 13 00:31:30.754363 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] Aug 13 00:31:30.754408 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] Aug 13 00:31:30.754456 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] Aug 13 00:31:30.754502 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] Aug 13 00:31:30.754550 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] Aug 13 00:31:30.754600 kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff] Aug 13 00:31:30.754647 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] Aug 13 00:31:30.754693 kernel: pci_bus 0000:03: resource 2 [mem 
0xc0000000-0xc01fffff 64bit pref] Aug 13 00:31:30.754742 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] Aug 13 00:31:30.754788 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] Aug 13 00:31:30.754833 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] Aug 13 00:31:30.754885 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] Aug 13 00:31:30.754930 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] Aug 13 00:31:30.754975 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] Aug 13 00:31:30.755024 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] Aug 13 00:31:30.755081 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] Aug 13 00:31:30.755164 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] Aug 13 00:31:30.755214 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] Aug 13 00:31:30.755267 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] Aug 13 00:31:30.755314 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] Aug 13 00:31:30.755363 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] Aug 13 00:31:30.755409 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] Aug 13 00:31:30.755553 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] Aug 13 00:31:30.755602 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] Aug 13 00:31:30.755657 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] Aug 13 00:31:30.755703 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] Aug 13 00:31:30.755748 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] Aug 13 00:31:30.755798 kernel: pci_bus 0000:0c: resource 0 [io 0x9000-0x9fff] Aug 13 00:31:30.755856 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] Aug 13 00:31:30.755905 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 
64bit pref] Aug 13 00:31:30.755956 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] Aug 13 00:31:30.756002 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] Aug 13 00:31:30.756046 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] Aug 13 00:31:30.756095 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] Aug 13 00:31:30.756141 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] Aug 13 00:31:30.756235 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] Aug 13 00:31:30.756285 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] Aug 13 00:31:30.756337 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] Aug 13 00:31:30.756383 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] Aug 13 00:31:30.756450 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] Aug 13 00:31:30.756497 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] Aug 13 00:31:30.756546 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] Aug 13 00:31:30.756594 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] Aug 13 00:31:30.756643 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] Aug 13 00:31:30.756689 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] Aug 13 00:31:30.756733 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] Aug 13 00:31:30.756785 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] Aug 13 00:31:30.756830 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] Aug 13 00:31:30.756875 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] Aug 13 00:31:30.756925 kernel: pci_bus 0000:15: resource 0 [io 0xe000-0xefff] Aug 13 00:31:30.756971 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] Aug 13 00:31:30.757016 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] Aug 13 
00:31:30.757065 kernel: pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] Aug 13 00:31:30.757111 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] Aug 13 00:31:30.757168 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] Aug 13 00:31:30.757218 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] Aug 13 00:31:30.757269 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] Aug 13 00:31:30.757315 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] Aug 13 00:31:30.757366 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] Aug 13 00:31:30.757412 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] Aug 13 00:31:30.757461 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] Aug 13 00:31:30.757507 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] Aug 13 00:31:30.757558 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Aug 13 00:31:30.757605 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] Aug 13 00:31:30.757650 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] Aug 13 00:31:30.757700 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] Aug 13 00:31:30.757745 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] Aug 13 00:31:30.757791 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] Aug 13 00:31:30.757845 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] Aug 13 00:31:30.757890 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] Aug 13 00:31:30.757940 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] Aug 13 00:31:30.757985 kernel: pci_bus 0000:1e: resource 2 [mem 0xe6d00000-0xe6dfffff 64bit pref] Aug 13 00:31:30.758034 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] Aug 13 00:31:30.758079 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] 
Aug 13 00:31:30.758128 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] Aug 13 00:31:30.758193 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] Aug 13 00:31:30.758248 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] Aug 13 00:31:30.758294 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] Aug 13 00:31:30.758342 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] Aug 13 00:31:30.758388 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] Aug 13 00:31:30.758441 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Aug 13 00:31:30.758452 kernel: PCI: CLS 32 bytes, default 64 Aug 13 00:31:30.758459 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Aug 13 00:31:30.758466 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Aug 13 00:31:30.758472 kernel: clocksource: Switched to clocksource tsc Aug 13 00:31:30.758478 kernel: Initialise system trusted keyrings Aug 13 00:31:30.758484 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Aug 13 00:31:30.758490 kernel: Key type asymmetric registered Aug 13 00:31:30.758496 kernel: Asymmetric key parser 'x509' registered Aug 13 00:31:30.758502 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Aug 13 00:31:30.758510 kernel: io scheduler mq-deadline registered Aug 13 00:31:30.758516 kernel: io scheduler kyber registered Aug 13 00:31:30.758522 kernel: io scheduler bfq registered Aug 13 00:31:30.758573 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 Aug 13 00:31:30.758625 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Aug 13 00:31:30.758677 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 Aug 13 00:31:30.758736 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 
AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Aug 13 00:31:30.758798 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 Aug 13 00:31:30.758850 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Aug 13 00:31:30.758902 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 Aug 13 00:31:30.758953 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Aug 13 00:31:30.759004 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 Aug 13 00:31:30.759054 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Aug 13 00:31:30.759105 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 Aug 13 00:31:30.759170 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Aug 13 00:31:30.759227 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 Aug 13 00:31:30.759278 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Aug 13 00:31:30.759331 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 Aug 13 00:31:30.759382 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Aug 13 00:31:30.759433 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 Aug 13 00:31:30.759484 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Aug 13 00:31:30.759538 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 Aug 13 00:31:30.759588 kernel: pcieport 
0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Aug 13 00:31:30.759639 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 Aug 13 00:31:30.759689 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Aug 13 00:31:30.759740 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 Aug 13 00:31:30.759790 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Aug 13 00:31:30.759850 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 Aug 13 00:31:30.759903 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Aug 13 00:31:30.759957 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 Aug 13 00:31:30.760008 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Aug 13 00:31:30.760059 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 Aug 13 00:31:30.760109 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Aug 13 00:31:30.760188 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 Aug 13 00:31:30.760244 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Aug 13 00:31:30.760295 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 Aug 13 00:31:30.760348 kernel: pcieport 0000:00:17.0: pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Aug 13 00:31:30.760398 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 Aug 13 
00:31:30.760448 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Aug 13 00:31:30.760498 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42
Aug 13 00:31:30.760548 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Aug 13 00:31:30.760600 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43
Aug 13 00:31:30.760651 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Aug 13 00:31:30.760702 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44
Aug 13 00:31:30.760754 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Aug 13 00:31:30.760805 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45
Aug 13 00:31:30.760856 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Aug 13 00:31:30.760906 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46
Aug 13 00:31:30.760957 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Aug 13 00:31:30.761008 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47
Aug 13 00:31:30.761058 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Aug 13 00:31:30.761111 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48
Aug 13 00:31:30.761189 kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Aug 13 00:31:30.761241 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49
Aug 13 00:31:30.761292 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Aug 13 00:31:30.761343 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50
Aug 13 00:31:30.761393 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Aug 13 00:31:30.761443 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51
Aug 13 00:31:30.761494 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Aug 13 00:31:30.761547 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52
Aug 13 00:31:30.761598 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Aug 13 00:31:30.761648 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53
Aug 13 00:31:30.761697 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Aug 13 00:31:30.761747 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54
Aug 13 00:31:30.761798 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Aug 13 00:31:30.761848 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55
Aug 13 00:31:30.761900 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Aug 13 00:31:30.761910 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Aug 13 00:31:30.761918 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Aug 13 00:31:30.761925 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Aug 13 00:31:30.761931 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12
Aug 13 00:31:30.761938 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Aug 13 00:31:30.761945 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Aug 13 00:31:30.761996 kernel: rtc_cmos 00:01: registered as rtc0
Aug 13 00:31:30.762046 kernel: rtc_cmos 00:01: setting system clock to 2025-08-13T00:31:30 UTC (1755045090)
Aug 13 00:31:30.762056 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Aug 13 00:31:30.762099 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram
Aug 13 00:31:30.762108 kernel: intel_pstate: CPU model not supported
Aug 13 00:31:30.762114 kernel: NET: Registered PF_INET6 protocol family
Aug 13 00:31:30.762121 kernel: Segment Routing with IPv6
Aug 13 00:31:30.762127 kernel: In-situ OAM (IOAM) with IPv6
Aug 13 00:31:30.762134 kernel: NET: Registered PF_PACKET protocol family
Aug 13 00:31:30.762142 kernel: Key type dns_resolver registered
Aug 13 00:31:30.762155 kernel: IPI shorthand broadcast: enabled
Aug 13 00:31:30.762161 kernel: sched_clock: Marking stable (2732003742, 172006864)->(2916246257, -12235651)
Aug 13 00:31:30.762167 kernel: registered taskstats version 1
Aug 13 00:31:30.762174 kernel: Loading compiled-in X.509 certificates
Aug 13 00:31:30.762181 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.40-flatcar: dee0b464d3f7f8d09744a2392f69dde258bc95c0'
Aug 13 00:31:30.762196 kernel: Demotion targets for Node 0: null
Aug 13 00:31:30.762204 kernel: Key type .fscrypt registered
Aug 13 00:31:30.762210 kernel: Key type fscrypt-provisioning registered
Aug 13 00:31:30.762219 kernel: ima: No TPM chip found, activating TPM-bypass!
Aug 13 00:31:30.762225 kernel: ima: Allocated hash algorithm: sha1
Aug 13 00:31:30.762232 kernel: ima: No architecture policies found
Aug 13 00:31:30.762238 kernel: clk: Disabling unused clocks
Aug 13 00:31:30.762245 kernel: Warning: unable to open an initial console.
Aug 13 00:31:30.762251 kernel: Freeing unused kernel image (initmem) memory: 54444K
Aug 13 00:31:30.762258 kernel: Write protecting the kernel read-only data: 24576k
Aug 13 00:31:30.762264 kernel: Freeing unused kernel image (rodata/data gap) memory: 280K
Aug 13 00:31:30.762271 kernel: Run /init as init process
Aug 13 00:31:30.762278 kernel: with arguments:
Aug 13 00:31:30.762285 kernel: /init
Aug 13 00:31:30.762291 kernel: with environment:
Aug 13 00:31:30.762298 kernel: HOME=/
Aug 13 00:31:30.762304 kernel: TERM=linux
Aug 13 00:31:30.762310 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Aug 13 00:31:30.762318 systemd[1]: Successfully made /usr/ read-only.
Aug 13 00:31:30.762326 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Aug 13 00:31:30.762335 systemd[1]: Detected virtualization vmware.
Aug 13 00:31:30.762341 systemd[1]: Detected architecture x86-64.
Aug 13 00:31:30.762347 systemd[1]: Running in initrd.
Aug 13 00:31:30.762354 systemd[1]: No hostname configured, using default hostname.
Aug 13 00:31:30.762361 systemd[1]: Hostname set to .
Aug 13 00:31:30.762368 systemd[1]: Initializing machine ID from random generator.
Aug 13 00:31:30.762374 systemd[1]: Queued start job for default target initrd.target.
Aug 13 00:31:30.762381 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Aug 13 00:31:30.762389 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Aug 13 00:31:30.762396 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Aug 13 00:31:30.762403 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Aug 13 00:31:30.762409 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Aug 13 00:31:30.762416 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Aug 13 00:31:30.762424 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Aug 13 00:31:30.762432 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Aug 13 00:31:30.762438 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Aug 13 00:31:30.762445 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Aug 13 00:31:30.762452 systemd[1]: Reached target paths.target - Path Units.
Aug 13 00:31:30.762459 systemd[1]: Reached target slices.target - Slice Units.
Aug 13 00:31:30.762465 systemd[1]: Reached target swap.target - Swaps.
Aug 13 00:31:30.762472 systemd[1]: Reached target timers.target - Timer Units.
Aug 13 00:31:30.762478 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Aug 13 00:31:30.762485 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Aug 13 00:31:30.762493 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Aug 13 00:31:30.762499 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Aug 13 00:31:30.762506 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Aug 13 00:31:30.762513 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Aug 13 00:31:30.762519 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Aug 13 00:31:30.762526 systemd[1]: Reached target sockets.target - Socket Units.
Aug 13 00:31:30.762532 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Aug 13 00:31:30.762539 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Aug 13 00:31:30.762546 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Aug 13 00:31:30.762554 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Aug 13 00:31:30.762560 systemd[1]: Starting systemd-fsck-usr.service...
Aug 13 00:31:30.762567 systemd[1]: Starting systemd-journald.service - Journal Service...
Aug 13 00:31:30.762574 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Aug 13 00:31:30.762580 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Aug 13 00:31:30.762587 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Aug 13 00:31:30.762595 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Aug 13 00:31:30.762614 systemd-journald[244]: Collecting audit messages is disabled.
Aug 13 00:31:30.762634 systemd[1]: Finished systemd-fsck-usr.service.
Aug 13 00:31:30.762641 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Aug 13 00:31:30.762648 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Aug 13 00:31:30.762655 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Aug 13 00:31:30.762662 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Aug 13 00:31:30.762669 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Aug 13 00:31:30.762675 kernel: Bridge firewalling registered
Aug 13 00:31:30.762682 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Aug 13 00:31:30.762690 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Aug 13 00:31:30.762696 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Aug 13 00:31:30.762703 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Aug 13 00:31:30.762710 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Aug 13 00:31:30.762716 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Aug 13 00:31:30.762723 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Aug 13 00:31:30.762730 systemd-journald[244]: Journal started
Aug 13 00:31:30.762746 systemd-journald[244]: Runtime Journal (/run/log/journal/f9bbf0a6e7e2409b9d3bf3f2833a18f3) is 4.8M, max 38.8M, 34M free.
Aug 13 00:31:30.710292 systemd-modules-load[245]: Inserted module 'overlay'
Aug 13 00:31:30.736575 systemd-modules-load[245]: Inserted module 'br_netfilter'
Aug 13 00:31:30.770630 systemd[1]: Started systemd-journald.service - Journal Service.
Aug 13 00:31:30.772045 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Aug 13 00:31:30.774823 dracut-cmdline[271]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=215bdedb8de38f6b96ec4f9db80853e25015f60454b867e319fdcb9244320a21
Aug 13 00:31:30.781899 systemd-tmpfiles[287]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Aug 13 00:31:30.784683 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Aug 13 00:31:30.785740 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Aug 13 00:31:30.816431 systemd-resolved[317]: Positive Trust Anchors:
Aug 13 00:31:30.816442 systemd-resolved[317]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Aug 13 00:31:30.816465 systemd-resolved[317]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Aug 13 00:31:30.818174 systemd-resolved[317]: Defaulting to hostname 'linux'.
Aug 13 00:31:30.818730 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Aug 13 00:31:30.819079 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Aug 13 00:31:30.831161 kernel: SCSI subsystem initialized
Aug 13 00:31:30.849170 kernel: Loading iSCSI transport class v2.0-870.
Aug 13 00:31:30.858166 kernel: iscsi: registered transport (tcp)
Aug 13 00:31:30.883269 kernel: iscsi: registered transport (qla4xxx)
Aug 13 00:31:30.883316 kernel: QLogic iSCSI HBA Driver
Aug 13 00:31:30.893973 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Aug 13 00:31:30.903955 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Aug 13 00:31:30.905225 systemd[1]: Reached target network-pre.target - Preparation for Network.
Aug 13 00:31:30.926485 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Aug 13 00:31:30.927278 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Aug 13 00:31:30.964191 kernel: raid6: avx2x4 gen() 45999 MB/s
Aug 13 00:31:30.981167 kernel: raid6: avx2x2 gen() 49586 MB/s
Aug 13 00:31:30.998564 kernel: raid6: avx2x1 gen() 34480 MB/s
Aug 13 00:31:30.998613 kernel: raid6: using algorithm avx2x2 gen() 49586 MB/s
Aug 13 00:31:31.016546 kernel: raid6: .... xor() 25891 MB/s, rmw enabled
Aug 13 00:31:31.016607 kernel: raid6: using avx2x2 recovery algorithm
Aug 13 00:31:31.031159 kernel: xor: automatically using best checksumming function avx
Aug 13 00:31:31.141168 kernel: Btrfs loaded, zoned=no, fsverity=no
Aug 13 00:31:31.145868 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Aug 13 00:31:31.146912 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Aug 13 00:31:31.169893 systemd-udevd[492]: Using default interface naming scheme 'v255'.
Aug 13 00:31:31.173276 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Aug 13 00:31:31.174241 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Aug 13 00:31:31.188388 dracut-pre-trigger[498]: rd.md=0: removing MD RAID activation
Aug 13 00:31:31.203209 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Aug 13 00:31:31.203985 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Aug 13 00:31:31.288785 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Aug 13 00:31:31.290507 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Aug 13 00:31:31.380162 kernel: VMware vmxnet3 virtual NIC driver - version 1.9.0.0-k-NAPI
Aug 13 00:31:31.382574 kernel: VMware PVSCSI driver - version 1.0.7.0-k
Aug 13 00:31:31.382595 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2
Aug 13 00:31:31.382703 kernel: vmw_pvscsi: using 64bit dma
Aug 13 00:31:31.385155 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps
Aug 13 00:31:31.390478 kernel: vmw_pvscsi: max_id: 16
Aug 13 00:31:31.390506 kernel: vmw_pvscsi: setting ring_pages to 8
Aug 13 00:31:31.393272 kernel: vmw_pvscsi: enabling reqCallThreshold
Aug 13 00:31:31.393296 kernel: vmw_pvscsi: driver-based request coalescing enabled
Aug 13 00:31:31.393305 kernel: vmw_pvscsi: using MSI-X
Aug 13 00:31:31.397163 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254
Aug 13 00:31:31.401156 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0
Aug 13 00:31:31.403215 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6
Aug 13 00:31:31.411155 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0
Aug 13 00:31:31.416218 (udev-worker)[545]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte.
Aug 13 00:31:31.418173 kernel: cryptd: max_cpu_qlen set to 1000
Aug 13 00:31:31.420749 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Aug 13 00:31:31.421068 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Aug 13 00:31:31.421537 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Aug 13 00:31:31.424218 kernel: libata version 3.00 loaded.
Aug 13 00:31:31.424470 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Aug 13 00:31:31.431186 kernel: AES CTR mode by8 optimization enabled
Aug 13 00:31:31.434254 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB)
Aug 13 00:31:31.434355 kernel: sd 0:0:0:0: [sda] Write Protect is off
Aug 13 00:31:31.434431 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00
Aug 13 00:31:31.434493 kernel: sd 0:0:0:0: [sda] Cache data unavailable
Aug 13 00:31:31.434555 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through
Aug 13 00:31:31.435261 kernel: ata_piix 0000:00:07.1: version 2.13
Aug 13 00:31:31.436154 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input2
Aug 13 00:31:31.438216 kernel: scsi host1: ata_piix
Aug 13 00:31:31.438298 kernel: scsi host2: ata_piix
Aug 13 00:31:31.438360 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14 lpm-pol 0
Aug 13 00:31:31.438369 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15 lpm-pol 0
Aug 13 00:31:31.451365 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Aug 13 00:31:31.451406 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Aug 13 00:31:31.461123 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Aug 13 00:31:31.603172 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33
Aug 13 00:31:31.609169 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5
Aug 13 00:31:31.631180 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray
Aug 13 00:31:31.631302 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Aug 13 00:31:31.639159 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0
Aug 13 00:31:31.653000 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM.
Aug 13 00:31:31.658455 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT.
Aug 13 00:31:31.664022 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM.
Aug 13 00:31:31.668409 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A.
Aug 13 00:31:31.668653 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A.
Aug 13 00:31:31.669329 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Aug 13 00:31:31.713226 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Aug 13 00:31:31.932399 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Aug 13 00:31:31.932768 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Aug 13 00:31:31.932904 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Aug 13 00:31:31.933107 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Aug 13 00:31:31.933796 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Aug 13 00:31:31.946652 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Aug 13 00:31:32.732191 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Aug 13 00:31:32.732287 disk-uuid[647]: The operation has completed successfully.
Aug 13 00:31:32.814015 systemd[1]: disk-uuid.service: Deactivated successfully.
Aug 13 00:31:32.814102 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Aug 13 00:31:32.832291 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Aug 13 00:31:32.846024 sh[677]: Success
Aug 13 00:31:32.870227 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Aug 13 00:31:32.870257 kernel: device-mapper: uevent: version 1.0.3
Aug 13 00:31:32.870271 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Aug 13 00:31:32.878162 kernel: device-mapper: verity: sha256 using shash "sha256-avx2"
Aug 13 00:31:32.921884 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Aug 13 00:31:32.924187 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Aug 13 00:31:32.933116 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Aug 13 00:31:32.975645 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay'
Aug 13 00:31:32.975683 kernel: BTRFS: device fsid 0c0338fb-9434-41c1-99a2-737cbe2351c4 devid 1 transid 44 /dev/mapper/usr (254:0) scanned by mount (689)
Aug 13 00:31:32.980055 kernel: BTRFS info (device dm-0): first mount of filesystem 0c0338fb-9434-41c1-99a2-737cbe2351c4
Aug 13 00:31:32.980082 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Aug 13 00:31:32.980094 kernel: BTRFS info (device dm-0): using free-space-tree
Aug 13 00:31:32.988876 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Aug 13 00:31:32.989251 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Aug 13 00:31:32.989914 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments...
Aug 13 00:31:32.991227 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Aug 13 00:31:33.027236 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (712)
Aug 13 00:31:33.038520 kernel: BTRFS info (device sda6): first mount of filesystem 900bf3f4-cc50-4925-b275-d85854bb916f
Aug 13 00:31:33.038540 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Aug 13 00:31:33.038549 kernel: BTRFS info (device sda6): using free-space-tree
Aug 13 00:31:33.066162 kernel: BTRFS info (device sda6): last unmount of filesystem 900bf3f4-cc50-4925-b275-d85854bb916f
Aug 13 00:31:33.067121 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Aug 13 00:31:33.068310 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Aug 13 00:31:33.102332 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments.
Aug 13 00:31:33.103278 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Aug 13 00:31:33.168362 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Aug 13 00:31:33.173242 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Aug 13 00:31:33.200944 systemd-networkd[863]: lo: Link UP
Aug 13 00:31:33.201206 systemd-networkd[863]: lo: Gained carrier
Aug 13 00:31:33.202059 systemd-networkd[863]: Enumeration completed
Aug 13 00:31:33.202225 systemd[1]: Started systemd-networkd.service - Network Configuration.
Aug 13 00:31:33.202599 systemd[1]: Reached target network.target - Network.
Aug 13 00:31:33.203051 systemd-networkd[863]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network.
Aug 13 00:31:33.205397 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated
Aug 13 00:31:33.205566 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps
Aug 13 00:31:33.206569 systemd-networkd[863]: ens192: Link UP
Aug 13 00:31:33.206574 systemd-networkd[863]: ens192: Gained carrier
Aug 13 00:31:33.215010 ignition[731]: Ignition 2.21.0
Aug 13 00:31:33.215215 ignition[731]: Stage: fetch-offline
Aug 13 00:31:33.215334 ignition[731]: no configs at "/usr/lib/ignition/base.d"
Aug 13 00:31:33.215446 ignition[731]: no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Aug 13 00:31:33.215615 ignition[731]: parsed url from cmdline: ""
Aug 13 00:31:33.215641 ignition[731]: no config URL provided
Aug 13 00:31:33.215734 ignition[731]: reading system config file "/usr/lib/ignition/user.ign"
Aug 13 00:31:33.215855 ignition[731]: no config at "/usr/lib/ignition/user.ign"
Aug 13 00:31:33.216336 ignition[731]: config successfully fetched
Aug 13 00:31:33.216383 ignition[731]: parsing config with SHA512: 6dea355eac80efe70b9380a4eda820a5f9bc81bdd4ed8ca8ac4b710f401b772fe9edb9cb2edf59bc6ec79e8489cc5d3ab8b5562ea20bf4812dc56635ebb3c932
Aug 13 00:31:33.220326 unknown[731]: fetched base config from "system"
Aug 13 00:31:33.220334 unknown[731]: fetched user config from "vmware"
Aug 13 00:31:33.220618 ignition[731]: fetch-offline: fetch-offline passed
Aug 13 00:31:33.220657 ignition[731]: Ignition finished successfully
Aug 13 00:31:33.221639 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Aug 13 00:31:33.221862 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json).
Aug 13 00:31:33.222320 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Aug 13 00:31:33.236539 ignition[873]: Ignition 2.21.0
Aug 13 00:31:33.236751 ignition[873]: Stage: kargs
Aug 13 00:31:33.236841 ignition[873]: no configs at "/usr/lib/ignition/base.d"
Aug 13 00:31:33.236848 ignition[873]: no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Aug 13 00:31:33.238015 ignition[873]: kargs: kargs passed
Aug 13 00:31:33.238062 ignition[873]: Ignition finished successfully
Aug 13 00:31:33.239822 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Aug 13 00:31:33.240837 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Aug 13 00:31:33.260426 ignition[879]: Ignition 2.21.0
Aug 13 00:31:33.260434 ignition[879]: Stage: disks
Aug 13 00:31:33.261609 ignition[879]: no configs at "/usr/lib/ignition/base.d"
Aug 13 00:31:33.261843 ignition[879]: no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Aug 13 00:31:33.262645 ignition[879]: disks: disks passed
Aug 13 00:31:33.262673 ignition[879]: Ignition finished successfully
Aug 13 00:31:33.263832 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Aug 13 00:31:33.264201 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Aug 13 00:31:33.264455 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Aug 13 00:31:33.264732 systemd[1]: Reached target local-fs.target - Local File Systems.
Aug 13 00:31:33.264969 systemd[1]: Reached target sysinit.target - System Initialization.
Aug 13 00:31:33.265230 systemd[1]: Reached target basic.target - Basic System.
Aug 13 00:31:33.265985 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Aug 13 00:31:33.282188 systemd-fsck[887]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
Aug 13 00:31:33.282913 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Aug 13 00:31:33.283825 systemd[1]: Mounting sysroot.mount - /sysroot...
Aug 13 00:31:33.450158 kernel: EXT4-fs (sda9): mounted filesystem 069caac6-7833-4acd-8940-01a7ff7d1281 r/w with ordered data mode. Quota mode: none.
Aug 13 00:31:33.450407 systemd[1]: Mounted sysroot.mount - /sysroot.
Aug 13 00:31:33.450804 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Aug 13 00:31:33.451780 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Aug 13 00:31:33.453186 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Aug 13 00:31:33.453494 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Aug 13 00:31:33.453519 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Aug 13 00:31:33.453533 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Aug 13 00:31:33.470421 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Aug 13 00:31:33.471349 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Aug 13 00:31:33.505161 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (896)
Aug 13 00:31:33.513193 kernel: BTRFS info (device sda6): first mount of filesystem 900bf3f4-cc50-4925-b275-d85854bb916f
Aug 13 00:31:33.513224 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Aug 13 00:31:33.515530 kernel: BTRFS info (device sda6): using free-space-tree
Aug 13 00:31:33.519563 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Aug 13 00:31:33.536868 initrd-setup-root[920]: cut: /sysroot/etc/passwd: No such file or directory
Aug 13 00:31:33.539525 initrd-setup-root[927]: cut: /sysroot/etc/group: No such file or directory
Aug 13 00:31:33.542322 initrd-setup-root[934]: cut: /sysroot/etc/shadow: No such file or directory
Aug 13 00:31:33.544781 initrd-setup-root[941]: cut: /sysroot/etc/gshadow: No such file or directory
Aug 13 00:31:33.656616 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Aug 13 00:31:33.657289 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Aug 13 00:31:33.658238 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Aug 13 00:31:33.667157 kernel: BTRFS info (device sda6): last unmount of filesystem 900bf3f4-cc50-4925-b275-d85854bb916f
Aug 13 00:31:33.683603 ignition[1009]: INFO : Ignition 2.21.0
Aug 13 00:31:33.683603 ignition[1009]: INFO : Stage: mount
Aug 13 00:31:33.684258 ignition[1009]: INFO : no configs at "/usr/lib/ignition/base.d"
Aug 13 00:31:33.684258 ignition[1009]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Aug 13 00:31:33.684498 ignition[1009]: INFO : mount: mount passed
Aug 13 00:31:33.684498 ignition[1009]: INFO : Ignition finished successfully
Aug 13 00:31:33.685264 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Aug 13 00:31:33.686207 systemd[1]: Starting ignition-files.service - Ignition (files)...
Aug 13 00:31:33.720749 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Aug 13 00:31:33.875130 systemd-resolved[317]: Detected conflict on linux IN A 139.178.70.101
Aug 13 00:31:33.875140 systemd-resolved[317]: Hostname conflict, changing published hostname from 'linux' to 'linux4'.
Aug 13 00:31:33.972927 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Aug 13 00:31:33.973862 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Aug 13 00:31:33.995182 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1020)
Aug 13 00:31:33.997654 kernel: BTRFS info (device sda6): first mount of filesystem 900bf3f4-cc50-4925-b275-d85854bb916f
Aug 13 00:31:33.997695 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Aug 13 00:31:33.997708 kernel: BTRFS info (device sda6): using free-space-tree
Aug 13 00:31:34.002301 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Aug 13 00:31:34.019798 ignition[1038]: INFO : Ignition 2.21.0
Aug 13 00:31:34.019798 ignition[1038]: INFO : Stage: files
Aug 13 00:31:34.020196 ignition[1038]: INFO : no configs at "/usr/lib/ignition/base.d"
Aug 13 00:31:34.020196 ignition[1038]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Aug 13 00:31:34.020482 ignition[1038]: DEBUG : files: compiled without relabeling support, skipping
Aug 13 00:31:34.027000 ignition[1038]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Aug 13 00:31:34.027000 ignition[1038]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Aug 13 00:31:34.038230 ignition[1038]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Aug 13 00:31:34.038381 ignition[1038]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Aug 13 00:31:34.038518 ignition[1038]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Aug 13 00:31:34.038454 unknown[1038]: wrote ssh authorized keys file for user: core
Aug 13 00:31:34.047604 ignition[1038]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Aug 13 00:31:34.047808 ignition[1038]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
Aug 13 00:31:34.094340 ignition[1038]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Aug 13 00:31:34.236250 systemd-networkd[863]: ens192: Gained IPv6LL
Aug 13 00:31:34.472998 ignition[1038]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Aug 13 00:31:34.473348 ignition[1038]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Aug 13 00:31:34.473348 ignition[1038]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Aug 13 00:31:34.473348 ignition[1038]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Aug 13 00:31:34.473348 ignition[1038]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Aug 13 00:31:34.473348 ignition[1038]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Aug 13 00:31:34.473348 ignition[1038]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Aug 13 00:31:34.473348 ignition[1038]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Aug 13 00:31:34.474660 ignition[1038]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Aug 13 00:31:34.474660 ignition[1038]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Aug 13 00:31:34.474660 ignition[1038]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Aug 13 00:31:34.474660 ignition[1038]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Aug 13 00:31:34.476789 ignition[1038]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Aug 13 00:31:34.476789 ignition[1038]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Aug 13 00:31:34.477217 ignition[1038]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-x86-64.raw: attempt #1
Aug 13 00:31:35.015637 ignition[1038]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Aug 13 00:31:35.618053 ignition[1038]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Aug 13 00:31:35.618053 ignition[1038]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Aug 13 00:31:35.624269 ignition[1038]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Aug 13 00:31:35.624269 ignition[1038]: INFO : files: op(c): [started] processing unit "prepare-helm.service"
Aug 13 00:31:35.627630 ignition[1038]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Aug 13 00:31:35.628808 ignition[1038]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Aug 13 00:31:35.628808 ignition[1038]: INFO : files: op(c): [finished] processing unit "prepare-helm.service"
Aug 13 00:31:35.628808 ignition[1038]: INFO : files: op(e): [started] processing unit "coreos-metadata.service"
Aug 13 00:31:35.629265 ignition[1038]:
INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Aug 13 00:31:35.629265 ignition[1038]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Aug 13 00:31:35.629265 ignition[1038]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service" Aug 13 00:31:35.629265 ignition[1038]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service" Aug 13 00:31:35.674790 ignition[1038]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service" Aug 13 00:31:35.677810 ignition[1038]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service" Aug 13 00:31:35.678117 ignition[1038]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service" Aug 13 00:31:35.678117 ignition[1038]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service" Aug 13 00:31:35.678117 ignition[1038]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service" Aug 13 00:31:35.678117 ignition[1038]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json" Aug 13 00:31:35.679797 ignition[1038]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json" Aug 13 00:31:35.679797 ignition[1038]: INFO : files: files passed Aug 13 00:31:35.679797 ignition[1038]: INFO : Ignition finished successfully Aug 13 00:31:35.679333 systemd[1]: Finished ignition-files.service - Ignition (files). Aug 13 00:31:35.681238 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Aug 13 00:31:35.682256 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... 
Aug 13 00:31:35.688605 systemd[1]: ignition-quench.service: Deactivated successfully.
Aug 13 00:31:35.688670 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Aug 13 00:31:35.691883 initrd-setup-root-after-ignition[1068]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Aug 13 00:31:35.691883 initrd-setup-root-after-ignition[1068]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Aug 13 00:31:35.692979 initrd-setup-root-after-ignition[1072]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Aug 13 00:31:35.693715 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Aug 13 00:31:35.693918 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Aug 13 00:31:35.694474 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Aug 13 00:31:35.716419 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Aug 13 00:31:35.716495 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Aug 13 00:31:35.716759 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Aug 13 00:31:35.717008 systemd[1]: Reached target initrd.target - Initrd Default Target.
Aug 13 00:31:35.717214 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Aug 13 00:31:35.717656 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Aug 13 00:31:35.725856 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Aug 13 00:31:35.726772 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Aug 13 00:31:35.736801 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Aug 13 00:31:35.736982 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Aug 13 00:31:35.737268 systemd[1]: Stopped target timers.target - Timer Units.
Aug 13 00:31:35.737465 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Aug 13 00:31:35.737534 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Aug 13 00:31:35.737897 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Aug 13 00:31:35.738056 systemd[1]: Stopped target basic.target - Basic System.
Aug 13 00:31:35.738264 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Aug 13 00:31:35.738455 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Aug 13 00:31:35.738659 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Aug 13 00:31:35.738864 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Aug 13 00:31:35.739068 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Aug 13 00:31:35.739271 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Aug 13 00:31:35.739486 systemd[1]: Stopped target sysinit.target - System Initialization.
Aug 13 00:31:35.739695 systemd[1]: Stopped target local-fs.target - Local File Systems.
Aug 13 00:31:35.739882 systemd[1]: Stopped target swap.target - Swaps.
Aug 13 00:31:35.740044 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Aug 13 00:31:35.740109 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Aug 13 00:31:35.740387 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Aug 13 00:31:35.740624 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Aug 13 00:31:35.740808 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Aug 13 00:31:35.740856 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Aug 13 00:31:35.741009 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Aug 13 00:31:35.741069 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Aug 13 00:31:35.741343 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Aug 13 00:31:35.741405 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Aug 13 00:31:35.741627 systemd[1]: Stopped target paths.target - Path Units.
Aug 13 00:31:35.741761 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Aug 13 00:31:35.745169 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Aug 13 00:31:35.745345 systemd[1]: Stopped target slices.target - Slice Units.
Aug 13 00:31:35.745558 systemd[1]: Stopped target sockets.target - Socket Units.
Aug 13 00:31:35.745753 systemd[1]: iscsid.socket: Deactivated successfully.
Aug 13 00:31:35.745819 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Aug 13 00:31:35.746046 systemd[1]: iscsiuio.socket: Deactivated successfully.
Aug 13 00:31:35.746104 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Aug 13 00:31:35.746356 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Aug 13 00:31:35.746444 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Aug 13 00:31:35.746681 systemd[1]: ignition-files.service: Deactivated successfully.
Aug 13 00:31:35.746757 systemd[1]: Stopped ignition-files.service - Ignition (files).
Aug 13 00:31:35.747450 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Aug 13 00:31:35.749206 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Aug 13 00:31:35.749371 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Aug 13 00:31:35.749463 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Aug 13 00:31:35.749725 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Aug 13 00:31:35.749802 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Aug 13 00:31:35.751981 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Aug 13 00:31:35.758191 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Aug 13 00:31:35.765559 ignition[1092]: INFO : Ignition 2.21.0
Aug 13 00:31:35.765559 ignition[1092]: INFO : Stage: umount
Aug 13 00:31:35.765939 ignition[1092]: INFO : no configs at "/usr/lib/ignition/base.d"
Aug 13 00:31:35.765939 ignition[1092]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Aug 13 00:31:35.766227 ignition[1092]: INFO : umount: umount passed
Aug 13 00:31:35.766788 ignition[1092]: INFO : Ignition finished successfully
Aug 13 00:31:35.767135 systemd[1]: ignition-mount.service: Deactivated successfully.
Aug 13 00:31:35.767295 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Aug 13 00:31:35.767507 systemd[1]: Stopped target network.target - Network.
Aug 13 00:31:35.767627 systemd[1]: ignition-disks.service: Deactivated successfully.
Aug 13 00:31:35.767655 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Aug 13 00:31:35.767865 systemd[1]: ignition-kargs.service: Deactivated successfully.
Aug 13 00:31:35.767888 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Aug 13 00:31:35.768280 systemd[1]: ignition-setup.service: Deactivated successfully.
Aug 13 00:31:35.768302 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Aug 13 00:31:35.768455 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Aug 13 00:31:35.768476 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Aug 13 00:31:35.768682 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Aug 13 00:31:35.768827 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Aug 13 00:31:35.775294 systemd[1]: systemd-resolved.service: Deactivated successfully.
Aug 13 00:31:35.775358 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Aug 13 00:31:35.776805 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Aug 13 00:31:35.776952 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Aug 13 00:31:35.776975 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Aug 13 00:31:35.777852 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Aug 13 00:31:35.778644 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Aug 13 00:31:35.779569 systemd[1]: systemd-networkd.service: Deactivated successfully.
Aug 13 00:31:35.779630 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Aug 13 00:31:35.780482 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Aug 13 00:31:35.780606 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Aug 13 00:31:35.780744 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Aug 13 00:31:35.780762 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Aug 13 00:31:35.781373 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Aug 13 00:31:35.781474 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Aug 13 00:31:35.781500 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Aug 13 00:31:35.781631 systemd[1]: afterburn-network-kargs.service: Deactivated successfully.
Aug 13 00:31:35.781653 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments.
Aug 13 00:31:35.781783 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Aug 13 00:31:35.781804 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Aug 13 00:31:35.782993 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Aug 13 00:31:35.783019 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Aug 13 00:31:35.783547 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Aug 13 00:31:35.784793 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Aug 13 00:31:35.794438 systemd[1]: network-cleanup.service: Deactivated successfully.
Aug 13 00:31:35.794518 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Aug 13 00:31:35.795401 systemd[1]: systemd-udevd.service: Deactivated successfully.
Aug 13 00:31:35.795489 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Aug 13 00:31:35.795838 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Aug 13 00:31:35.795872 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Aug 13 00:31:35.795981 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Aug 13 00:31:35.795999 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Aug 13 00:31:35.796170 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Aug 13 00:31:35.796194 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Aug 13 00:31:35.796468 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Aug 13 00:31:35.796492 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Aug 13 00:31:35.796835 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Aug 13 00:31:35.796858 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Aug 13 00:31:35.798281 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Aug 13 00:31:35.798389 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Aug 13 00:31:35.798421 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Aug 13 00:31:35.799007 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Aug 13 00:31:35.799033 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Aug 13 00:31:35.799482 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Aug 13 00:31:35.799507 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Aug 13 00:31:35.799964 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Aug 13 00:31:35.799989 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Aug 13 00:31:35.800384 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Aug 13 00:31:35.800519 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Aug 13 00:31:35.809564 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Aug 13 00:31:35.809640 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Aug 13 00:31:35.889523 systemd[1]: sysroot-boot.service: Deactivated successfully.
Aug 13 00:31:35.889598 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Aug 13 00:31:35.889884 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Aug 13 00:31:35.890005 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Aug 13 00:31:35.890030 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Aug 13 00:31:35.890637 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Aug 13 00:31:35.899904 systemd[1]: Switching root.
Aug 13 00:31:35.944484 systemd-journald[244]: Journal stopped
Aug 13 00:31:37.791678 systemd-journald[244]: Received SIGTERM from PID 1 (systemd).
Aug 13 00:31:37.791699 kernel: SELinux: policy capability network_peer_controls=1
Aug 13 00:31:37.791707 kernel: SELinux: policy capability open_perms=1
Aug 13 00:31:37.791713 kernel: SELinux: policy capability extended_socket_class=1
Aug 13 00:31:37.791718 kernel: SELinux: policy capability always_check_network=0
Aug 13 00:31:37.791725 kernel: SELinux: policy capability cgroup_seclabel=1
Aug 13 00:31:37.791731 kernel: SELinux: policy capability nnp_nosuid_transition=1
Aug 13 00:31:37.791737 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Aug 13 00:31:37.791744 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Aug 13 00:31:37.791749 kernel: SELinux: policy capability userspace_initial_context=0
Aug 13 00:31:37.791755 kernel: audit: type=1403 audit(1755045097.209:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Aug 13 00:31:37.791761 systemd[1]: Successfully loaded SELinux policy in 85.321ms.
Aug 13 00:31:37.791770 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 7.461ms.
Aug 13 00:31:37.791777 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Aug 13 00:31:37.791784 systemd[1]: Detected virtualization vmware.
Aug 13 00:31:37.791791 systemd[1]: Detected architecture x86-64.
Aug 13 00:31:37.791798 systemd[1]: Detected first boot.
Aug 13 00:31:37.791805 systemd[1]: Initializing machine ID from random generator.
Aug 13 00:31:37.791811 zram_generator::config[1135]: No configuration found.
Aug 13 00:31:37.795471 kernel: vmw_vmci 0000:00:07.7: Using capabilities 0xc
Aug 13 00:31:37.795494 kernel: Guest personality initialized and is active
Aug 13 00:31:37.795502 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
Aug 13 00:31:37.795508 kernel: Initialized host personality
Aug 13 00:31:37.795517 kernel: NET: Registered PF_VSOCK protocol family
Aug 13 00:31:37.795525 systemd[1]: Populated /etc with preset unit settings.
Aug 13 00:31:37.795534 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Aug 13 00:31:37.795541 systemd[1]: COREOS_CUSTOM_PUBLIC_IPV4=$(ip addr show ens192 | grep -v "inet 10." | grep -Po "inet \K[\d.]+")" > ${OUTPUT}"
Aug 13 00:31:37.795548 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Aug 13 00:31:37.795555 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Aug 13 00:31:37.795561 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Aug 13 00:31:37.795569 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Aug 13 00:31:37.795576 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Aug 13 00:31:37.795583 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Aug 13 00:31:37.795590 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Aug 13 00:31:37.795597 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Aug 13 00:31:37.795603 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Aug 13 00:31:37.795610 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Aug 13 00:31:37.795618 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Aug 13 00:31:37.795625 systemd[1]: Created slice user.slice - User and Session Slice.
Aug 13 00:31:37.795631 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Aug 13 00:31:37.795640 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Aug 13 00:31:37.795647 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Aug 13 00:31:37.795654 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Aug 13 00:31:37.795661 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Aug 13 00:31:37.795668 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Aug 13 00:31:37.795677 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Aug 13 00:31:37.795684 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Aug 13 00:31:37.795690 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Aug 13 00:31:37.795698 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Aug 13 00:31:37.795705 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Aug 13 00:31:37.795712 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Aug 13 00:31:37.795718 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Aug 13 00:31:37.795725 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Aug 13 00:31:37.795733 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Aug 13 00:31:37.795740 systemd[1]: Reached target slices.target - Slice Units.
Aug 13 00:31:37.795747 systemd[1]: Reached target swap.target - Swaps.
Aug 13 00:31:37.795754 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Aug 13 00:31:37.795761 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Aug 13 00:31:37.795769 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Aug 13 00:31:37.795776 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Aug 13 00:31:37.795783 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Aug 13 00:31:37.795792 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Aug 13 00:31:37.795802 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Aug 13 00:31:37.795812 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Aug 13 00:31:37.795823 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Aug 13 00:31:37.795831 systemd[1]: Mounting media.mount - External Media Directory...
Aug 13 00:31:37.795840 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Aug 13 00:31:37.795847 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Aug 13 00:31:37.795854 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Aug 13 00:31:37.795861 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Aug 13 00:31:37.795868 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Aug 13 00:31:37.795875 systemd[1]: Reached target machines.target - Containers.
Aug 13 00:31:37.795882 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Aug 13 00:31:37.795889 systemd[1]: Starting ignition-delete-config.service - Ignition (delete config)...
Aug 13 00:31:37.795897 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Aug 13 00:31:37.795904 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Aug 13 00:31:37.795911 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Aug 13 00:31:37.795918 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Aug 13 00:31:37.795924 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Aug 13 00:31:37.795931 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Aug 13 00:31:37.795938 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Aug 13 00:31:37.795945 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Aug 13 00:31:37.795953 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Aug 13 00:31:37.795961 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Aug 13 00:31:37.795968 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Aug 13 00:31:37.795975 systemd[1]: Stopped systemd-fsck-usr.service.
Aug 13 00:31:37.795983 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Aug 13 00:31:37.795990 systemd[1]: Starting systemd-journald.service - Journal Service...
Aug 13 00:31:37.795997 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Aug 13 00:31:37.796004 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Aug 13 00:31:37.796011 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Aug 13 00:31:37.796019 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Aug 13 00:31:37.796026 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Aug 13 00:31:37.796033 systemd[1]: verity-setup.service: Deactivated successfully.
Aug 13 00:31:37.796040 systemd[1]: Stopped verity-setup.service.
Aug 13 00:31:37.796047 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Aug 13 00:31:37.796054 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Aug 13 00:31:37.796061 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Aug 13 00:31:37.796069 systemd[1]: Mounted media.mount - External Media Directory.
Aug 13 00:31:37.796077 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Aug 13 00:31:37.796085 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Aug 13 00:31:37.796091 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Aug 13 00:31:37.796100 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Aug 13 00:31:37.796109 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Aug 13 00:31:37.796116 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Aug 13 00:31:37.796123 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Aug 13 00:31:37.796130 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Aug 13 00:31:37.796137 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Aug 13 00:31:37.796172 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Aug 13 00:31:37.796186 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Aug 13 00:31:37.796194 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Aug 13 00:31:37.796202 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Aug 13 00:31:37.796209 systemd[1]: Reached target network-pre.target - Preparation for Network.
Aug 13 00:31:37.796215 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Aug 13 00:31:37.796222 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Aug 13 00:31:37.796229 systemd[1]: Reached target local-fs.target - Local File Systems.
Aug 13 00:31:37.796239 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Aug 13 00:31:37.796246 kernel: loop: module loaded
Aug 13 00:31:37.796256 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Aug 13 00:31:37.796264 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Aug 13 00:31:37.796271 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Aug 13 00:31:37.796281 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Aug 13 00:31:37.796318 systemd-journald[1228]: Collecting audit messages is disabled.
Aug 13 00:31:37.796338 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Aug 13 00:31:37.796347 systemd-journald[1228]: Journal started
Aug 13 00:31:37.796367 systemd-journald[1228]: Runtime Journal (/run/log/journal/558c020bf14a41599406ba5c73fe1564) is 4.8M, max 38.8M, 34M free.
Aug 13 00:31:37.581452 systemd[1]: Queued start job for default target multi-user.target.
Aug 13 00:31:37.594504 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Aug 13 00:31:37.594769 systemd[1]: systemd-journald.service: Deactivated successfully.
Aug 13 00:31:37.796906 jq[1205]: true
Aug 13 00:31:37.811195 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables.
Aug 13 00:31:37.811245 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Aug 13 00:31:37.811256 kernel: fuse: init (API version 7.41)
Aug 13 00:31:37.811273 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Aug 13 00:31:37.811284 systemd[1]: Started systemd-journald.service - Journal Service.
Aug 13 00:31:37.811781 jq[1247]: true
Aug 13 00:31:37.812041 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Aug 13 00:31:37.812568 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Aug 13 00:31:37.812682 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Aug 13 00:31:37.812911 systemd[1]: modprobe@loop.service: Deactivated successfully.
Aug 13 00:31:37.813008 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Aug 13 00:31:37.814362 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Aug 13 00:31:37.819370 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Aug 13 00:31:37.826252 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Aug 13 00:31:37.835743 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Aug 13 00:31:37.838396 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Aug 13 00:31:37.840628 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Aug 13 00:31:37.851301 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Aug 13 00:31:37.851479 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Aug 13 00:31:37.853773 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Aug 13 00:31:37.878623 kernel: loop0: detected capacity change from 0 to 221472
Aug 13 00:31:37.896748 kernel: ACPI: bus type drm_connector registered
Aug 13 00:31:37.896489 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Aug 13 00:31:37.898622 systemd[1]: modprobe@drm.service: Deactivated successfully.
Aug 13 00:31:37.898751 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Aug 13 00:31:37.900137 systemd-journald[1228]: Time spent on flushing to /var/log/journal/558c020bf14a41599406ba5c73fe1564 is 50.897ms for 1763 entries.
Aug 13 00:31:37.900137 systemd-journald[1228]: System Journal (/var/log/journal/558c020bf14a41599406ba5c73fe1564) is 8M, max 584.8M, 576.8M free.
Aug 13 00:31:37.972451 systemd-journald[1228]: Received client request to flush runtime journal.
Aug 13 00:31:37.922073 systemd-tmpfiles[1258]: ACLs are not supported, ignoring.
Aug 13 00:31:37.914326 ignition[1259]: Ignition 2.21.0
Aug 13 00:31:37.922083 systemd-tmpfiles[1258]: ACLs are not supported, ignoring.
Aug 13 00:31:37.914787 ignition[1259]: deleting config from guestinfo properties
Aug 13 00:31:37.925791 systemd[1]: Finished ignition-delete-config.service - Ignition (delete config).
Aug 13 00:31:37.924418 ignition[1259]: Successfully deleted config
Aug 13 00:31:37.926781 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Aug 13 00:31:37.935402 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Aug 13 00:31:37.940552 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Aug 13 00:31:37.958901 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Aug 13 00:31:37.974369 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Aug 13 00:31:37.979164 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Aug 13 00:31:38.017176 kernel: loop1: detected capacity change from 0 to 146240
Aug 13 00:31:38.017917 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Aug 13 00:31:38.021229 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Aug 13 00:31:38.033463 systemd-tmpfiles[1306]: ACLs are not supported, ignoring.
Aug 13 00:31:38.033477 systemd-tmpfiles[1306]: ACLs are not supported, ignoring.
Aug 13 00:31:38.036700 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Aug 13 00:31:38.065173 kernel: loop2: detected capacity change from 0 to 113872
Aug 13 00:31:38.133162 kernel: loop3: detected capacity change from 0 to 2960
Aug 13 00:31:38.167171 kernel: loop4: detected capacity change from 0 to 221472
Aug 13 00:31:38.356164 kernel: loop5: detected capacity change from 0 to 146240
Aug 13 00:31:38.398166 kernel: loop6: detected capacity change from 0 to 113872
Aug 13 00:31:38.426164 kernel: loop7: detected capacity change from 0 to 2960
Aug 13 00:31:38.447974 (sd-merge)[1312]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-vmware'.
Aug 13 00:31:38.448473 (sd-merge)[1312]: Merged extensions into '/usr'.
Aug 13 00:31:38.452210 systemd[1]: Reload requested from client PID 1257 ('systemd-sysext') (unit systemd-sysext.service)...
Aug 13 00:31:38.452219 systemd[1]: Reloading...
Aug 13 00:31:38.511205 zram_generator::config[1341]: No configuration found.
Aug 13 00:31:38.583351 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Aug 13 00:31:38.592808 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Aug 13 00:31:38.637815 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Aug 13 00:31:38.638203 systemd[1]: Reloading finished in 185 ms. Aug 13 00:31:38.654130 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Aug 13 00:31:38.658057 systemd[1]: Starting ensure-sysext.service... Aug 13 00:31:38.660949 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Aug 13 00:31:38.682546 systemd[1]: Reload requested from client PID 1393 ('systemctl') (unit ensure-sysext.service)... Aug 13 00:31:38.682560 systemd[1]: Reloading... Aug 13 00:31:38.702636 systemd-tmpfiles[1394]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Aug 13 00:31:38.702662 systemd-tmpfiles[1394]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Aug 13 00:31:38.702852 systemd-tmpfiles[1394]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Aug 13 00:31:38.703009 systemd-tmpfiles[1394]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Aug 13 00:31:38.703523 systemd-tmpfiles[1394]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Aug 13 00:31:38.703688 systemd-tmpfiles[1394]: ACLs are not supported, ignoring. Aug 13 00:31:38.703725 systemd-tmpfiles[1394]: ACLs are not supported, ignoring. Aug 13 00:31:38.715820 systemd-tmpfiles[1394]: Detected autofs mount point /boot during canonicalization of boot. Aug 13 00:31:38.715828 systemd-tmpfiles[1394]: Skipping /boot Aug 13 00:31:38.734312 systemd-tmpfiles[1394]: Detected autofs mount point /boot during canonicalization of boot. 
Aug 13 00:31:38.734319 systemd-tmpfiles[1394]: Skipping /boot Aug 13 00:31:38.740172 zram_generator::config[1422]: No configuration found. Aug 13 00:31:38.824805 ldconfig[1253]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Aug 13 00:31:38.833372 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Aug 13 00:31:38.841311 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Aug 13 00:31:38.885981 systemd[1]: Reloading finished in 203 ms. Aug 13 00:31:38.905079 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Aug 13 00:31:38.905414 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Aug 13 00:31:38.908248 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Aug 13 00:31:38.914222 systemd[1]: Starting audit-rules.service - Load Audit Rules... Aug 13 00:31:38.915795 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Aug 13 00:31:38.917362 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Aug 13 00:31:38.924300 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Aug 13 00:31:38.927285 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Aug 13 00:31:38.928398 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Aug 13 00:31:38.932434 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Aug 13 00:31:38.934601 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... 
Aug 13 00:31:38.937074 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Aug 13 00:31:38.947777 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Aug 13 00:31:38.948022 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Aug 13 00:31:38.948134 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Aug 13 00:31:38.948296 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Aug 13 00:31:38.948905 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Aug 13 00:31:38.949650 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Aug 13 00:31:38.955525 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Aug 13 00:31:38.962354 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Aug 13 00:31:38.963730 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Aug 13 00:31:38.965404 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Aug 13 00:31:38.968702 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Aug 13 00:31:38.969108 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Aug 13 00:31:38.969206 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). 
Aug 13 00:31:38.971579 systemd[1]: Starting systemd-update-done.service - Update is Completed... Aug 13 00:31:38.974501 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Aug 13 00:31:38.974679 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Aug 13 00:31:38.975492 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Aug 13 00:31:38.976121 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Aug 13 00:31:38.976549 systemd[1]: modprobe@loop.service: Deactivated successfully. Aug 13 00:31:38.976761 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Aug 13 00:31:38.978381 systemd[1]: Finished ensure-sysext.service. Aug 13 00:31:38.980550 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Aug 13 00:31:38.987021 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Aug 13 00:31:38.991112 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Aug 13 00:31:38.991285 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Aug 13 00:31:38.991838 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Aug 13 00:31:38.994350 systemd-udevd[1485]: Using default interface naming scheme 'v255'. Aug 13 00:31:38.994458 systemd[1]: modprobe@drm.service: Deactivated successfully. Aug 13 00:31:38.994752 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Aug 13 00:31:39.001159 systemd[1]: Finished systemd-update-done.service - Update is Completed. Aug 13 00:31:39.008117 augenrules[1524]: No rules Aug 13 00:31:39.009456 systemd[1]: audit-rules.service: Deactivated successfully. 
Aug 13 00:31:39.009708 systemd[1]: Finished audit-rules.service - Load Audit Rules. Aug 13 00:31:39.015395 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Aug 13 00:31:39.016120 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Aug 13 00:31:39.022002 systemd[1]: Started systemd-userdbd.service - User Database Manager. Aug 13 00:31:39.027767 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Aug 13 00:31:39.031328 systemd[1]: Starting systemd-networkd.service - Network Configuration... Aug 13 00:31:39.125094 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Aug 13 00:31:39.125283 systemd[1]: Reached target time-set.target - System Time Set. Aug 13 00:31:39.158673 systemd-networkd[1540]: lo: Link UP Aug 13 00:31:39.159167 systemd-networkd[1540]: lo: Gained carrier Aug 13 00:31:39.161077 systemd-networkd[1540]: Enumeration completed Aug 13 00:31:39.161251 systemd[1]: Started systemd-networkd.service - Network Configuration. Aug 13 00:31:39.162260 systemd-networkd[1540]: ens192: Configuring with /etc/systemd/network/00-vmware.network. Aug 13 00:31:39.162364 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Aug 13 00:31:39.164105 systemd-resolved[1484]: Positive Trust Anchors: Aug 13 00:31:39.165919 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Aug 13 00:31:39.166067 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Aug 13 00:31:39.164885 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Aug 13 00:31:39.165329 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. 
Aug 13 00:31:39.166258 systemd-resolved[1484]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Aug 13 00:31:39.166315 systemd-resolved[1484]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Aug 13 00:31:39.168040 systemd-networkd[1540]: ens192: Link UP Aug 13 00:31:39.168245 systemd-networkd[1540]: ens192: Gained carrier Aug 13 00:31:39.169986 systemd-resolved[1484]: Defaulting to hostname 'linux'. Aug 13 00:31:39.171258 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Aug 13 00:31:39.171433 systemd[1]: Reached target network.target - Network. Aug 13 00:31:39.171529 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Aug 13 00:31:39.171646 systemd[1]: Reached target sysinit.target - System Initialization. Aug 13 00:31:39.171800 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Aug 13 00:31:39.171930 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Aug 13 00:31:39.172045 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Aug 13 00:31:39.172241 systemd[1]: Started logrotate.timer - Daily rotation of log files. Aug 13 00:31:39.172386 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Aug 13 00:31:39.172499 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. 
Aug 13 00:31:39.172619 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Aug 13 00:31:39.172636 systemd[1]: Reached target paths.target - Path Units. Aug 13 00:31:39.172728 systemd[1]: Reached target timers.target - Timer Units. Aug 13 00:31:39.174609 systemd-timesyncd[1515]: Network configuration changed, trying to establish connection. Aug 13 00:31:39.174982 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Aug 13 00:31:39.177374 systemd-timesyncd[1515]: Network configuration changed, trying to establish connection. Aug 13 00:31:39.178918 systemd[1]: Starting docker.socket - Docker Socket for the API... Aug 13 00:31:39.182310 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Aug 13 00:31:39.182536 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Aug 13 00:31:39.182655 systemd[1]: Reached target ssh-access.target - SSH Access Available. Aug 13 00:31:39.184390 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Aug 13 00:31:39.185404 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Aug 13 00:31:39.185978 systemd[1]: Listening on docker.socket - Docker Socket for the API. Aug 13 00:31:39.187823 systemd[1]: Reached target sockets.target - Socket Units. Aug 13 00:31:39.189194 systemd[1]: Reached target basic.target - Basic System. Aug 13 00:31:39.189339 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Aug 13 00:31:39.189355 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Aug 13 00:31:39.192994 systemd[1]: Starting containerd.service - containerd container runtime... Aug 13 00:31:39.195960 systemd[1]: Starting dbus.service - D-Bus System Message Bus... 
Aug 13 00:31:39.199614 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Aug 13 00:31:39.203435 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Aug 13 00:31:39.205314 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Aug 13 00:31:39.205450 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Aug 13 00:31:39.209649 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Aug 13 00:31:39.214090 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Aug 13 00:31:39.218889 jq[1582]: false Aug 13 00:31:39.218296 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Aug 13 00:31:39.220928 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Aug 13 00:31:39.223749 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Aug 13 00:31:39.226540 extend-filesystems[1583]: Found /dev/sda6 Aug 13 00:31:39.229561 systemd[1]: Starting systemd-logind.service - User Login Management... Aug 13 00:31:39.230212 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Aug 13 00:31:39.230743 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Aug 13 00:31:39.231435 extend-filesystems[1583]: Found /dev/sda9 Aug 13 00:31:39.235673 extend-filesystems[1583]: Checking size of /dev/sda9 Aug 13 00:31:39.236432 systemd[1]: Starting update-engine.service - Update Engine... 
Aug 13 00:31:39.241506 google_oslogin_nss_cache[1584]: oslogin_cache_refresh[1584]: Refreshing passwd entry cache Aug 13 00:31:39.241057 oslogin_cache_refresh[1584]: Refreshing passwd entry cache Aug 13 00:31:39.242718 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Aug 13 00:31:39.247519 extend-filesystems[1583]: Old size kept for /dev/sda9 Aug 13 00:31:39.249138 google_oslogin_nss_cache[1584]: oslogin_cache_refresh[1584]: Failure getting users, quitting Aug 13 00:31:39.249138 google_oslogin_nss_cache[1584]: oslogin_cache_refresh[1584]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Aug 13 00:31:39.249138 google_oslogin_nss_cache[1584]: oslogin_cache_refresh[1584]: Refreshing group entry cache Aug 13 00:31:39.249000 oslogin_cache_refresh[1584]: Failure getting users, quitting Aug 13 00:31:39.249013 oslogin_cache_refresh[1584]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Aug 13 00:31:39.249043 oslogin_cache_refresh[1584]: Refreshing group entry cache Aug 13 00:31:39.250278 systemd[1]: Starting vgauthd.service - VGAuth Service for open-vm-tools... Aug 13 00:31:39.252176 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Aug 13 00:31:39.252991 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Aug 13 00:31:39.253258 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Aug 13 00:31:39.253403 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Aug 13 00:31:39.253551 systemd[1]: extend-filesystems.service: Deactivated successfully. Aug 13 00:31:39.253664 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Aug 13 00:31:39.254696 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. 
Aug 13 00:31:39.254820 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Aug 13 00:31:39.263700 google_oslogin_nss_cache[1584]: oslogin_cache_refresh[1584]: Failure getting groups, quitting Aug 13 00:31:39.263700 google_oslogin_nss_cache[1584]: oslogin_cache_refresh[1584]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Aug 13 00:31:39.262842 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Aug 13 00:31:39.262137 oslogin_cache_refresh[1584]: Failure getting groups, quitting Aug 13 00:31:39.262247 oslogin_cache_refresh[1584]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Aug 13 00:31:39.266869 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Aug 13 00:31:39.270888 jq[1597]: true Aug 13 00:31:39.284277 update_engine[1594]: I20250813 00:31:39.283855 1594 main.cc:92] Flatcar Update Engine starting Aug 13 00:31:39.287168 tar[1606]: linux-amd64/helm Aug 13 00:31:39.292880 jq[1620]: true Aug 13 00:31:39.292475 (ntainerd)[1618]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Aug 13 00:31:39.304990 dbus-daemon[1580]: [system] SELinux support is enabled Aug 13 00:31:39.305098 systemd[1]: Started dbus.service - D-Bus System Message Bus. Aug 13 00:31:39.306855 systemd[1]: motdgen.service: Deactivated successfully. Aug 13 00:31:39.309232 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Aug 13 00:31:39.309890 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Aug 13 00:31:39.309930 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
Aug 13 00:31:39.310369 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Aug 13 00:31:39.310381 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Aug 13 00:31:39.317409 systemd[1]: Started update-engine.service - Update Engine. Aug 13 00:31:39.318854 update_engine[1594]: I20250813 00:31:39.317502 1594 update_check_scheduler.cc:74] Next update check in 7m12s Aug 13 00:31:39.321247 systemd[1]: Started locksmithd.service - Cluster reboot manager. Aug 13 00:31:39.357522 systemd[1]: Started vgauthd.service - VGAuth Service for open-vm-tools. Aug 13 00:31:39.360794 systemd[1]: Starting vmtoolsd.service - Service for virtual machines hosted on VMware... Aug 13 00:31:39.363335 bash[1644]: Updated "/home/core/.ssh/authorized_keys" Aug 13 00:31:39.364285 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Aug 13 00:31:39.364700 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Aug 13 00:31:39.400392 systemd-logind[1593]: New seat seat0. Aug 13 00:31:39.402073 systemd[1]: Started systemd-logind.service - User Login Management. Aug 13 00:31:39.427593 systemd[1]: Started vmtoolsd.service - Service for virtual machines hosted on VMware. Aug 13 00:31:39.433816 unknown[1647]: Pref_Init: Using '/etc/vmware-tools/vgauth.conf' as preferences filepath Aug 13 00:31:39.434350 unknown[1647]: Core dump limit set to -1 Aug 13 00:31:39.613650 kernel: mousedev: PS/2 mouse device common for all mice Aug 13 00:31:39.616534 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Aug 13 00:31:39.621297 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... 
Aug 13 00:31:39.636177 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Aug 13 00:31:39.646058 locksmithd[1631]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Aug 13 00:31:39.659112 kernel: ACPI: button: Power Button [PWRF] Aug 13 00:31:39.674136 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Aug 13 00:31:39.680730 containerd[1618]: time="2025-08-13T00:31:39Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Aug 13 00:31:39.682158 containerd[1618]: time="2025-08-13T00:31:39.681453908Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4 Aug 13 00:31:39.686837 containerd[1618]: time="2025-08-13T00:31:39.686808659Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="7.403µs" Aug 13 00:31:39.686930 containerd[1618]: time="2025-08-13T00:31:39.686920696Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Aug 13 00:31:39.686972 containerd[1618]: time="2025-08-13T00:31:39.686962620Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Aug 13 00:31:39.687097 containerd[1618]: time="2025-08-13T00:31:39.687087648Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Aug 13 00:31:39.687140 containerd[1618]: time="2025-08-13T00:31:39.687130529Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Aug 13 00:31:39.687193 containerd[1618]: time="2025-08-13T00:31:39.687184787Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Aug 13 00:31:39.687264 
containerd[1618]: time="2025-08-13T00:31:39.687254304Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Aug 13 00:31:39.687296 containerd[1618]: time="2025-08-13T00:31:39.687289830Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Aug 13 00:31:39.687594 containerd[1618]: time="2025-08-13T00:31:39.687507648Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Aug 13 00:31:39.687629 containerd[1618]: time="2025-08-13T00:31:39.687621716Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Aug 13 00:31:39.687669 containerd[1618]: time="2025-08-13T00:31:39.687660839Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Aug 13 00:31:39.687701 containerd[1618]: time="2025-08-13T00:31:39.687692528Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Aug 13 00:31:39.687783 containerd[1618]: time="2025-08-13T00:31:39.687774062Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Aug 13 00:31:39.688178 containerd[1618]: time="2025-08-13T00:31:39.688163197Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Aug 13 00:31:39.688366 containerd[1618]: time="2025-08-13T00:31:39.688350170Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs 
type=io.containerd.snapshotter.v1 Aug 13 00:31:39.688410 containerd[1618]: time="2025-08-13T00:31:39.688399443Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Aug 13 00:31:39.688529 containerd[1618]: time="2025-08-13T00:31:39.688519687Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Aug 13 00:31:39.688685 containerd[1618]: time="2025-08-13T00:31:39.688675923Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Aug 13 00:31:39.688861 containerd[1618]: time="2025-08-13T00:31:39.688851706Z" level=info msg="metadata content store policy set" policy=shared Aug 13 00:31:39.690332 containerd[1618]: time="2025-08-13T00:31:39.690318557Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Aug 13 00:31:39.690439 containerd[1618]: time="2025-08-13T00:31:39.690429028Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Aug 13 00:31:39.690804 containerd[1618]: time="2025-08-13T00:31:39.690619893Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Aug 13 00:31:39.690804 containerd[1618]: time="2025-08-13T00:31:39.690633445Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Aug 13 00:31:39.690804 containerd[1618]: time="2025-08-13T00:31:39.690641101Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Aug 13 00:31:39.690804 containerd[1618]: time="2025-08-13T00:31:39.690647883Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Aug 13 00:31:39.690804 containerd[1618]: time="2025-08-13T00:31:39.690655162Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Aug 13 00:31:39.690804 
containerd[1618]: time="2025-08-13T00:31:39.690665805Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Aug 13 00:31:39.690804 containerd[1618]: time="2025-08-13T00:31:39.690677281Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Aug 13 00:31:39.690804 containerd[1618]: time="2025-08-13T00:31:39.690683187Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Aug 13 00:31:39.690804 containerd[1618]: time="2025-08-13T00:31:39.690688353Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Aug 13 00:31:39.690804 containerd[1618]: time="2025-08-13T00:31:39.690695690Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Aug 13 00:31:39.690804 containerd[1618]: time="2025-08-13T00:31:39.690750517Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Aug 13 00:31:39.690804 containerd[1618]: time="2025-08-13T00:31:39.690762273Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Aug 13 00:31:39.690804 containerd[1618]: time="2025-08-13T00:31:39.690770495Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Aug 13 00:31:39.690804 containerd[1618]: time="2025-08-13T00:31:39.690776697Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Aug 13 00:31:39.691014 containerd[1618]: time="2025-08-13T00:31:39.690782886Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Aug 13 00:31:39.691014 containerd[1618]: time="2025-08-13T00:31:39.690788958Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Aug 13 00:31:39.691305 containerd[1618]: 
time="2025-08-13T00:31:39.690795537Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Aug 13 00:31:39.691305 containerd[1618]: time="2025-08-13T00:31:39.691216451Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Aug 13 00:31:39.691305 containerd[1618]: time="2025-08-13T00:31:39.691226119Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Aug 13 00:31:39.691305 containerd[1618]: time="2025-08-13T00:31:39.691233155Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Aug 13 00:31:39.691305 containerd[1618]: time="2025-08-13T00:31:39.691239069Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Aug 13 00:31:39.691305 containerd[1618]: time="2025-08-13T00:31:39.691277435Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Aug 13 00:31:39.691305 containerd[1618]: time="2025-08-13T00:31:39.691291221Z" level=info msg="Start snapshots syncer" Aug 13 00:31:39.691705 containerd[1618]: time="2025-08-13T00:31:39.691656007Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Aug 13 00:31:39.692077 containerd[1618]: time="2025-08-13T00:31:39.691897254Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Aug 13 00:31:39.692077 containerd[1618]: time="2025-08-13T00:31:39.692010375Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Aug 13 00:31:39.693442 containerd[1618]: time="2025-08-13T00:31:39.693043275Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Aug 13 00:31:39.693442 containerd[1618]: time="2025-08-13T00:31:39.693142123Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Aug 13 00:31:39.693442 containerd[1618]: time="2025-08-13T00:31:39.693173192Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Aug 13 00:31:39.693442 containerd[1618]: time="2025-08-13T00:31:39.693181691Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Aug 13 00:31:39.693442 containerd[1618]: time="2025-08-13T00:31:39.693188376Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Aug 13 00:31:39.693442 containerd[1618]: time="2025-08-13T00:31:39.693195677Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Aug 13 00:31:39.693442 containerd[1618]: time="2025-08-13T00:31:39.693201784Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Aug 13 00:31:39.693442 containerd[1618]: time="2025-08-13T00:31:39.693208737Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Aug 13 00:31:39.693442 containerd[1618]: time="2025-08-13T00:31:39.693229317Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Aug 13 00:31:39.693442 containerd[1618]: time="2025-08-13T00:31:39.693236650Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Aug 13 00:31:39.693442 containerd[1618]: time="2025-08-13T00:31:39.693247028Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Aug 13 00:31:39.694516 containerd[1618]: time="2025-08-13T00:31:39.694309438Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Aug 13 00:31:39.694516 containerd[1618]: time="2025-08-13T00:31:39.694331874Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Aug 13 00:31:39.694516 containerd[1618]: time="2025-08-13T00:31:39.694338539Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Aug 13 00:31:39.694516 containerd[1618]: time="2025-08-13T00:31:39.694344196Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Aug 13 00:31:39.694691 containerd[1618]: time="2025-08-13T00:31:39.694594780Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Aug 13 00:31:39.694691 containerd[1618]: time="2025-08-13T00:31:39.694608682Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Aug 13 00:31:39.694691 containerd[1618]: time="2025-08-13T00:31:39.694617015Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Aug 13 00:31:39.694691 containerd[1618]: time="2025-08-13T00:31:39.694629085Z" level=info msg="runtime interface created" Aug 13 00:31:39.694691 containerd[1618]: time="2025-08-13T00:31:39.694632426Z" level=info msg="created NRI interface" Aug 13 00:31:39.694691 containerd[1618]: time="2025-08-13T00:31:39.694636839Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Aug 13 00:31:39.694691 containerd[1618]: time="2025-08-13T00:31:39.694646780Z" level=info msg="Connect containerd service" Aug 13 00:31:39.694691 containerd[1618]: time="2025-08-13T00:31:39.694672154Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Aug 13 00:31:39.695948 
containerd[1618]: time="2025-08-13T00:31:39.695930404Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Aug 13 00:31:39.784350 kernel: piix4_smbus 0000:00:07.3: SMBus Host Controller not enabled! Aug 13 00:31:39.849678 containerd[1618]: time="2025-08-13T00:31:39.849627998Z" level=info msg="Start subscribing containerd event" Aug 13 00:31:39.849678 containerd[1618]: time="2025-08-13T00:31:39.849658375Z" level=info msg="Start recovering state" Aug 13 00:31:39.849754 containerd[1618]: time="2025-08-13T00:31:39.849708983Z" level=info msg="Start event monitor" Aug 13 00:31:39.849754 containerd[1618]: time="2025-08-13T00:31:39.849717525Z" level=info msg="Start cni network conf syncer for default" Aug 13 00:31:39.849754 containerd[1618]: time="2025-08-13T00:31:39.849722397Z" level=info msg="Start streaming server" Aug 13 00:31:39.849754 containerd[1618]: time="2025-08-13T00:31:39.849727894Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Aug 13 00:31:39.849754 containerd[1618]: time="2025-08-13T00:31:39.849732523Z" level=info msg="runtime interface starting up..." Aug 13 00:31:39.849754 containerd[1618]: time="2025-08-13T00:31:39.849735270Z" level=info msg="starting plugins..." Aug 13 00:31:39.849754 containerd[1618]: time="2025-08-13T00:31:39.849742681Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Aug 13 00:31:39.849868 containerd[1618]: time="2025-08-13T00:31:39.849842598Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Aug 13 00:31:39.849895 containerd[1618]: time="2025-08-13T00:31:39.849882923Z" level=info msg=serving... address=/run/containerd/containerd.sock Aug 13 00:31:39.850585 systemd[1]: Started containerd.service - containerd container runtime. 
Aug 13 00:31:39.851061 containerd[1618]: time="2025-08-13T00:31:39.851002455Z" level=info msg="containerd successfully booted in 0.171011s" Aug 13 00:31:39.872586 sshd_keygen[1625]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Aug 13 00:31:39.901835 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Aug 13 00:31:39.906391 systemd[1]: Starting issuegen.service - Generate /run/issue... Aug 13 00:31:39.921252 systemd[1]: issuegen.service: Deactivated successfully. Aug 13 00:31:39.921411 systemd[1]: Finished issuegen.service - Generate /run/issue. Aug 13 00:31:39.925260 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Aug 13 00:31:39.943709 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Aug 13 00:31:39.945897 systemd[1]: Started getty@tty1.service - Getty on tty1. Aug 13 00:31:39.951261 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Aug 13 00:31:39.951473 systemd[1]: Reached target getty.target - Login Prompts. Aug 13 00:31:39.988195 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Aug 13 00:31:39.992706 systemd-logind[1593]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Aug 13 00:31:40.004442 (udev-worker)[1548]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte. Aug 13 00:31:40.005591 systemd-logind[1593]: Watching system buttons on /dev/input/event2 (Power Button) Aug 13 00:31:40.024967 tar[1606]: linux-amd64/LICENSE Aug 13 00:31:40.025037 tar[1606]: linux-amd64/README.md Aug 13 00:31:40.043560 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Aug 13 00:31:40.093697 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Aug 13 00:31:40.444306 systemd-networkd[1540]: ens192: Gained IPv6LL Aug 13 00:31:40.444700 systemd-timesyncd[1515]: Network configuration changed, trying to establish connection. 
Aug 13 00:31:40.445776 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Aug 13 00:31:40.446309 systemd[1]: Reached target network-online.target - Network is Online. Aug 13 00:31:40.447667 systemd[1]: Starting coreos-metadata.service - VMware metadata agent... Aug 13 00:31:40.448985 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 00:31:40.451315 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Aug 13 00:31:40.468358 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Aug 13 00:31:40.488017 systemd[1]: coreos-metadata.service: Deactivated successfully. Aug 13 00:31:40.488497 systemd[1]: Finished coreos-metadata.service - VMware metadata agent. Aug 13 00:31:40.489102 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Aug 13 00:31:41.696618 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 00:31:41.697020 systemd[1]: Reached target multi-user.target - Multi-User System. Aug 13 00:31:41.697578 systemd[1]: Startup finished in 2.781s (kernel) + 6.567s (initrd) + 4.572s (userspace) = 13.921s. Aug 13 00:31:41.702466 (kubelet)[1804]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 13 00:31:41.738443 login[1757]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Aug 13 00:31:41.739519 login[1758]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Aug 13 00:31:41.744847 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Aug 13 00:31:41.745619 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Aug 13 00:31:41.753105 systemd-logind[1593]: New session 1 of user core. Aug 13 00:31:41.759225 systemd-logind[1593]: New session 2 of user core. 
Aug 13 00:31:41.765404 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Aug 13 00:31:41.767858 systemd[1]: Starting user@500.service - User Manager for UID 500... Aug 13 00:31:41.788454 (systemd)[1811]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Aug 13 00:31:41.790463 systemd-logind[1593]: New session c1 of user core. Aug 13 00:31:41.892944 systemd[1811]: Queued start job for default target default.target. Aug 13 00:31:41.903032 systemd[1811]: Created slice app.slice - User Application Slice. Aug 13 00:31:41.903066 systemd[1811]: Reached target paths.target - Paths. Aug 13 00:31:41.903097 systemd[1811]: Reached target timers.target - Timers. Aug 13 00:31:41.903856 systemd[1811]: Starting dbus.socket - D-Bus User Message Bus Socket... Aug 13 00:31:41.915981 systemd[1811]: Listening on dbus.socket - D-Bus User Message Bus Socket. Aug 13 00:31:41.916047 systemd[1811]: Reached target sockets.target - Sockets. Aug 13 00:31:41.916071 systemd[1811]: Reached target basic.target - Basic System. Aug 13 00:31:41.916092 systemd[1811]: Reached target default.target - Main User Target. Aug 13 00:31:41.916108 systemd[1811]: Startup finished in 122ms. Aug 13 00:31:41.916630 systemd[1]: Started user@500.service - User Manager for UID 500. Aug 13 00:31:41.926262 systemd[1]: Started session-1.scope - Session 1 of User core. Aug 13 00:31:41.927039 systemd[1]: Started session-2.scope - Session 2 of User core. Aug 13 00:31:41.956416 systemd-timesyncd[1515]: Network configuration changed, trying to establish connection. 
Aug 13 00:31:42.356999 kubelet[1804]: E0813 00:31:42.356933 1804 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 13 00:31:42.359080 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 13 00:31:42.359226 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 13 00:31:42.359496 systemd[1]: kubelet.service: Consumed 644ms CPU time, 265.3M memory peak. Aug 13 00:31:52.609505 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Aug 13 00:31:52.610581 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 00:31:52.842950 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 00:31:52.852332 (kubelet)[1855]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 13 00:31:52.896168 kubelet[1855]: E0813 00:31:52.896080 1855 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 13 00:31:52.898470 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 13 00:31:52.898555 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 13 00:31:52.898881 systemd[1]: kubelet.service: Consumed 95ms CPU time, 108.6M memory peak. Aug 13 00:32:03.065623 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. 
Aug 13 00:32:03.066821 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 00:32:03.438868 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 00:32:03.448369 (kubelet)[1870]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 13 00:32:03.474292 kubelet[1870]: E0813 00:32:03.474238 1870 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 13 00:32:03.475849 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 13 00:32:03.475986 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 13 00:32:03.476329 systemd[1]: kubelet.service: Consumed 100ms CPU time, 108.1M memory peak. Aug 13 00:32:09.582658 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Aug 13 00:32:09.583734 systemd[1]: Started sshd@0-139.178.70.101:22-147.75.109.163:44768.service - OpenSSH per-connection server daemon (147.75.109.163:44768). Aug 13 00:32:09.628245 sshd[1878]: Accepted publickey for core from 147.75.109.163 port 44768 ssh2: RSA SHA256:jzGgzAS12JlpfyHxhL4P6PxpFK6cLXmqYnYR6euCwx0 Aug 13 00:32:09.629219 sshd-session[1878]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 00:32:09.632322 systemd-logind[1593]: New session 3 of user core. Aug 13 00:32:09.641329 systemd[1]: Started session-3.scope - Session 3 of User core. Aug 13 00:32:09.698359 systemd[1]: Started sshd@1-139.178.70.101:22-147.75.109.163:44784.service - OpenSSH per-connection server daemon (147.75.109.163:44784). 
Aug 13 00:32:09.737510 sshd[1883]: Accepted publickey for core from 147.75.109.163 port 44784 ssh2: RSA SHA256:jzGgzAS12JlpfyHxhL4P6PxpFK6cLXmqYnYR6euCwx0 Aug 13 00:32:09.738604 sshd-session[1883]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 00:32:09.742709 systemd-logind[1593]: New session 4 of user core. Aug 13 00:32:09.749286 systemd[1]: Started session-4.scope - Session 4 of User core. Aug 13 00:32:09.797311 sshd[1885]: Connection closed by 147.75.109.163 port 44784 Aug 13 00:32:09.797636 sshd-session[1883]: pam_unix(sshd:session): session closed for user core Aug 13 00:32:09.806134 systemd[1]: sshd@1-139.178.70.101:22-147.75.109.163:44784.service: Deactivated successfully. Aug 13 00:32:09.807011 systemd[1]: session-4.scope: Deactivated successfully. Aug 13 00:32:09.807771 systemd-logind[1593]: Session 4 logged out. Waiting for processes to exit. Aug 13 00:32:09.808955 systemd[1]: Started sshd@2-139.178.70.101:22-147.75.109.163:44788.service - OpenSSH per-connection server daemon (147.75.109.163:44788). Aug 13 00:32:09.810358 systemd-logind[1593]: Removed session 4. Aug 13 00:32:09.844403 sshd[1891]: Accepted publickey for core from 147.75.109.163 port 44788 ssh2: RSA SHA256:jzGgzAS12JlpfyHxhL4P6PxpFK6cLXmqYnYR6euCwx0 Aug 13 00:32:09.845341 sshd-session[1891]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 00:32:09.848334 systemd-logind[1593]: New session 5 of user core. Aug 13 00:32:09.857331 systemd[1]: Started session-5.scope - Session 5 of User core. Aug 13 00:32:09.905176 sshd[1893]: Connection closed by 147.75.109.163 port 44788 Aug 13 00:32:09.905111 sshd-session[1891]: pam_unix(sshd:session): session closed for user core Aug 13 00:32:09.914418 systemd[1]: sshd@2-139.178.70.101:22-147.75.109.163:44788.service: Deactivated successfully. Aug 13 00:32:09.915444 systemd[1]: session-5.scope: Deactivated successfully. Aug 13 00:32:09.915968 systemd-logind[1593]: Session 5 logged out. 
Waiting for processes to exit. Aug 13 00:32:09.917428 systemd[1]: Started sshd@3-139.178.70.101:22-147.75.109.163:44800.service - OpenSSH per-connection server daemon (147.75.109.163:44800). Aug 13 00:32:09.918082 systemd-logind[1593]: Removed session 5. Aug 13 00:32:09.953317 sshd[1899]: Accepted publickey for core from 147.75.109.163 port 44800 ssh2: RSA SHA256:jzGgzAS12JlpfyHxhL4P6PxpFK6cLXmqYnYR6euCwx0 Aug 13 00:32:09.954387 sshd-session[1899]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 00:32:09.957333 systemd-logind[1593]: New session 6 of user core. Aug 13 00:32:09.964344 systemd[1]: Started session-6.scope - Session 6 of User core. Aug 13 00:32:10.013655 sshd[1901]: Connection closed by 147.75.109.163 port 44800 Aug 13 00:32:10.013603 sshd-session[1899]: pam_unix(sshd:session): session closed for user core Aug 13 00:32:10.020806 systemd[1]: sshd@3-139.178.70.101:22-147.75.109.163:44800.service: Deactivated successfully. Aug 13 00:32:10.021908 systemd[1]: session-6.scope: Deactivated successfully. Aug 13 00:32:10.022555 systemd-logind[1593]: Session 6 logged out. Waiting for processes to exit. Aug 13 00:32:10.024404 systemd[1]: Started sshd@4-139.178.70.101:22-147.75.109.163:44810.service - OpenSSH per-connection server daemon (147.75.109.163:44810). Aug 13 00:32:10.025396 systemd-logind[1593]: Removed session 6. Aug 13 00:32:10.064919 sshd[1907]: Accepted publickey for core from 147.75.109.163 port 44810 ssh2: RSA SHA256:jzGgzAS12JlpfyHxhL4P6PxpFK6cLXmqYnYR6euCwx0 Aug 13 00:32:10.065775 sshd-session[1907]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 00:32:10.069182 systemd-logind[1593]: New session 7 of user core. Aug 13 00:32:10.078328 systemd[1]: Started session-7.scope - Session 7 of User core. 
Aug 13 00:32:10.139422 sudo[1910]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Aug 13 00:32:10.139590 sudo[1910]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 13 00:32:10.153655 sudo[1910]: pam_unix(sudo:session): session closed for user root Aug 13 00:32:10.155158 sshd[1909]: Connection closed by 147.75.109.163 port 44810 Aug 13 00:32:10.155460 sshd-session[1907]: pam_unix(sshd:session): session closed for user core Aug 13 00:32:10.163539 systemd[1]: sshd@4-139.178.70.101:22-147.75.109.163:44810.service: Deactivated successfully. Aug 13 00:32:10.164468 systemd[1]: session-7.scope: Deactivated successfully. Aug 13 00:32:10.164998 systemd-logind[1593]: Session 7 logged out. Waiting for processes to exit. Aug 13 00:32:10.166843 systemd[1]: Started sshd@5-139.178.70.101:22-147.75.109.163:44822.service - OpenSSH per-connection server daemon (147.75.109.163:44822). Aug 13 00:32:10.167472 systemd-logind[1593]: Removed session 7. Aug 13 00:32:10.206295 sshd[1916]: Accepted publickey for core from 147.75.109.163 port 44822 ssh2: RSA SHA256:jzGgzAS12JlpfyHxhL4P6PxpFK6cLXmqYnYR6euCwx0 Aug 13 00:32:10.207193 sshd-session[1916]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 00:32:10.210300 systemd-logind[1593]: New session 8 of user core. Aug 13 00:32:10.220325 systemd[1]: Started session-8.scope - Session 8 of User core. 
Aug 13 00:32:10.268390 sudo[1920]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Aug 13 00:32:10.268551 sudo[1920]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 13 00:32:10.274566 sudo[1920]: pam_unix(sudo:session): session closed for user root Aug 13 00:32:10.277525 sudo[1919]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Aug 13 00:32:10.277811 sudo[1919]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 13 00:32:10.283457 systemd[1]: Starting audit-rules.service - Load Audit Rules... Aug 13 00:32:10.316925 augenrules[1942]: No rules Aug 13 00:32:10.317633 systemd[1]: audit-rules.service: Deactivated successfully. Aug 13 00:32:10.317867 systemd[1]: Finished audit-rules.service - Load Audit Rules. Aug 13 00:32:10.318743 sudo[1919]: pam_unix(sudo:session): session closed for user root Aug 13 00:32:10.319553 sshd[1918]: Connection closed by 147.75.109.163 port 44822 Aug 13 00:32:10.319911 sshd-session[1916]: pam_unix(sshd:session): session closed for user core Aug 13 00:32:10.325909 systemd[1]: sshd@5-139.178.70.101:22-147.75.109.163:44822.service: Deactivated successfully. Aug 13 00:32:10.327321 systemd[1]: session-8.scope: Deactivated successfully. Aug 13 00:32:10.327848 systemd-logind[1593]: Session 8 logged out. Waiting for processes to exit. Aug 13 00:32:10.330049 systemd[1]: Started sshd@6-139.178.70.101:22-147.75.109.163:44828.service - OpenSSH per-connection server daemon (147.75.109.163:44828). Aug 13 00:32:10.330855 systemd-logind[1593]: Removed session 8. 
Aug 13 00:32:10.363741 sshd[1951]: Accepted publickey for core from 147.75.109.163 port 44828 ssh2: RSA SHA256:jzGgzAS12JlpfyHxhL4P6PxpFK6cLXmqYnYR6euCwx0 Aug 13 00:32:10.364560 sshd-session[1951]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 00:32:10.367350 systemd-logind[1593]: New session 9 of user core. Aug 13 00:32:10.377309 systemd[1]: Started session-9.scope - Session 9 of User core. Aug 13 00:32:10.426930 sudo[1954]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Aug 13 00:32:10.427094 sudo[1954]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 13 00:32:10.921069 systemd[1]: Starting docker.service - Docker Application Container Engine... Aug 13 00:32:10.929434 (dockerd)[1972]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Aug 13 00:32:11.346283 dockerd[1972]: time="2025-08-13T00:32:11.346249692Z" level=info msg="Starting up" Aug 13 00:32:11.346713 dockerd[1972]: time="2025-08-13T00:32:11.346693150Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Aug 13 00:32:11.425479 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport1909468295-merged.mount: Deactivated successfully. Aug 13 00:32:11.611692 dockerd[1972]: time="2025-08-13T00:32:11.611626829Z" level=info msg="Loading containers: start." Aug 13 00:32:11.675182 kernel: Initializing XFRM netlink socket Aug 13 00:32:12.002680 systemd-timesyncd[1515]: Network configuration changed, trying to establish connection. Aug 13 00:32:12.057648 systemd-networkd[1540]: docker0: Link UP Aug 13 00:32:12.064261 dockerd[1972]: time="2025-08-13T00:32:12.064235900Z" level=info msg="Loading containers: done." 
Aug 13 00:32:12.073032 dockerd[1972]: time="2025-08-13T00:32:12.073006839Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Aug 13 00:32:12.073114 dockerd[1972]: time="2025-08-13T00:32:12.073067137Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1 Aug 13 00:32:12.073143 dockerd[1972]: time="2025-08-13T00:32:12.073130851Z" level=info msg="Initializing buildkit" Aug 13 00:32:12.083818 dockerd[1972]: time="2025-08-13T00:32:12.083688592Z" level=info msg="Completed buildkit initialization" Aug 13 00:32:12.088978 dockerd[1972]: time="2025-08-13T00:32:12.088943306Z" level=info msg="Daemon has completed initialization" Aug 13 00:32:12.089154 dockerd[1972]: time="2025-08-13T00:32:12.089128982Z" level=info msg="API listen on /run/docker.sock" Aug 13 00:32:12.089154 systemd[1]: Started docker.service - Docker Application Container Engine. Aug 13 00:33:41.784679 systemd-timesyncd[1515]: Contacted time server 66.118.231.14:123 (2.flatcar.pool.ntp.org). Aug 13 00:33:41.784703 systemd-resolved[1484]: Clock change detected. Flushing caches. Aug 13 00:33:41.785124 systemd-timesyncd[1515]: Initial clock synchronization to Wed 2025-08-13 00:33:41.784502 UTC. Aug 13 00:33:42.733946 containerd[1618]: time="2025-08-13T00:33:42.733909308Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.11\"" Aug 13 00:33:43.148027 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Aug 13 00:33:43.149635 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 00:33:43.399055 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Aug 13 00:33:43.401387 (kubelet)[2185]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Aug 13 00:33:43.424827 kubelet[2185]: E0813 00:33:43.424755 2185 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Aug 13 00:33:43.426327 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Aug 13 00:33:43.426468 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Aug 13 00:33:43.426871 systemd[1]: kubelet.service: Consumed 103ms CPU time, 110.2M memory peak.
Aug 13 00:33:43.628512 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2975378870.mount: Deactivated successfully.
Aug 13 00:33:45.018062 containerd[1618]: time="2025-08-13T00:33:45.017602321Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:33:45.025472 containerd[1618]: time="2025-08-13T00:33:45.025458942Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.11: active requests=0, bytes read=28077759"
Aug 13 00:33:45.028772 containerd[1618]: time="2025-08-13T00:33:45.028149889Z" level=info msg="ImageCreate event name:\"sha256:ea7fa3cfabed1b85e7de8e0a02356b6dcb7708442d6e4600d68abaebe1e9b1fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:33:45.036087 containerd[1618]: time="2025-08-13T00:33:45.036069611Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:a3d1c4440817725a1b503a7ccce94f3dce2b208ebf257b405dc2d97817df3dde\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:33:45.036611 containerd[1618]: time="2025-08-13T00:33:45.036598168Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.11\" with image id \"sha256:ea7fa3cfabed1b85e7de8e0a02356b6dcb7708442d6e4600d68abaebe1e9b1fc\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.11\", repo digest \"registry.k8s.io/kube-apiserver@sha256:a3d1c4440817725a1b503a7ccce94f3dce2b208ebf257b405dc2d97817df3dde\", size \"28074559\" in 2.302652152s"
Aug 13 00:33:45.036662 containerd[1618]: time="2025-08-13T00:33:45.036654208Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.11\" returns image reference \"sha256:ea7fa3cfabed1b85e7de8e0a02356b6dcb7708442d6e4600d68abaebe1e9b1fc\""
Aug 13 00:33:45.037064 containerd[1618]: time="2025-08-13T00:33:45.037049392Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.11\""
Aug 13 00:33:46.681578 containerd[1618]: time="2025-08-13T00:33:46.681524023Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:33:46.808471 containerd[1618]: time="2025-08-13T00:33:46.808428125Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.11: active requests=0, bytes read=24713245"
Aug 13 00:33:47.291500 containerd[1618]: time="2025-08-13T00:33:47.291283089Z" level=info msg="ImageCreate event name:\"sha256:c057eceea4b436b01f9ce394734cfb06f13b2a3688c3983270e99743370b6051\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:33:47.302532 containerd[1618]: time="2025-08-13T00:33:47.302479264Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:0f19de157f3d251f5ddeb6e9d026895bc55cb02592874b326fa345c57e5e2848\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:33:47.303070 containerd[1618]: time="2025-08-13T00:33:47.302967644Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.11\" with image id \"sha256:c057eceea4b436b01f9ce394734cfb06f13b2a3688c3983270e99743370b6051\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.11\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:0f19de157f3d251f5ddeb6e9d026895bc55cb02592874b326fa345c57e5e2848\", size \"26315079\" in 2.265898095s"
Aug 13 00:33:47.303070 containerd[1618]: time="2025-08-13T00:33:47.302987857Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.11\" returns image reference \"sha256:c057eceea4b436b01f9ce394734cfb06f13b2a3688c3983270e99743370b6051\""
Aug 13 00:33:47.303311 containerd[1618]: time="2025-08-13T00:33:47.303292701Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.11\""
Aug 13 00:33:48.449214 containerd[1618]: time="2025-08-13T00:33:48.449171689Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:33:48.449775 containerd[1618]: time="2025-08-13T00:33:48.449752727Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.11: active requests=0, bytes read=18783700"
Aug 13 00:33:48.450199 containerd[1618]: time="2025-08-13T00:33:48.449995851Z" level=info msg="ImageCreate event name:\"sha256:64e6a0b453108c87da0bb61473b35fd54078119a09edc56a4c8cb31602437c58\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:33:48.451282 containerd[1618]: time="2025-08-13T00:33:48.451269818Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:1a9b59b3bfa6c1f1911f6f865a795620c461d079e413061bb71981cadd67f39d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:33:48.451819 containerd[1618]: time="2025-08-13T00:33:48.451807342Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.11\" with image id \"sha256:64e6a0b453108c87da0bb61473b35fd54078119a09edc56a4c8cb31602437c58\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.11\", repo digest \"registry.k8s.io/kube-scheduler@sha256:1a9b59b3bfa6c1f1911f6f865a795620c461d079e413061bb71981cadd67f39d\", size \"20385552\" in 1.148499631s"
Aug 13 00:33:48.451866 containerd[1618]: time="2025-08-13T00:33:48.451858452Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.11\" returns image reference \"sha256:64e6a0b453108c87da0bb61473b35fd54078119a09edc56a4c8cb31602437c58\""
Aug 13 00:33:48.452157 containerd[1618]: time="2025-08-13T00:33:48.452147557Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.11\""
Aug 13 00:33:49.553182 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3055376992.mount: Deactivated successfully.
Aug 13 00:33:49.908555 containerd[1618]: time="2025-08-13T00:33:49.908478108Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:33:49.917944 containerd[1618]: time="2025-08-13T00:33:49.917923722Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.11: active requests=0, bytes read=30383612"
Aug 13 00:33:49.926128 containerd[1618]: time="2025-08-13T00:33:49.926098463Z" level=info msg="ImageCreate event name:\"sha256:0cec28fd5c3c446ec52e2886ddea38bf7f7e17755aa5d0095d50d3df5914a8fd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:33:49.931556 containerd[1618]: time="2025-08-13T00:33:49.931522168Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:a31da847792c5e7e92e91b78da1ad21d693e4b2b48d0e9f4610c8764dc2a5d79\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:33:49.931882 containerd[1618]: time="2025-08-13T00:33:49.931788499Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.11\" with image id \"sha256:0cec28fd5c3c446ec52e2886ddea38bf7f7e17755aa5d0095d50d3df5914a8fd\", repo tag \"registry.k8s.io/kube-proxy:v1.31.11\", repo digest \"registry.k8s.io/kube-proxy@sha256:a31da847792c5e7e92e91b78da1ad21d693e4b2b48d0e9f4610c8764dc2a5d79\", size \"30382631\" in 1.479592803s"
Aug 13 00:33:49.931882 containerd[1618]: time="2025-08-13T00:33:49.931809005Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.11\" returns image reference \"sha256:0cec28fd5c3c446ec52e2886ddea38bf7f7e17755aa5d0095d50d3df5914a8fd\""
Aug 13 00:33:49.932052 containerd[1618]: time="2025-08-13T00:33:49.932035053Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Aug 13 00:33:50.588908 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3010442780.mount: Deactivated successfully.
Aug 13 00:33:51.282416 containerd[1618]: time="2025-08-13T00:33:51.282381208Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:33:51.295561 containerd[1618]: time="2025-08-13T00:33:51.295527864Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241"
Aug 13 00:33:51.303760 containerd[1618]: time="2025-08-13T00:33:51.303734085Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:33:51.313963 containerd[1618]: time="2025-08-13T00:33:51.313931125Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:33:51.314420 containerd[1618]: time="2025-08-13T00:33:51.314344498Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.382294255s"
Aug 13 00:33:51.314420 containerd[1618]: time="2025-08-13T00:33:51.314363232Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\""
Aug 13 00:33:51.314800 containerd[1618]: time="2025-08-13T00:33:51.314701365Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Aug 13 00:33:51.833275 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3383567549.mount: Deactivated successfully.
Aug 13 00:33:51.834915 containerd[1618]: time="2025-08-13T00:33:51.834894708Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Aug 13 00:33:51.835439 containerd[1618]: time="2025-08-13T00:33:51.835418403Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138"
Aug 13 00:33:51.835743 containerd[1618]: time="2025-08-13T00:33:51.835726444Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Aug 13 00:33:51.836736 containerd[1618]: time="2025-08-13T00:33:51.836720237Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Aug 13 00:33:51.837319 containerd[1618]: time="2025-08-13T00:33:51.837302513Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 522.558061ms"
Aug 13 00:33:51.837344 containerd[1618]: time="2025-08-13T00:33:51.837319048Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
Aug 13 00:33:51.837693 containerd[1618]: time="2025-08-13T00:33:51.837678427Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\""
Aug 13 00:33:52.422555 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2376555412.mount: Deactivated successfully.
Aug 13 00:33:53.648026 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Aug 13 00:33:53.649592 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Aug 13 00:33:54.259946 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Aug 13 00:33:54.262464 (kubelet)[2374]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Aug 13 00:33:54.334121 kubelet[2374]: E0813 00:33:54.334074 2374 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Aug 13 00:33:54.335674 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Aug 13 00:33:54.335829 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Aug 13 00:33:54.336199 systemd[1]: kubelet.service: Consumed 100ms CPU time, 108.6M memory peak.
Aug 13 00:33:54.363621 update_engine[1594]: I20250813 00:33:54.363576 1594 update_attempter.cc:509] Updating boot flags...
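The crash loop above (restart counter at 4, the same `run.go:72` error each attempt) is the expected pre-bootstrap state: the kubelet exits with status 1 until something, normally `kubeadm init` or `kubeadm join`, writes `/var/lib/kubelet/config.yaml`, and systemd keeps scheduling restarts in the meantime. A minimal triage sketch of that condition follows; the path comes from the log, but the helper function and its messages are illustrative, and the demo runs against a scratch directory rather than the real one:

```shell
#!/bin/sh
# Hypothetical helper mirroring the failure in the log: report whether the
# kubelet config file the service needs is present under a given root.
check_kubelet_config() {
    root="${1:-/var/lib/kubelet}"
    if [ -f "$root/config.yaml" ]; then
        echo "config present: $root/config.yaml"
    else
        echo "config missing: $root/config.yaml (expected until kubeadm init/join runs)"
    fi
}

# Demo against a temporary directory so the sketch runs anywhere.
demo="$(mktemp -d)"
check_kubelet_config "$demo"     # reports missing
touch "$demo/config.yaml"
check_kubelet_config "$demo"     # reports present
rm -rf "$demo"
```

Once kubeadm writes the file, the next scheduled restart succeeds, which matches the later entries where kubelet 2545 starts and stays up.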
Aug 13 00:33:56.684120 containerd[1618]: time="2025-08-13T00:33:56.683812680Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:33:56.689705 containerd[1618]: time="2025-08-13T00:33:56.689671100Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56780013"
Aug 13 00:33:56.695109 containerd[1618]: time="2025-08-13T00:33:56.695066293Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:33:56.699852 containerd[1618]: time="2025-08-13T00:33:56.699806024Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:33:56.700547 containerd[1618]: time="2025-08-13T00:33:56.700264318Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 4.862569613s"
Aug 13 00:33:56.700547 containerd[1618]: time="2025-08-13T00:33:56.700286000Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\""
Aug 13 00:33:59.202811 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Aug 13 00:33:59.203188 systemd[1]: kubelet.service: Consumed 100ms CPU time, 108.6M memory peak.
Aug 13 00:33:59.204891 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Aug 13 00:33:59.222451 systemd[1]: Reload requested from client PID 2434 ('systemctl') (unit session-9.scope)...
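The containerd fields above allow a back-of-envelope check of pull throughput: the etcd image reports `bytes read=56780013` and a pull duration of `4.862569613s`, which works out to roughly 11 MiB/s of effective registry bandwidth. A one-line sketch of that arithmetic, with the two numbers taken directly from the log:

```shell
# Effective pull rate for registry.k8s.io/etcd:3.5.15-0, using the
# "bytes read" and "in <duration>" fields from the containerd entries above.
awk 'BEGIN {
    bytes = 56780013      # bytes read while pulling
    secs  = 4.862569613   # reported pull duration
    printf "%.1f MiB/s\n", bytes / secs / 1048576
}'
# prints "11.1 MiB/s"
```

The same calculation against the smaller pause image (321138 bytes in ~522ms) gives a much lower figure, a reminder that per-request overhead dominates for tiny images.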
Aug 13 00:33:59.222461 systemd[1]: Reloading...
Aug 13 00:33:59.295570 zram_generator::config[2484]: No configuration found.
Aug 13 00:33:59.362219 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Aug 13 00:33:59.370708 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Aug 13 00:33:59.438820 systemd[1]: Reloading finished in 216 ms.
Aug 13 00:33:59.520957 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Aug 13 00:33:59.521038 systemd[1]: kubelet.service: Failed with result 'signal'.
Aug 13 00:33:59.521260 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Aug 13 00:33:59.522928 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Aug 13 00:34:00.165001 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Aug 13 00:34:00.168454 (kubelet)[2545]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Aug 13 00:34:00.238781 kubelet[2545]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Aug 13 00:34:00.238781 kubelet[2545]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Aug 13 00:34:00.238781 kubelet[2545]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Aug 13 00:34:00.239017 kubelet[2545]: I0813 00:34:00.238836 2545 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Aug 13 00:34:00.585597 kubelet[2545]: I0813 00:34:00.585074 2545 server.go:491] "Kubelet version" kubeletVersion="v1.31.8"
Aug 13 00:34:00.585597 kubelet[2545]: I0813 00:34:00.585096 2545 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Aug 13 00:34:00.585597 kubelet[2545]: I0813 00:34:00.585243 2545 server.go:934] "Client rotation is on, will bootstrap in background"
Aug 13 00:34:00.621681 kubelet[2545]: I0813 00:34:00.621660 2545 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Aug 13 00:34:00.623317 kubelet[2545]: E0813 00:34:00.622294 2545 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://139.178.70.101:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.70.101:6443: connect: connection refused" logger="UnhandledError"
Aug 13 00:34:00.640955 kubelet[2545]: I0813 00:34:00.640939 2545 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Aug 13 00:34:00.647510 kubelet[2545]: I0813 00:34:00.647475 2545 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Aug 13 00:34:00.648648 kubelet[2545]: I0813 00:34:00.648622 2545 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Aug 13 00:34:00.648769 kubelet[2545]: I0813 00:34:00.648740 2545 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Aug 13 00:34:00.648892 kubelet[2545]: I0813 00:34:00.648768 2545 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Aug 13 00:34:00.648959 kubelet[2545]: I0813 00:34:00.648896 2545 topology_manager.go:138] "Creating topology manager with none policy"
Aug 13 00:34:00.648959 kubelet[2545]: I0813 00:34:00.648904 2545 container_manager_linux.go:300] "Creating device plugin manager"
Aug 13 00:34:00.648999 kubelet[2545]: I0813 00:34:00.648980 2545 state_mem.go:36] "Initialized new in-memory state store"
Aug 13 00:34:00.653125 kubelet[2545]: I0813 00:34:00.653095 2545 kubelet.go:408] "Attempting to sync node with API server"
Aug 13 00:34:00.653125 kubelet[2545]: I0813 00:34:00.653125 2545 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Aug 13 00:34:00.655000 kubelet[2545]: I0813 00:34:00.654981 2545 kubelet.go:314] "Adding apiserver pod source"
Aug 13 00:34:00.655054 kubelet[2545]: I0813 00:34:00.655009 2545 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Aug 13 00:34:00.660986 kubelet[2545]: W0813 00:34:00.660776 2545 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.101:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.101:6443: connect: connection refused
Aug 13 00:34:00.660986 kubelet[2545]: E0813 00:34:00.660843 2545 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.101:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.101:6443: connect: connection refused" logger="UnhandledError"
Aug 13 00:34:00.660986 kubelet[2545]: I0813 00:34:00.660909 2545 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1"
Aug 13 00:34:00.663049 kubelet[2545]: W0813 00:34:00.662958 2545 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.101:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.101:6443: connect: connection refused
Aug 13 00:34:00.663049 kubelet[2545]: E0813 00:34:00.663001 2545 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.101:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.101:6443: connect: connection refused" logger="UnhandledError"
Aug 13 00:34:00.664612 kubelet[2545]: I0813 00:34:00.664592 2545 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Aug 13 00:34:00.664692 kubelet[2545]: W0813 00:34:00.664647 2545 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Aug 13 00:34:00.665039 kubelet[2545]: I0813 00:34:00.665025 2545 server.go:1274] "Started kubelet"
Aug 13 00:34:00.665935 kubelet[2545]: I0813 00:34:00.665857 2545 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Aug 13 00:34:00.668683 kubelet[2545]: I0813 00:34:00.668187 2545 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Aug 13 00:34:00.668683 kubelet[2545]: I0813 00:34:00.668441 2545 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Aug 13 00:34:00.670800 kubelet[2545]: I0813 00:34:00.670267 2545 server.go:449] "Adding debug handlers to kubelet server"
Aug 13 00:34:00.670800 kubelet[2545]: E0813 00:34:00.668579 2545 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.101:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.101:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.185b2c590693e827 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-08-13 00:34:00.665008167 +0000 UTC m=+0.494154326,LastTimestamp:2025-08-13 00:34:00.665008167 +0000 UTC m=+0.494154326,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
Aug 13 00:34:00.672344 kubelet[2545]: I0813 00:34:00.671926 2545 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Aug 13 00:34:00.672344 kubelet[2545]: I0813 00:34:00.672014 2545 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Aug 13 00:34:00.677875 kubelet[2545]: I0813 00:34:00.677745 2545 volume_manager.go:289] "Starting Kubelet Volume Manager"
Aug 13 00:34:00.677875 kubelet[2545]: I0813 00:34:00.677818 2545 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Aug 13 00:34:00.679251 kubelet[2545]: E0813 00:34:00.679231 2545 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Aug 13 00:34:00.685553 kubelet[2545]: I0813 00:34:00.685518 2545 reconciler.go:26] "Reconciler: start to sync state"
Aug 13 00:34:00.685821 kubelet[2545]: E0813 00:34:00.685801 2545 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.101:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.101:6443: connect: connection refused" interval="200ms"
Aug 13 00:34:00.687111 kubelet[2545]: I0813 00:34:00.687085 2545 factory.go:221] Registration of the systemd container factory successfully
Aug 13 00:34:00.687675 kubelet[2545]: I0813 00:34:00.687657 2545 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Aug 13 00:34:00.688216 kubelet[2545]: W0813 00:34:00.688103 2545 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.101:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.101:6443: connect: connection refused
Aug 13 00:34:00.688216 kubelet[2545]: E0813 00:34:00.688141 2545 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.101:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.101:6443: connect: connection refused" logger="UnhandledError"
Aug 13 00:34:00.689707 kubelet[2545]: I0813 00:34:00.689643 2545 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Aug 13 00:34:00.689854 kubelet[2545]: I0813 00:34:00.689835 2545 factory.go:221] Registration of the containerd container factory successfully
Aug 13 00:34:00.692100 kubelet[2545]: I0813 00:34:00.692077 2545 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Aug 13 00:34:00.692598 kubelet[2545]: I0813 00:34:00.692256 2545 status_manager.go:217] "Starting to sync pod status with apiserver"
Aug 13 00:34:00.692598 kubelet[2545]: I0813 00:34:00.692285 2545 kubelet.go:2321] "Starting kubelet main sync loop"
Aug 13 00:34:00.692598 kubelet[2545]: E0813 00:34:00.692319 2545 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Aug 13 00:34:00.707619 kubelet[2545]: W0813 00:34:00.707527 2545 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.101:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.101:6443: connect: connection refused
Aug 13 00:34:00.707619 kubelet[2545]: E0813 00:34:00.707576 2545 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.101:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.101:6443: connect: connection refused" logger="UnhandledError"
Aug 13 00:34:00.710145 kubelet[2545]: I0813 00:34:00.710107 2545 cpu_manager.go:214] "Starting CPU manager" policy="none"
Aug 13 00:34:00.710145 kubelet[2545]: I0813 00:34:00.710123 2545 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Aug 13 00:34:00.710145 kubelet[2545]: I0813 00:34:00.710144 2545 state_mem.go:36] "Initialized new in-memory state store"
Aug 13 00:34:00.711255 kubelet[2545]: I0813 00:34:00.711238 2545 policy_none.go:49] "None policy: Start"
Aug 13 00:34:00.711705 kubelet[2545]: I0813 00:34:00.711689 2545 memory_manager.go:170] "Starting memorymanager" policy="None"
Aug 13 00:34:00.711705 kubelet[2545]: I0813 00:34:00.711706 2545 state_mem.go:35] "Initializing new in-memory state store"
Aug 13 00:34:00.718048 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Aug 13 00:34:00.726880 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Aug 13 00:34:00.729914 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Aug 13 00:34:00.754557 kubelet[2545]: I0813 00:34:00.754353 2545 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Aug 13 00:34:00.754557 kubelet[2545]: I0813 00:34:00.754482 2545 eviction_manager.go:189] "Eviction manager: starting control loop"
Aug 13 00:34:00.754557 kubelet[2545]: I0813 00:34:00.754494 2545 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Aug 13 00:34:00.755400 kubelet[2545]: I0813 00:34:00.755386 2545 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Aug 13 00:34:00.756911 kubelet[2545]: E0813 00:34:00.756892 2545 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found"
Aug 13 00:34:00.799863 systemd[1]: Created slice kubepods-burstable-pod30861c3376c0e7b1c34a6b7ec3541676.slice - libcontainer container kubepods-burstable-pod30861c3376c0e7b1c34a6b7ec3541676.slice.
Aug 13 00:34:00.824444 systemd[1]: Created slice kubepods-burstable-pod407c569889bb86d746b0274843003fd0.slice - libcontainer container kubepods-burstable-pod407c569889bb86d746b0274843003fd0.slice.
Aug 13 00:34:00.827990 systemd[1]: Created slice kubepods-burstable-pod27e4a50e94f48ec00f6bd509cb48ed05.slice - libcontainer container kubepods-burstable-pod27e4a50e94f48ec00f6bd509cb48ed05.slice.
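The `Adding static pod path" path="/etc/kubernetes/manifests"` entry above, followed by the `kubepods-burstable-pod<UID>.slice` units, is the static-pod mechanism: the kubelet watches that directory and runs any pod manifest dropped into it, which is how kubeadm bootstraps the control plane before an API server exists. A minimal sketch of such a manifest follows; the pod name is hypothetical, the image is the pause image pulled earlier in this log, and the file goes to a scratch directory instead of the real path:

```shell
#!/bin/sh
# Write an illustrative static-pod manifest the way kubeadm writes
# kube-apiserver.yaml etc.; uses a temp dir as a stand-in for
# /etc/kubernetes/manifests so the sketch runs anywhere.
manifest_dir="$(mktemp -d)"
cat > "$manifest_dir/demo-static-pod.yaml" <<'EOF'
apiVersion: v1
kind: Pod
metadata:
  name: demo-static-pod
  namespace: kube-system
spec:
  containers:
  - name: pause
    image: registry.k8s.io/pause:3.10
EOF
echo "wrote $manifest_dir/demo-static-pod.yaml"
```

On a real node the kubelet would notice the file within its sync period and start the pod as `demo-static-pod-<nodename>`, with no API server involved.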
Aug 13 00:34:00.856442 kubelet[2545]: I0813 00:34:00.856180 2545 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Aug 13 00:34:00.856442 kubelet[2545]: E0813 00:34:00.856420 2545 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.101:6443/api/v1/nodes\": dial tcp 139.178.70.101:6443: connect: connection refused" node="localhost"
Aug 13 00:34:00.886898 kubelet[2545]: I0813 00:34:00.886822 2545 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/30861c3376c0e7b1c34a6b7ec3541676-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"30861c3376c0e7b1c34a6b7ec3541676\") " pod="kube-system/kube-apiserver-localhost"
Aug 13 00:34:00.886898 kubelet[2545]: I0813 00:34:00.886850 2545 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/30861c3376c0e7b1c34a6b7ec3541676-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"30861c3376c0e7b1c34a6b7ec3541676\") " pod="kube-system/kube-apiserver-localhost"
Aug 13 00:34:00.886898 kubelet[2545]: I0813 00:34:00.886864 2545 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/30861c3376c0e7b1c34a6b7ec3541676-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"30861c3376c0e7b1c34a6b7ec3541676\") " pod="kube-system/kube-apiserver-localhost"
Aug 13 00:34:00.886898 kubelet[2545]: I0813 00:34:00.886875 2545 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/407c569889bb86d746b0274843003fd0-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"407c569889bb86d746b0274843003fd0\") " pod="kube-system/kube-controller-manager-localhost"
Aug 13 00:34:00.887049 kubelet[2545]: E0813 00:34:00.887008 2545 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.101:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.101:6443: connect: connection refused" interval="400ms"
Aug 13 00:34:00.887131 kubelet[2545]: I0813 00:34:00.887075 2545 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/407c569889bb86d746b0274843003fd0-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"407c569889bb86d746b0274843003fd0\") " pod="kube-system/kube-controller-manager-localhost"
Aug 13 00:34:00.887131 kubelet[2545]: I0813 00:34:00.887089 2545 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/407c569889bb86d746b0274843003fd0-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"407c569889bb86d746b0274843003fd0\") " pod="kube-system/kube-controller-manager-localhost"
Aug 13 00:34:00.887131 kubelet[2545]: I0813 00:34:00.887097 2545 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/407c569889bb86d746b0274843003fd0-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"407c569889bb86d746b0274843003fd0\") " pod="kube-system/kube-controller-manager-localhost"
Aug 13 00:34:00.887131 kubelet[2545]: I0813 00:34:00.887105 2545 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/407c569889bb86d746b0274843003fd0-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"407c569889bb86d746b0274843003fd0\") " pod="kube-system/kube-controller-manager-localhost"
Aug 13 00:34:00.887131 kubelet[2545]: I0813 00:34:00.887116 2545 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/27e4a50e94f48ec00f6bd509cb48ed05-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"27e4a50e94f48ec00f6bd509cb48ed05\") " pod="kube-system/kube-scheduler-localhost"
Aug 13 00:34:01.057626 kubelet[2545]: I0813 00:34:01.057581 2545 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Aug 13 00:34:01.057924 kubelet[2545]: E0813 00:34:01.057908 2545 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.101:6443/api/v1/nodes\": dial tcp 139.178.70.101:6443: connect: connection refused" node="localhost"
Aug 13 00:34:01.123029 containerd[1618]: time="2025-08-13T00:34:01.122962714Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:30861c3376c0e7b1c34a6b7ec3541676,Namespace:kube-system,Attempt:0,}"
Aug 13 00:34:01.131836 containerd[1618]: time="2025-08-13T00:34:01.131678496Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:407c569889bb86d746b0274843003fd0,Namespace:kube-system,Attempt:0,}"
Aug 13 00:34:01.132165 containerd[1618]: time="2025-08-13T00:34:01.132148667Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:27e4a50e94f48ec00f6bd509cb48ed05,Namespace:kube-system,Attempt:0,}"
Aug 13 00:34:01.287913 kubelet[2545]: E0813 00:34:01.287889 2545 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.101:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.101:6443: connect: connection refused" interval="800ms"
Aug 13 00:34:01.338838 containerd[1618]: time="2025-08-13T00:34:01.338808128Z" level=info msg="connecting to shim 8f9ef4c8664d983bd829ffc94a9e0b2824ddf293ddb31f73bbcd6a5311c9277d" address="unix:///run/containerd/s/93cb47d97edb82efa178dbce2b432e57468b0dc65fb5c835b0611a190775c22c" namespace=k8s.io protocol=ttrpc version=3
Aug 13 00:34:01.341178 containerd[1618]: time="2025-08-13T00:34:01.341110200Z" level=info msg="connecting to shim cf434aef604f59b30d8e672eb4043a4105d477d5cd5e0591015d72f05649e238" address="unix:///run/containerd/s/a99c6982d49964bf49f922895dc84b086fc9463f64211d0e0904a716ab72a8e1" namespace=k8s.io protocol=ttrpc version=3
Aug 13 00:34:01.346018 containerd[1618]: time="2025-08-13T00:34:01.345994946Z" level=info msg="connecting to shim 84c586d323a5aaa999a932129618b856e3aa5af9c2353954de823ecdc4af6f0d" address="unix:///run/containerd/s/2dc0a59508ffbed65e01b0ee166d29be790ef25371b438c9f0e83ee189df8601" namespace=k8s.io protocol=ttrpc version=3
Aug 13 00:34:01.460608 kubelet[2545]: I0813 00:34:01.460382 2545 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Aug 13 00:34:01.460716 kubelet[2545]: E0813 00:34:01.460677 2545 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.101:6443/api/v1/nodes\": dial tcp 139.178.70.101:6443: connect: connection refused" node="localhost"
Aug 13 00:34:01.540851 systemd[1]: Started cri-containerd-84c586d323a5aaa999a932129618b856e3aa5af9c2353954de823ecdc4af6f0d.scope - libcontainer container 84c586d323a5aaa999a932129618b856e3aa5af9c2353954de823ecdc4af6f0d.
Aug 13 00:34:01.542669 systemd[1]: Started cri-containerd-8f9ef4c8664d983bd829ffc94a9e0b2824ddf293ddb31f73bbcd6a5311c9277d.scope - libcontainer container 8f9ef4c8664d983bd829ffc94a9e0b2824ddf293ddb31f73bbcd6a5311c9277d.
Aug 13 00:34:01.545853 systemd[1]: Started cri-containerd-cf434aef604f59b30d8e672eb4043a4105d477d5cd5e0591015d72f05649e238.scope - libcontainer container cf434aef604f59b30d8e672eb4043a4105d477d5cd5e0591015d72f05649e238.
Aug 13 00:34:01.625558 kubelet[2545]: W0813 00:34:01.625493 2545 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.101:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.101:6443: connect: connection refused Aug 13 00:34:01.625558 kubelet[2545]: E0813 00:34:01.625520 2545 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.101:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.101:6443: connect: connection refused" logger="UnhandledError" Aug 13 00:34:01.662594 containerd[1618]: time="2025-08-13T00:34:01.662510131Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:407c569889bb86d746b0274843003fd0,Namespace:kube-system,Attempt:0,} returns sandbox id \"cf434aef604f59b30d8e672eb4043a4105d477d5cd5e0591015d72f05649e238\"" Aug 13 00:34:01.665072 containerd[1618]: time="2025-08-13T00:34:01.665044705Z" level=info msg="CreateContainer within sandbox \"cf434aef604f59b30d8e672eb4043a4105d477d5cd5e0591015d72f05649e238\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Aug 13 00:34:01.761420 containerd[1618]: time="2025-08-13T00:34:01.761289320Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:27e4a50e94f48ec00f6bd509cb48ed05,Namespace:kube-system,Attempt:0,} returns sandbox id \"8f9ef4c8664d983bd829ffc94a9e0b2824ddf293ddb31f73bbcd6a5311c9277d\"" Aug 13 00:34:01.764563 containerd[1618]: time="2025-08-13T00:34:01.764494393Z" level=info msg="CreateContainer within sandbox \"8f9ef4c8664d983bd829ffc94a9e0b2824ddf293ddb31f73bbcd6a5311c9277d\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Aug 13 00:34:01.782514 containerd[1618]: time="2025-08-13T00:34:01.782481118Z" level=info 
msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:30861c3376c0e7b1c34a6b7ec3541676,Namespace:kube-system,Attempt:0,} returns sandbox id \"84c586d323a5aaa999a932129618b856e3aa5af9c2353954de823ecdc4af6f0d\"" Aug 13 00:34:01.784087 containerd[1618]: time="2025-08-13T00:34:01.784065467Z" level=info msg="CreateContainer within sandbox \"84c586d323a5aaa999a932129618b856e3aa5af9c2353954de823ecdc4af6f0d\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Aug 13 00:34:01.825708 kubelet[2545]: W0813 00:34:01.825655 2545 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.101:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.101:6443: connect: connection refused Aug 13 00:34:01.825858 kubelet[2545]: E0813 00:34:01.825833 2545 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.101:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.101:6443: connect: connection refused" logger="UnhandledError" Aug 13 00:34:01.978450 containerd[1618]: time="2025-08-13T00:34:01.978292084Z" level=info msg="Container 78fc6d600469df9ad911f910b2245e54e778fb201d81c425f06209cdd54ac0f1: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:34:02.033082 containerd[1618]: time="2025-08-13T00:34:02.033020749Z" level=info msg="Container c25b09ceba776832b84250ad21c826a52ca07bfbd5e6d81fd77ce09929e891c3: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:34:02.074867 containerd[1618]: time="2025-08-13T00:34:02.074843272Z" level=info msg="Container 6c1be8e2b87aaed38f7d10967b8ac2450e05920d94ecb27c91bb4d84473a0c0b: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:34:02.089190 kubelet[2545]: E0813 00:34:02.089164 2545 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://139.178.70.101:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.101:6443: connect: connection refused" interval="1.6s" Aug 13 00:34:02.149494 containerd[1618]: time="2025-08-13T00:34:02.149203708Z" level=info msg="CreateContainer within sandbox \"cf434aef604f59b30d8e672eb4043a4105d477d5cd5e0591015d72f05649e238\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"78fc6d600469df9ad911f910b2245e54e778fb201d81c425f06209cdd54ac0f1\"" Aug 13 00:34:02.161333 containerd[1618]: time="2025-08-13T00:34:02.161265469Z" level=info msg="StartContainer for \"78fc6d600469df9ad911f910b2245e54e778fb201d81c425f06209cdd54ac0f1\"" Aug 13 00:34:02.161956 containerd[1618]: time="2025-08-13T00:34:02.161932752Z" level=info msg="connecting to shim 78fc6d600469df9ad911f910b2245e54e778fb201d81c425f06209cdd54ac0f1" address="unix:///run/containerd/s/a99c6982d49964bf49f922895dc84b086fc9463f64211d0e0904a716ab72a8e1" protocol=ttrpc version=3 Aug 13 00:34:02.179627 systemd[1]: Started cri-containerd-78fc6d600469df9ad911f910b2245e54e778fb201d81c425f06209cdd54ac0f1.scope - libcontainer container 78fc6d600469df9ad911f910b2245e54e778fb201d81c425f06209cdd54ac0f1. 
Aug 13 00:34:02.201882 kubelet[2545]: W0813 00:34:02.201839 2545 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.101:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.101:6443: connect: connection refused Aug 13 00:34:02.201959 kubelet[2545]: E0813 00:34:02.201885 2545 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.101:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.101:6443: connect: connection refused" logger="UnhandledError" Aug 13 00:34:02.235512 kubelet[2545]: W0813 00:34:02.235447 2545 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.101:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.101:6443: connect: connection refused Aug 13 00:34:02.246965 kubelet[2545]: E0813 00:34:02.235516 2545 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.101:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.101:6443: connect: connection refused" logger="UnhandledError" Aug 13 00:34:02.254522 containerd[1618]: time="2025-08-13T00:34:02.254489998Z" level=info msg="StartContainer for \"78fc6d600469df9ad911f910b2245e54e778fb201d81c425f06209cdd54ac0f1\" returns successfully" Aug 13 00:34:02.259129 containerd[1618]: time="2025-08-13T00:34:02.259081038Z" level=info msg="CreateContainer within sandbox \"84c586d323a5aaa999a932129618b856e3aa5af9c2353954de823ecdc4af6f0d\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id 
\"6c1be8e2b87aaed38f7d10967b8ac2450e05920d94ecb27c91bb4d84473a0c0b\"" Aug 13 00:34:02.259201 containerd[1618]: time="2025-08-13T00:34:02.259173338Z" level=info msg="CreateContainer within sandbox \"8f9ef4c8664d983bd829ffc94a9e0b2824ddf293ddb31f73bbcd6a5311c9277d\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"c25b09ceba776832b84250ad21c826a52ca07bfbd5e6d81fd77ce09929e891c3\"" Aug 13 00:34:02.259589 containerd[1618]: time="2025-08-13T00:34:02.259575972Z" level=info msg="StartContainer for \"c25b09ceba776832b84250ad21c826a52ca07bfbd5e6d81fd77ce09929e891c3\"" Aug 13 00:34:02.259652 containerd[1618]: time="2025-08-13T00:34:02.259642284Z" level=info msg="StartContainer for \"6c1be8e2b87aaed38f7d10967b8ac2450e05920d94ecb27c91bb4d84473a0c0b\"" Aug 13 00:34:02.260245 containerd[1618]: time="2025-08-13T00:34:02.260230009Z" level=info msg="connecting to shim c25b09ceba776832b84250ad21c826a52ca07bfbd5e6d81fd77ce09929e891c3" address="unix:///run/containerd/s/93cb47d97edb82efa178dbce2b432e57468b0dc65fb5c835b0611a190775c22c" protocol=ttrpc version=3 Aug 13 00:34:02.263713 containerd[1618]: time="2025-08-13T00:34:02.263662294Z" level=info msg="connecting to shim 6c1be8e2b87aaed38f7d10967b8ac2450e05920d94ecb27c91bb4d84473a0c0b" address="unix:///run/containerd/s/2dc0a59508ffbed65e01b0ee166d29be790ef25371b438c9f0e83ee189df8601" protocol=ttrpc version=3 Aug 13 00:34:02.266546 kubelet[2545]: I0813 00:34:02.264172 2545 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Aug 13 00:34:02.266546 kubelet[2545]: E0813 00:34:02.264552 2545 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.101:6443/api/v1/nodes\": dial tcp 139.178.70.101:6443: connect: connection refused" node="localhost" Aug 13 00:34:02.288641 systemd[1]: Started cri-containerd-6c1be8e2b87aaed38f7d10967b8ac2450e05920d94ecb27c91bb4d84473a0c0b.scope - libcontainer container 
6c1be8e2b87aaed38f7d10967b8ac2450e05920d94ecb27c91bb4d84473a0c0b. Aug 13 00:34:02.291384 systemd[1]: Started cri-containerd-c25b09ceba776832b84250ad21c826a52ca07bfbd5e6d81fd77ce09929e891c3.scope - libcontainer container c25b09ceba776832b84250ad21c826a52ca07bfbd5e6d81fd77ce09929e891c3. Aug 13 00:34:02.347391 containerd[1618]: time="2025-08-13T00:34:02.347357688Z" level=info msg="StartContainer for \"6c1be8e2b87aaed38f7d10967b8ac2450e05920d94ecb27c91bb4d84473a0c0b\" returns successfully" Aug 13 00:34:02.364944 containerd[1618]: time="2025-08-13T00:34:02.364925845Z" level=info msg="StartContainer for \"c25b09ceba776832b84250ad21c826a52ca07bfbd5e6d81fd77ce09929e891c3\" returns successfully" Aug 13 00:34:02.633023 kubelet[2545]: E0813 00:34:02.632952 2545 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://139.178.70.101:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.70.101:6443: connect: connection refused" logger="UnhandledError" Aug 13 00:34:03.866010 kubelet[2545]: I0813 00:34:03.865983 2545 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Aug 13 00:34:04.270360 kubelet[2545]: E0813 00:34:04.270337 2545 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Aug 13 00:34:04.415495 kubelet[2545]: I0813 00:34:04.415423 2545 kubelet_node_status.go:75] "Successfully registered node" node="localhost" Aug 13 00:34:04.415495 kubelet[2545]: E0813 00:34:04.415455 2545 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found" Aug 13 00:34:04.447761 kubelet[2545]: E0813 00:34:04.447732 2545 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not 
found" Aug 13 00:34:04.548668 kubelet[2545]: E0813 00:34:04.548591 2545 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Aug 13 00:34:04.649565 kubelet[2545]: E0813 00:34:04.649519 2545 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Aug 13 00:34:04.750457 kubelet[2545]: E0813 00:34:04.750424 2545 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Aug 13 00:34:04.851183 kubelet[2545]: E0813 00:34:04.851103 2545 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Aug 13 00:34:04.951660 kubelet[2545]: E0813 00:34:04.951626 2545 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Aug 13 00:34:05.052200 kubelet[2545]: E0813 00:34:05.052169 2545 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Aug 13 00:34:05.153024 kubelet[2545]: E0813 00:34:05.153001 2545 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Aug 13 00:34:05.253761 kubelet[2545]: E0813 00:34:05.253732 2545 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Aug 13 00:34:05.665003 kubelet[2545]: I0813 00:34:05.664965 2545 apiserver.go:52] "Watching apiserver" Aug 13 00:34:05.678764 kubelet[2545]: I0813 00:34:05.678722 2545 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Aug 13 00:34:06.344577 systemd[1]: Reload requested from client PID 2816 ('systemctl') (unit session-9.scope)... Aug 13 00:34:06.344590 systemd[1]: Reloading... Aug 13 00:34:06.398599 zram_generator::config[2859]: No configuration found. 
Aug 13 00:34:06.473993 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Aug 13 00:34:06.483281 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Aug 13 00:34:06.574115 systemd[1]: Reloading finished in 229 ms. Aug 13 00:34:06.601593 kubelet[2545]: I0813 00:34:06.601504 2545 dynamic_cafile_content.go:174] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Aug 13 00:34:06.601939 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 00:34:06.619279 systemd[1]: kubelet.service: Deactivated successfully. Aug 13 00:34:06.619445 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 00:34:06.619499 systemd[1]: kubelet.service: Consumed 575ms CPU time, 128.9M memory peak. Aug 13 00:34:06.621039 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 00:34:07.414052 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 00:34:07.424789 (kubelet)[2927]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Aug 13 00:34:07.660983 kubelet[2927]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 13 00:34:07.660983 kubelet[2927]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. 
Aug 13 00:34:07.660983 kubelet[2927]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 13 00:34:07.661230 kubelet[2927]: I0813 00:34:07.661034 2927 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Aug 13 00:34:07.666376 kubelet[2927]: I0813 00:34:07.666011 2927 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Aug 13 00:34:07.666376 kubelet[2927]: I0813 00:34:07.666176 2927 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Aug 13 00:34:07.667384 kubelet[2927]: I0813 00:34:07.666583 2927 server.go:934] "Client rotation is on, will bootstrap in background" Aug 13 00:34:07.667464 kubelet[2927]: I0813 00:34:07.667455 2927 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Aug 13 00:34:07.678620 kubelet[2927]: I0813 00:34:07.678588 2927 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Aug 13 00:34:07.695106 kubelet[2927]: I0813 00:34:07.695092 2927 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Aug 13 00:34:07.697856 kubelet[2927]: I0813 00:34:07.697843 2927 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Aug 13 00:34:07.698056 kubelet[2927]: I0813 00:34:07.698048 2927 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Aug 13 00:34:07.698162 kubelet[2927]: I0813 00:34:07.698147 2927 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Aug 13 00:34:07.698328 kubelet[2927]: I0813 00:34:07.698200 2927 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyO
ptions":null,"CgroupVersion":2} Aug 13 00:34:07.698410 kubelet[2927]: I0813 00:34:07.698403 2927 topology_manager.go:138] "Creating topology manager with none policy" Aug 13 00:34:07.698444 kubelet[2927]: I0813 00:34:07.698440 2927 container_manager_linux.go:300] "Creating device plugin manager" Aug 13 00:34:07.698485 kubelet[2927]: I0813 00:34:07.698481 2927 state_mem.go:36] "Initialized new in-memory state store" Aug 13 00:34:07.698597 kubelet[2927]: I0813 00:34:07.698590 2927 kubelet.go:408] "Attempting to sync node with API server" Aug 13 00:34:07.698917 kubelet[2927]: I0813 00:34:07.698902 2927 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Aug 13 00:34:07.699078 kubelet[2927]: I0813 00:34:07.698980 2927 kubelet.go:314] "Adding apiserver pod source" Aug 13 00:34:07.699121 kubelet[2927]: I0813 00:34:07.699115 2927 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Aug 13 00:34:07.701342 kubelet[2927]: I0813 00:34:07.701327 2927 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" Aug 13 00:34:07.702658 kubelet[2927]: I0813 00:34:07.701739 2927 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Aug 13 00:34:07.702658 kubelet[2927]: I0813 00:34:07.701963 2927 server.go:1274] "Started kubelet" Aug 13 00:34:07.728099 kubelet[2927]: I0813 00:34:07.728075 2927 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Aug 13 00:34:07.739153 kubelet[2927]: I0813 00:34:07.738006 2927 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Aug 13 00:34:07.739153 kubelet[2927]: I0813 00:34:07.738670 2927 server.go:449] "Adding debug handlers to kubelet server" Aug 13 00:34:07.739270 kubelet[2927]: I0813 00:34:07.739193 2927 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Aug 13 00:34:07.739463 kubelet[2927]: I0813 00:34:07.739307 2927 server.go:236] 
"Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Aug 13 00:34:07.742121 kubelet[2927]: I0813 00:34:07.741725 2927 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Aug 13 00:34:07.743645 kubelet[2927]: I0813 00:34:07.743625 2927 volume_manager.go:289] "Starting Kubelet Volume Manager" Aug 13 00:34:07.746077 kubelet[2927]: E0813 00:34:07.745641 2927 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Aug 13 00:34:07.748877 kubelet[2927]: I0813 00:34:07.748799 2927 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Aug 13 00:34:07.749302 kubelet[2927]: I0813 00:34:07.749232 2927 reconciler.go:26] "Reconciler: start to sync state" Aug 13 00:34:07.761354 kubelet[2927]: E0813 00:34:07.761317 2927 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Aug 13 00:34:07.764945 kubelet[2927]: I0813 00:34:07.764035 2927 factory.go:221] Registration of the containerd container factory successfully Aug 13 00:34:07.765094 kubelet[2927]: I0813 00:34:07.765027 2927 factory.go:221] Registration of the systemd container factory successfully Aug 13 00:34:07.765257 kubelet[2927]: I0813 00:34:07.765221 2927 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Aug 13 00:34:07.780970 kubelet[2927]: I0813 00:34:07.780311 2927 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Aug 13 00:34:07.788799 kubelet[2927]: I0813 00:34:07.787697 2927 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Aug 13 00:34:07.788799 kubelet[2927]: I0813 00:34:07.787717 2927 status_manager.go:217] "Starting to sync pod status with apiserver" Aug 13 00:34:07.788799 kubelet[2927]: I0813 00:34:07.787731 2927 kubelet.go:2321] "Starting kubelet main sync loop" Aug 13 00:34:07.788799 kubelet[2927]: E0813 00:34:07.787764 2927 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Aug 13 00:34:07.820660 kubelet[2927]: I0813 00:34:07.820639 2927 cpu_manager.go:214] "Starting CPU manager" policy="none" Aug 13 00:34:07.820660 kubelet[2927]: I0813 00:34:07.820652 2927 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Aug 13 00:34:07.820660 kubelet[2927]: I0813 00:34:07.820667 2927 state_mem.go:36] "Initialized new in-memory state store" Aug 13 00:34:07.820796 kubelet[2927]: I0813 00:34:07.820781 2927 state_mem.go:88] "Updated default CPUSet" cpuSet="" Aug 13 00:34:07.820836 kubelet[2927]: I0813 00:34:07.820794 2927 state_mem.go:96] "Updated CPUSet assignments" assignments={} Aug 13 00:34:07.820836 kubelet[2927]: I0813 00:34:07.820810 2927 policy_none.go:49] "None policy: Start" Aug 13 00:34:07.821162 kubelet[2927]: I0813 00:34:07.821132 2927 memory_manager.go:170] "Starting memorymanager" policy="None" Aug 13 00:34:07.821162 kubelet[2927]: I0813 00:34:07.821145 2927 state_mem.go:35] "Initializing new in-memory state store" Aug 13 00:34:07.821389 kubelet[2927]: I0813 00:34:07.821346 2927 state_mem.go:75] "Updated machine memory state" Aug 13 00:34:07.824441 kubelet[2927]: I0813 00:34:07.824425 2927 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Aug 13 00:34:07.826164 kubelet[2927]: I0813 00:34:07.826155 2927 eviction_manager.go:189] "Eviction manager: starting control loop" Aug 13 00:34:07.826285 kubelet[2927]: I0813 00:34:07.826261 2927 container_log_manager.go:189] 
"Initializing container log rotate workers" workers=1 monitorPeriod="10s" Aug 13 00:34:07.827012 kubelet[2927]: I0813 00:34:07.827003 2927 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Aug 13 00:34:07.928237 kubelet[2927]: I0813 00:34:07.928122 2927 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Aug 13 00:34:07.996690 kubelet[2927]: I0813 00:34:07.996661 2927 kubelet_node_status.go:111] "Node was previously registered" node="localhost" Aug 13 00:34:07.996797 kubelet[2927]: I0813 00:34:07.996717 2927 kubelet_node_status.go:75] "Successfully registered node" node="localhost" Aug 13 00:34:08.050901 kubelet[2927]: I0813 00:34:08.050869 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/407c569889bb86d746b0274843003fd0-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"407c569889bb86d746b0274843003fd0\") " pod="kube-system/kube-controller-manager-localhost" Aug 13 00:34:08.050901 kubelet[2927]: I0813 00:34:08.050899 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/407c569889bb86d746b0274843003fd0-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"407c569889bb86d746b0274843003fd0\") " pod="kube-system/kube-controller-manager-localhost" Aug 13 00:34:08.051016 kubelet[2927]: I0813 00:34:08.050911 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/407c569889bb86d746b0274843003fd0-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"407c569889bb86d746b0274843003fd0\") " pod="kube-system/kube-controller-manager-localhost" Aug 13 00:34:08.051016 kubelet[2927]: I0813 00:34:08.050923 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" 
(UniqueName: \"kubernetes.io/host-path/407c569889bb86d746b0274843003fd0-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"407c569889bb86d746b0274843003fd0\") " pod="kube-system/kube-controller-manager-localhost" Aug 13 00:34:08.051016 kubelet[2927]: I0813 00:34:08.050934 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/407c569889bb86d746b0274843003fd0-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"407c569889bb86d746b0274843003fd0\") " pod="kube-system/kube-controller-manager-localhost" Aug 13 00:34:08.051016 kubelet[2927]: I0813 00:34:08.050943 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/30861c3376c0e7b1c34a6b7ec3541676-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"30861c3376c0e7b1c34a6b7ec3541676\") " pod="kube-system/kube-apiserver-localhost" Aug 13 00:34:08.051016 kubelet[2927]: I0813 00:34:08.050952 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/30861c3376c0e7b1c34a6b7ec3541676-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"30861c3376c0e7b1c34a6b7ec3541676\") " pod="kube-system/kube-apiserver-localhost" Aug 13 00:34:08.051112 kubelet[2927]: I0813 00:34:08.050960 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/27e4a50e94f48ec00f6bd509cb48ed05-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"27e4a50e94f48ec00f6bd509cb48ed05\") " pod="kube-system/kube-scheduler-localhost" Aug 13 00:34:08.051112 kubelet[2927]: I0813 00:34:08.050968 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/30861c3376c0e7b1c34a6b7ec3541676-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"30861c3376c0e7b1c34a6b7ec3541676\") " pod="kube-system/kube-apiserver-localhost" Aug 13 00:34:08.700803 kubelet[2927]: I0813 00:34:08.700658 2927 apiserver.go:52] "Watching apiserver" Aug 13 00:34:08.749488 kubelet[2927]: I0813 00:34:08.749454 2927 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Aug 13 00:34:08.826577 kubelet[2927]: I0813 00:34:08.826514 2927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.826500617 podStartE2EDuration="1.826500617s" podCreationTimestamp="2025-08-13 00:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 00:34:08.826077126 +0000 UTC m=+1.243353163" watchObservedRunningTime="2025-08-13 00:34:08.826500617 +0000 UTC m=+1.243776652" Aug 13 00:34:08.837547 kubelet[2927]: I0813 00:34:08.837404 2927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.8372902500000001 podStartE2EDuration="1.83729025s" podCreationTimestamp="2025-08-13 00:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 00:34:08.831508257 +0000 UTC m=+1.248784288" watchObservedRunningTime="2025-08-13 00:34:08.83729025 +0000 UTC m=+1.254566282" Aug 13 00:34:08.837918 kubelet[2927]: I0813 00:34:08.837761 2927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.8377514480000001 podStartE2EDuration="1.837751448s" podCreationTimestamp="2025-08-13 00:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 00:34:08.836665336 +0000 UTC m=+1.253941372" watchObservedRunningTime="2025-08-13 00:34:08.837751448 +0000 UTC m=+1.255027486" Aug 13 00:34:11.676386 kubelet[2927]: I0813 00:34:11.676313 2927 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Aug 13 00:34:11.677241 containerd[1618]: time="2025-08-13T00:34:11.677096728Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Aug 13 00:34:11.677919 kubelet[2927]: I0813 00:34:11.677453 2927 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Aug 13 00:34:12.419029 systemd[1]: Created slice kubepods-besteffort-pod715644a7_ac45_4a42_a0de_db6e159fd1c3.slice - libcontainer container kubepods-besteffort-pod715644a7_ac45_4a42_a0de_db6e159fd1c3.slice. Aug 13 00:34:12.481627 kubelet[2927]: I0813 00:34:12.481597 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/715644a7-ac45-4a42-a0de-db6e159fd1c3-kube-proxy\") pod \"kube-proxy-49f99\" (UID: \"715644a7-ac45-4a42-a0de-db6e159fd1c3\") " pod="kube-system/kube-proxy-49f99" Aug 13 00:34:12.481627 kubelet[2927]: I0813 00:34:12.481630 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/715644a7-ac45-4a42-a0de-db6e159fd1c3-xtables-lock\") pod \"kube-proxy-49f99\" (UID: \"715644a7-ac45-4a42-a0de-db6e159fd1c3\") " pod="kube-system/kube-proxy-49f99" Aug 13 00:34:12.481753 kubelet[2927]: I0813 00:34:12.481644 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dnnk\" (UniqueName: \"kubernetes.io/projected/715644a7-ac45-4a42-a0de-db6e159fd1c3-kube-api-access-6dnnk\") pod \"kube-proxy-49f99\" (UID: 
\"715644a7-ac45-4a42-a0de-db6e159fd1c3\") " pod="kube-system/kube-proxy-49f99" Aug 13 00:34:12.481753 kubelet[2927]: I0813 00:34:12.481659 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/715644a7-ac45-4a42-a0de-db6e159fd1c3-lib-modules\") pod \"kube-proxy-49f99\" (UID: \"715644a7-ac45-4a42-a0de-db6e159fd1c3\") " pod="kube-system/kube-proxy-49f99" Aug 13 00:34:12.613294 systemd[1]: Created slice kubepods-besteffort-poda8c16420_8d46_42db_ac5c_6836f08825fa.slice - libcontainer container kubepods-besteffort-poda8c16420_8d46_42db_ac5c_6836f08825fa.slice. Aug 13 00:34:12.614321 kubelet[2927]: W0813 00:34:12.614279 2927 reflector.go:561] object-"tigera-operator"/"kubernetes-services-endpoint": failed to list *v1.ConfigMap: configmaps "kubernetes-services-endpoint" is forbidden: User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'localhost' and this object Aug 13 00:34:12.614321 kubelet[2927]: E0813 00:34:12.614306 2927 reflector.go:158] "Unhandled Error" err="object-\"tigera-operator\"/\"kubernetes-services-endpoint\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kubernetes-services-endpoint\" is forbidden: User \"system:node:localhost\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"tigera-operator\": no relationship found between node 'localhost' and this object" logger="UnhandledError" Aug 13 00:34:12.682372 kubelet[2927]: I0813 00:34:12.682250 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a8c16420-8d46-42db-ac5c-6836f08825fa-var-lib-calico\") pod \"tigera-operator-5bf8dfcb4-5cf8j\" (UID: \"a8c16420-8d46-42db-ac5c-6836f08825fa\") " pod="tigera-operator/tigera-operator-5bf8dfcb4-5cf8j" Aug 13 
00:34:12.682372 kubelet[2927]: I0813 00:34:12.682332 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnrkr\" (UniqueName: \"kubernetes.io/projected/a8c16420-8d46-42db-ac5c-6836f08825fa-kube-api-access-pnrkr\") pod \"tigera-operator-5bf8dfcb4-5cf8j\" (UID: \"a8c16420-8d46-42db-ac5c-6836f08825fa\") " pod="tigera-operator/tigera-operator-5bf8dfcb4-5cf8j" Aug 13 00:34:12.732323 containerd[1618]: time="2025-08-13T00:34:12.732196862Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-49f99,Uid:715644a7-ac45-4a42-a0de-db6e159fd1c3,Namespace:kube-system,Attempt:0,}" Aug 13 00:34:12.747974 containerd[1618]: time="2025-08-13T00:34:12.747945087Z" level=info msg="connecting to shim 917e4bc98eee452f628f0a7a1db57e79846dcf0df5ccd363496afec7dc026aa6" address="unix:///run/containerd/s/c7714d4720c737bd7cb4335d34d5734bc91b9fc7aa90197bd8ab89423a2c69df" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:34:12.763690 systemd[1]: Started cri-containerd-917e4bc98eee452f628f0a7a1db57e79846dcf0df5ccd363496afec7dc026aa6.scope - libcontainer container 917e4bc98eee452f628f0a7a1db57e79846dcf0df5ccd363496afec7dc026aa6. 
Aug 13 00:34:12.780883 containerd[1618]: time="2025-08-13T00:34:12.780857407Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-49f99,Uid:715644a7-ac45-4a42-a0de-db6e159fd1c3,Namespace:kube-system,Attempt:0,} returns sandbox id \"917e4bc98eee452f628f0a7a1db57e79846dcf0df5ccd363496afec7dc026aa6\"" Aug 13 00:34:12.783883 containerd[1618]: time="2025-08-13T00:34:12.783854816Z" level=info msg="CreateContainer within sandbox \"917e4bc98eee452f628f0a7a1db57e79846dcf0df5ccd363496afec7dc026aa6\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Aug 13 00:34:12.829329 containerd[1618]: time="2025-08-13T00:34:12.829306555Z" level=info msg="Container f80cd647a4d61b0a394d1e1edd23bf2f2b7cacf72b0cd4eb81699213608490f6: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:34:12.864198 containerd[1618]: time="2025-08-13T00:34:12.864165388Z" level=info msg="CreateContainer within sandbox \"917e4bc98eee452f628f0a7a1db57e79846dcf0df5ccd363496afec7dc026aa6\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"f80cd647a4d61b0a394d1e1edd23bf2f2b7cacf72b0cd4eb81699213608490f6\"" Aug 13 00:34:12.865014 containerd[1618]: time="2025-08-13T00:34:12.864932672Z" level=info msg="StartContainer for \"f80cd647a4d61b0a394d1e1edd23bf2f2b7cacf72b0cd4eb81699213608490f6\"" Aug 13 00:34:12.866096 containerd[1618]: time="2025-08-13T00:34:12.866071640Z" level=info msg="connecting to shim f80cd647a4d61b0a394d1e1edd23bf2f2b7cacf72b0cd4eb81699213608490f6" address="unix:///run/containerd/s/c7714d4720c737bd7cb4335d34d5734bc91b9fc7aa90197bd8ab89423a2c69df" protocol=ttrpc version=3 Aug 13 00:34:12.890674 systemd[1]: Started cri-containerd-f80cd647a4d61b0a394d1e1edd23bf2f2b7cacf72b0cd4eb81699213608490f6.scope - libcontainer container f80cd647a4d61b0a394d1e1edd23bf2f2b7cacf72b0cd4eb81699213608490f6. 
Aug 13 00:34:12.916313 containerd[1618]: time="2025-08-13T00:34:12.915993977Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5bf8dfcb4-5cf8j,Uid:a8c16420-8d46-42db-ac5c-6836f08825fa,Namespace:tigera-operator,Attempt:0,}" Aug 13 00:34:12.916577 containerd[1618]: time="2025-08-13T00:34:12.916525753Z" level=info msg="StartContainer for \"f80cd647a4d61b0a394d1e1edd23bf2f2b7cacf72b0cd4eb81699213608490f6\" returns successfully" Aug 13 00:34:12.929913 containerd[1618]: time="2025-08-13T00:34:12.929876825Z" level=info msg="connecting to shim 5956dc84707b24a28f22cc03c1615c100b21d211c773161fbbd8f878aad67477" address="unix:///run/containerd/s/6a80a814bf7b32df9c33c38978b1f8a6823bb78fd9b8b8601630c1e077401bef" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:34:12.949658 systemd[1]: Started cri-containerd-5956dc84707b24a28f22cc03c1615c100b21d211c773161fbbd8f878aad67477.scope - libcontainer container 5956dc84707b24a28f22cc03c1615c100b21d211c773161fbbd8f878aad67477. Aug 13 00:34:12.989725 containerd[1618]: time="2025-08-13T00:34:12.989701944Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5bf8dfcb4-5cf8j,Uid:a8c16420-8d46-42db-ac5c-6836f08825fa,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"5956dc84707b24a28f22cc03c1615c100b21d211c773161fbbd8f878aad67477\"" Aug 13 00:34:12.990911 containerd[1618]: time="2025-08-13T00:34:12.990809792Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\"" Aug 13 00:34:13.607182 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount26631823.mount: Deactivated successfully. 
Aug 13 00:34:13.892310 kubelet[2927]: I0813 00:34:13.892137 2927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-49f99" podStartSLOduration=1.892119685 podStartE2EDuration="1.892119685s" podCreationTimestamp="2025-08-13 00:34:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 00:34:13.824383124 +0000 UTC m=+6.241659162" watchObservedRunningTime="2025-08-13 00:34:13.892119685 +0000 UTC m=+6.309395723" Aug 13 00:34:14.327113 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount964359661.mount: Deactivated successfully. Aug 13 00:34:14.969193 containerd[1618]: time="2025-08-13T00:34:14.968687826Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:14.979655 containerd[1618]: time="2025-08-13T00:34:14.979637187Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.3: active requests=0, bytes read=25056543" Aug 13 00:34:14.987061 containerd[1618]: time="2025-08-13T00:34:14.987041047Z" level=info msg="ImageCreate event name:\"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:14.997344 containerd[1618]: time="2025-08-13T00:34:14.997308834Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:14.997846 containerd[1618]: time="2025-08-13T00:34:14.997697686Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.3\" with image id \"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\", repo tag \"quay.io/tigera/operator:v1.38.3\", repo digest 
\"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\", size \"25052538\" in 2.006869612s" Aug 13 00:34:14.997846 containerd[1618]: time="2025-08-13T00:34:14.997791150Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\" returns image reference \"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\"" Aug 13 00:34:14.999904 containerd[1618]: time="2025-08-13T00:34:14.999882346Z" level=info msg="CreateContainer within sandbox \"5956dc84707b24a28f22cc03c1615c100b21d211c773161fbbd8f878aad67477\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Aug 13 00:34:15.034594 containerd[1618]: time="2025-08-13T00:34:15.034552400Z" level=info msg="Container a3e2c31fe79f80d71018e50e2860f35222b8b75ae89ad8058f345236e9a0fefd: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:34:15.036769 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount873895505.mount: Deactivated successfully. Aug 13 00:34:15.050807 containerd[1618]: time="2025-08-13T00:34:15.050747978Z" level=info msg="CreateContainer within sandbox \"5956dc84707b24a28f22cc03c1615c100b21d211c773161fbbd8f878aad67477\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"a3e2c31fe79f80d71018e50e2860f35222b8b75ae89ad8058f345236e9a0fefd\"" Aug 13 00:34:15.051034 containerd[1618]: time="2025-08-13T00:34:15.050996161Z" level=info msg="StartContainer for \"a3e2c31fe79f80d71018e50e2860f35222b8b75ae89ad8058f345236e9a0fefd\"" Aug 13 00:34:15.051635 containerd[1618]: time="2025-08-13T00:34:15.051620560Z" level=info msg="connecting to shim a3e2c31fe79f80d71018e50e2860f35222b8b75ae89ad8058f345236e9a0fefd" address="unix:///run/containerd/s/6a80a814bf7b32df9c33c38978b1f8a6823bb78fd9b8b8601630c1e077401bef" protocol=ttrpc version=3 Aug 13 00:34:15.076683 systemd[1]: Started cri-containerd-a3e2c31fe79f80d71018e50e2860f35222b8b75ae89ad8058f345236e9a0fefd.scope - libcontainer container 
a3e2c31fe79f80d71018e50e2860f35222b8b75ae89ad8058f345236e9a0fefd. Aug 13 00:34:15.099924 containerd[1618]: time="2025-08-13T00:34:15.099894860Z" level=info msg="StartContainer for \"a3e2c31fe79f80d71018e50e2860f35222b8b75ae89ad8058f345236e9a0fefd\" returns successfully" Aug 13 00:34:16.629761 kubelet[2927]: I0813 00:34:16.629720 2927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-5bf8dfcb4-5cf8j" podStartSLOduration=2.621478958 podStartE2EDuration="4.629705221s" podCreationTimestamp="2025-08-13 00:34:12 +0000 UTC" firstStartedPulling="2025-08-13 00:34:12.990444522 +0000 UTC m=+5.407720551" lastFinishedPulling="2025-08-13 00:34:14.998670781 +0000 UTC m=+7.415946814" observedRunningTime="2025-08-13 00:34:15.824019359 +0000 UTC m=+8.241295398" watchObservedRunningTime="2025-08-13 00:34:16.629705221 +0000 UTC m=+9.046981251" Aug 13 00:34:20.578096 sudo[1954]: pam_unix(sudo:session): session closed for user root Aug 13 00:34:20.579088 sshd[1953]: Connection closed by 147.75.109.163 port 44828 Aug 13 00:34:20.580040 sshd-session[1951]: pam_unix(sshd:session): session closed for user core Aug 13 00:34:20.582416 systemd[1]: sshd@6-139.178.70.101:22-147.75.109.163:44828.service: Deactivated successfully. Aug 13 00:34:20.584598 systemd[1]: session-9.scope: Deactivated successfully. Aug 13 00:34:20.585021 systemd[1]: session-9.scope: Consumed 3.346s CPU time, 150.6M memory peak. Aug 13 00:34:20.588201 systemd-logind[1593]: Session 9 logged out. Waiting for processes to exit. Aug 13 00:34:20.591355 systemd-logind[1593]: Removed session 9. Aug 13 00:34:23.224926 systemd[1]: Created slice kubepods-besteffort-podd92ea200_576a_416d_85cc_b34e6fded67d.slice - libcontainer container kubepods-besteffort-podd92ea200_576a_416d_85cc_b34e6fded67d.slice. 
Aug 13 00:34:23.263896 kubelet[2927]: I0813 00:34:23.263865 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcwvz\" (UniqueName: \"kubernetes.io/projected/d92ea200-576a-416d-85cc-b34e6fded67d-kube-api-access-rcwvz\") pod \"calico-typha-6b75c7566b-92bpq\" (UID: \"d92ea200-576a-416d-85cc-b34e6fded67d\") " pod="calico-system/calico-typha-6b75c7566b-92bpq" Aug 13 00:34:23.263896 kubelet[2927]: I0813 00:34:23.263902 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d92ea200-576a-416d-85cc-b34e6fded67d-tigera-ca-bundle\") pod \"calico-typha-6b75c7566b-92bpq\" (UID: \"d92ea200-576a-416d-85cc-b34e6fded67d\") " pod="calico-system/calico-typha-6b75c7566b-92bpq" Aug 13 00:34:23.264238 kubelet[2927]: I0813 00:34:23.263916 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/d92ea200-576a-416d-85cc-b34e6fded67d-typha-certs\") pod \"calico-typha-6b75c7566b-92bpq\" (UID: \"d92ea200-576a-416d-85cc-b34e6fded67d\") " pod="calico-system/calico-typha-6b75c7566b-92bpq" Aug 13 00:34:23.529904 containerd[1618]: time="2025-08-13T00:34:23.529642550Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6b75c7566b-92bpq,Uid:d92ea200-576a-416d-85cc-b34e6fded67d,Namespace:calico-system,Attempt:0,}" Aug 13 00:34:23.552429 containerd[1618]: time="2025-08-13T00:34:23.552051758Z" level=info msg="connecting to shim 51a2249139438d59d96630ca8815dee2fb58632d29a730c91b126ef44e7f58e7" address="unix:///run/containerd/s/4ae91ee396fcaca77cacd3e33bfcc1f8b2da06d9d1341f3c2c5b9cbc79ea9802" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:34:23.576703 systemd[1]: Started cri-containerd-51a2249139438d59d96630ca8815dee2fb58632d29a730c91b126ef44e7f58e7.scope - libcontainer container 
51a2249139438d59d96630ca8815dee2fb58632d29a730c91b126ef44e7f58e7. Aug 13 00:34:23.635096 containerd[1618]: time="2025-08-13T00:34:23.635017995Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6b75c7566b-92bpq,Uid:d92ea200-576a-416d-85cc-b34e6fded67d,Namespace:calico-system,Attempt:0,} returns sandbox id \"51a2249139438d59d96630ca8815dee2fb58632d29a730c91b126ef44e7f58e7\"" Aug 13 00:34:23.636113 containerd[1618]: time="2025-08-13T00:34:23.636089385Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\"" Aug 13 00:34:23.659355 systemd[1]: Created slice kubepods-besteffort-pod58e20f13_f132_4b4b_b2e6_b5b2748503dd.slice - libcontainer container kubepods-besteffort-pod58e20f13_f132_4b4b_b2e6_b5b2748503dd.slice. Aug 13 00:34:23.666709 kubelet[2927]: I0813 00:34:23.666656 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/58e20f13-f132-4b4b-b2e6-b5b2748503dd-cni-log-dir\") pod \"calico-node-c69tb\" (UID: \"58e20f13-f132-4b4b-b2e6-b5b2748503dd\") " pod="calico-system/calico-node-c69tb" Aug 13 00:34:23.666709 kubelet[2927]: I0813 00:34:23.666682 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/58e20f13-f132-4b4b-b2e6-b5b2748503dd-node-certs\") pod \"calico-node-c69tb\" (UID: \"58e20f13-f132-4b4b-b2e6-b5b2748503dd\") " pod="calico-system/calico-node-c69tb" Aug 13 00:34:23.666914 kubelet[2927]: I0813 00:34:23.666805 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/58e20f13-f132-4b4b-b2e6-b5b2748503dd-var-lib-calico\") pod \"calico-node-c69tb\" (UID: \"58e20f13-f132-4b4b-b2e6-b5b2748503dd\") " pod="calico-system/calico-node-c69tb" Aug 13 00:34:23.666914 kubelet[2927]: I0813 00:34:23.666819 2927 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/58e20f13-f132-4b4b-b2e6-b5b2748503dd-policysync\") pod \"calico-node-c69tb\" (UID: \"58e20f13-f132-4b4b-b2e6-b5b2748503dd\") " pod="calico-system/calico-node-c69tb" Aug 13 00:34:23.666914 kubelet[2927]: I0813 00:34:23.666828 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/58e20f13-f132-4b4b-b2e6-b5b2748503dd-flexvol-driver-host\") pod \"calico-node-c69tb\" (UID: \"58e20f13-f132-4b4b-b2e6-b5b2748503dd\") " pod="calico-system/calico-node-c69tb" Aug 13 00:34:23.666914 kubelet[2927]: I0813 00:34:23.666839 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/58e20f13-f132-4b4b-b2e6-b5b2748503dd-cni-bin-dir\") pod \"calico-node-c69tb\" (UID: \"58e20f13-f132-4b4b-b2e6-b5b2748503dd\") " pod="calico-system/calico-node-c69tb" Aug 13 00:34:23.667101 kubelet[2927]: I0813 00:34:23.666849 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58e20f13-f132-4b4b-b2e6-b5b2748503dd-tigera-ca-bundle\") pod \"calico-node-c69tb\" (UID: \"58e20f13-f132-4b4b-b2e6-b5b2748503dd\") " pod="calico-system/calico-node-c69tb" Aug 13 00:34:23.667101 kubelet[2927]: I0813 00:34:23.667011 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/58e20f13-f132-4b4b-b2e6-b5b2748503dd-xtables-lock\") pod \"calico-node-c69tb\" (UID: \"58e20f13-f132-4b4b-b2e6-b5b2748503dd\") " pod="calico-system/calico-node-c69tb" Aug 13 00:34:23.667291 kubelet[2927]: I0813 00:34:23.667168 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"lib-modules\" (UniqueName: \"kubernetes.io/host-path/58e20f13-f132-4b4b-b2e6-b5b2748503dd-lib-modules\") pod \"calico-node-c69tb\" (UID: \"58e20f13-f132-4b4b-b2e6-b5b2748503dd\") " pod="calico-system/calico-node-c69tb" Aug 13 00:34:23.667291 kubelet[2927]: I0813 00:34:23.667192 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/58e20f13-f132-4b4b-b2e6-b5b2748503dd-var-run-calico\") pod \"calico-node-c69tb\" (UID: \"58e20f13-f132-4b4b-b2e6-b5b2748503dd\") " pod="calico-system/calico-node-c69tb" Aug 13 00:34:23.667400 kubelet[2927]: I0813 00:34:23.667204 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r47l\" (UniqueName: \"kubernetes.io/projected/58e20f13-f132-4b4b-b2e6-b5b2748503dd-kube-api-access-5r47l\") pod \"calico-node-c69tb\" (UID: \"58e20f13-f132-4b4b-b2e6-b5b2748503dd\") " pod="calico-system/calico-node-c69tb" Aug 13 00:34:23.667400 kubelet[2927]: I0813 00:34:23.667381 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/58e20f13-f132-4b4b-b2e6-b5b2748503dd-cni-net-dir\") pod \"calico-node-c69tb\" (UID: \"58e20f13-f132-4b4b-b2e6-b5b2748503dd\") " pod="calico-system/calico-node-c69tb" Aug 13 00:34:23.779400 kubelet[2927]: E0813 00:34:23.779379 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.779400 kubelet[2927]: W0813 00:34:23.779397 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.779513 kubelet[2927]: E0813 00:34:23.779413 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from 
directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:23.926371 kubelet[2927]: E0813 00:34:23.926180 2927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-24x4b" podUID="13ac1461-2756-4b70-ab78-2c6b6a45f530" Aug 13 00:34:23.961900 containerd[1618]: time="2025-08-13T00:34:23.961872526Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-c69tb,Uid:58e20f13-f132-4b4b-b2e6-b5b2748503dd,Namespace:calico-system,Attempt:0,}" Aug 13 00:34:23.965227 kubelet[2927]: E0813 00:34:23.965163 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.965227 kubelet[2927]: W0813 00:34:23.965179 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.965227 kubelet[2927]: E0813 00:34:23.965201 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:23.965879 kubelet[2927]: E0813 00:34:23.965485 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.965879 kubelet[2927]: W0813 00:34:23.965491 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.965879 kubelet[2927]: E0813 00:34:23.965498 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:23.965879 kubelet[2927]: E0813 00:34:23.965615 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.965879 kubelet[2927]: W0813 00:34:23.965623 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.965879 kubelet[2927]: E0813 00:34:23.965628 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:23.965879 kubelet[2927]: E0813 00:34:23.965733 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.965879 kubelet[2927]: W0813 00:34:23.965748 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.965879 kubelet[2927]: E0813 00:34:23.965756 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:23.966231 kubelet[2927]: E0813 00:34:23.966195 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.966231 kubelet[2927]: W0813 00:34:23.966202 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.966231 kubelet[2927]: E0813 00:34:23.966208 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:23.966443 kubelet[2927]: E0813 00:34:23.966409 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.966443 kubelet[2927]: W0813 00:34:23.966416 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.966443 kubelet[2927]: E0813 00:34:23.966423 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:23.966742 kubelet[2927]: E0813 00:34:23.966687 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.966742 kubelet[2927]: W0813 00:34:23.966695 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.966742 kubelet[2927]: E0813 00:34:23.966703 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:23.966950 kubelet[2927]: E0813 00:34:23.966943 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.967023 kubelet[2927]: W0813 00:34:23.966985 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.967023 kubelet[2927]: E0813 00:34:23.966996 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:23.967194 kubelet[2927]: E0813 00:34:23.967188 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.967274 kubelet[2927]: W0813 00:34:23.967227 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.967274 kubelet[2927]: E0813 00:34:23.967238 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:23.967474 kubelet[2927]: E0813 00:34:23.967409 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.967474 kubelet[2927]: W0813 00:34:23.967418 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.967474 kubelet[2927]: E0813 00:34:23.967427 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:23.967639 kubelet[2927]: E0813 00:34:23.967595 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.967639 kubelet[2927]: W0813 00:34:23.967604 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.967639 kubelet[2927]: E0813 00:34:23.967612 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:23.967822 kubelet[2927]: E0813 00:34:23.967787 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.967822 kubelet[2927]: W0813 00:34:23.967793 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.967822 kubelet[2927]: E0813 00:34:23.967799 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:23.968058 kubelet[2927]: E0813 00:34:23.968049 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.968158 kubelet[2927]: W0813 00:34:23.968109 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.968158 kubelet[2927]: E0813 00:34:23.968122 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:23.968382 kubelet[2927]: E0813 00:34:23.968347 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.968382 kubelet[2927]: W0813 00:34:23.968354 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.968382 kubelet[2927]: E0813 00:34:23.968360 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:23.968627 kubelet[2927]: E0813 00:34:23.968557 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.968627 kubelet[2927]: W0813 00:34:23.968563 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.968627 kubelet[2927]: E0813 00:34:23.968571 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:23.968775 kubelet[2927]: E0813 00:34:23.968748 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.968775 kubelet[2927]: W0813 00:34:23.968755 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.968775 kubelet[2927]: E0813 00:34:23.968763 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:23.968981 kubelet[2927]: E0813 00:34:23.968975 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.969033 kubelet[2927]: W0813 00:34:23.969027 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.969071 kubelet[2927]: E0813 00:34:23.969063 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:23.969203 kubelet[2927]: E0813 00:34:23.969197 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.969258 kubelet[2927]: W0813 00:34:23.969246 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.969329 kubelet[2927]: E0813 00:34:23.969298 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:23.969489 kubelet[2927]: E0813 00:34:23.969483 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.969596 kubelet[2927]: W0813 00:34:23.969526 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.969596 kubelet[2927]: E0813 00:34:23.969570 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:23.969934 kubelet[2927]: E0813 00:34:23.969843 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.969934 kubelet[2927]: W0813 00:34:23.969852 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.969934 kubelet[2927]: E0813 00:34:23.969859 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:23.970112 kubelet[2927]: E0813 00:34:23.970106 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.970326 kubelet[2927]: W0813 00:34:23.970157 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.970326 kubelet[2927]: E0813 00:34:23.970167 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:23.970415 kubelet[2927]: I0813 00:34:23.970397 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/13ac1461-2756-4b70-ab78-2c6b6a45f530-kubelet-dir\") pod \"csi-node-driver-24x4b\" (UID: \"13ac1461-2756-4b70-ab78-2c6b6a45f530\") " pod="calico-system/csi-node-driver-24x4b" Aug 13 00:34:23.970606 kubelet[2927]: E0813 00:34:23.970478 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.970606 kubelet[2927]: W0813 00:34:23.970564 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.970606 kubelet[2927]: E0813 00:34:23.970575 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:23.970854 kubelet[2927]: E0813 00:34:23.970779 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.970903 kubelet[2927]: W0813 00:34:23.970892 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.970968 kubelet[2927]: E0813 00:34:23.970938 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:23.971165 kubelet[2927]: E0813 00:34:23.971134 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.971165 kubelet[2927]: W0813 00:34:23.971142 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.971165 kubelet[2927]: E0813 00:34:23.971149 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:23.971395 kubelet[2927]: I0813 00:34:23.971353 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/13ac1461-2756-4b70-ab78-2c6b6a45f530-registration-dir\") pod \"csi-node-driver-24x4b\" (UID: \"13ac1461-2756-4b70-ab78-2c6b6a45f530\") " pod="calico-system/csi-node-driver-24x4b" Aug 13 00:34:23.971446 kubelet[2927]: E0813 00:34:23.971439 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.971498 kubelet[2927]: W0813 00:34:23.971486 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.971678 kubelet[2927]: E0813 00:34:23.971533 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:23.971776 kubelet[2927]: E0813 00:34:23.971769 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.971823 kubelet[2927]: W0813 00:34:23.971808 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.971943 kubelet[2927]: E0813 00:34:23.971904 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:23.972077 kubelet[2927]: E0813 00:34:23.972052 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.972077 kubelet[2927]: W0813 00:34:23.972059 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.972077 kubelet[2927]: E0813 00:34:23.972064 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:23.972228 kubelet[2927]: I0813 00:34:23.972169 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/13ac1461-2756-4b70-ab78-2c6b6a45f530-varrun\") pod \"csi-node-driver-24x4b\" (UID: \"13ac1461-2756-4b70-ab78-2c6b6a45f530\") " pod="calico-system/csi-node-driver-24x4b" Aug 13 00:34:23.972379 kubelet[2927]: E0813 00:34:23.972351 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.972379 kubelet[2927]: W0813 00:34:23.972361 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.972502 kubelet[2927]: E0813 00:34:23.972438 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:23.972502 kubelet[2927]: I0813 00:34:23.972453 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dq8g\" (UniqueName: \"kubernetes.io/projected/13ac1461-2756-4b70-ab78-2c6b6a45f530-kube-api-access-4dq8g\") pod \"csi-node-driver-24x4b\" (UID: \"13ac1461-2756-4b70-ab78-2c6b6a45f530\") " pod="calico-system/csi-node-driver-24x4b" Aug 13 00:34:23.972723 kubelet[2927]: E0813 00:34:23.972704 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.972723 kubelet[2927]: W0813 00:34:23.972714 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.973085 kubelet[2927]: E0813 00:34:23.972800 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:23.973085 kubelet[2927]: I0813 00:34:23.972818 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/13ac1461-2756-4b70-ab78-2c6b6a45f530-socket-dir\") pod \"csi-node-driver-24x4b\" (UID: \"13ac1461-2756-4b70-ab78-2c6b6a45f530\") " pod="calico-system/csi-node-driver-24x4b" Aug 13 00:34:23.973228 kubelet[2927]: E0813 00:34:23.973185 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.973228 kubelet[2927]: W0813 00:34:23.973193 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.973228 kubelet[2927]: E0813 00:34:23.973206 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:23.973407 kubelet[2927]: E0813 00:34:23.973399 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.973482 kubelet[2927]: W0813 00:34:23.973445 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.973530 kubelet[2927]: E0813 00:34:23.973521 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:23.973670 kubelet[2927]: E0813 00:34:23.973664 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.973710 kubelet[2927]: W0813 00:34:23.973704 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.974026 kubelet[2927]: E0813 00:34:23.974012 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:23.974163 kubelet[2927]: E0813 00:34:23.974155 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.974211 containerd[1618]: time="2025-08-13T00:34:23.974186585Z" level=info msg="connecting to shim 7e28c6c6609deabca70683b421e273ce0e65163031abe2e88ed954828fc7ba3b" address="unix:///run/containerd/s/284d7c9448f8be7511d753ed008e900f34e342f3f010c834a8fa16cedf5622cd" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:34:23.974258 kubelet[2927]: W0813 00:34:23.974245 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.974310 kubelet[2927]: E0813 00:34:23.974300 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:23.974455 kubelet[2927]: E0813 00:34:23.974448 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.974503 kubelet[2927]: W0813 00:34:23.974496 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.974570 kubelet[2927]: E0813 00:34:23.974563 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:23.974772 kubelet[2927]: E0813 00:34:23.974745 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:23.974772 kubelet[2927]: W0813 00:34:23.974752 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:23.974772 kubelet[2927]: E0813 00:34:23.974758 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:23.993797 systemd[1]: Started cri-containerd-7e28c6c6609deabca70683b421e273ce0e65163031abe2e88ed954828fc7ba3b.scope - libcontainer container 7e28c6c6609deabca70683b421e273ce0e65163031abe2e88ed954828fc7ba3b. 
Aug 13 00:34:24.048803 containerd[1618]: time="2025-08-13T00:34:24.048773313Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-c69tb,Uid:58e20f13-f132-4b4b-b2e6-b5b2748503dd,Namespace:calico-system,Attempt:0,} returns sandbox id \"7e28c6c6609deabca70683b421e273ce0e65163031abe2e88ed954828fc7ba3b\"" Aug 13 00:34:24.073892 kubelet[2927]: E0813 00:34:24.073783 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:24.073892 kubelet[2927]: W0813 00:34:24.073801 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:24.073892 kubelet[2927]: E0813 00:34:24.073817 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:24.074281 kubelet[2927]: E0813 00:34:24.073985 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:24.074281 kubelet[2927]: W0813 00:34:24.073991 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:24.074281 kubelet[2927]: E0813 00:34:24.074000 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:24.074281 kubelet[2927]: E0813 00:34:24.074104 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:24.074281 kubelet[2927]: W0813 00:34:24.074110 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:24.074281 kubelet[2927]: E0813 00:34:24.074122 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:24.074281 kubelet[2927]: E0813 00:34:24.074225 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:24.074281 kubelet[2927]: W0813 00:34:24.074230 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:24.074281 kubelet[2927]: E0813 00:34:24.074238 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:24.074732 kubelet[2927]: E0813 00:34:24.074683 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:24.074732 kubelet[2927]: W0813 00:34:24.074695 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:24.074910 kubelet[2927]: E0813 00:34:24.074827 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:24.074910 kubelet[2927]: E0813 00:34:24.074868 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:24.074910 kubelet[2927]: W0813 00:34:24.074873 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:24.074910 kubelet[2927]: E0813 00:34:24.074882 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:24.075396 kubelet[2927]: E0813 00:34:24.075379 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:24.075396 kubelet[2927]: W0813 00:34:24.075389 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:24.075396 kubelet[2927]: E0813 00:34:24.075402 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:24.075694 kubelet[2927]: E0813 00:34:24.075682 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:24.075694 kubelet[2927]: W0813 00:34:24.075691 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:24.075873 kubelet[2927]: E0813 00:34:24.075704 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:24.075983 kubelet[2927]: E0813 00:34:24.075927 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:24.075983 kubelet[2927]: W0813 00:34:24.075933 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:24.075983 kubelet[2927]: E0813 00:34:24.075941 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:24.076137 kubelet[2927]: E0813 00:34:24.076058 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:24.076137 kubelet[2927]: W0813 00:34:24.076064 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:24.076295 kubelet[2927]: E0813 00:34:24.076147 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:24.076295 kubelet[2927]: W0813 00:34:24.076154 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:24.076295 kubelet[2927]: E0813 00:34:24.076252 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:24.076295 kubelet[2927]: E0813 00:34:24.076273 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:24.076600 kubelet[2927]: E0813 00:34:24.076381 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:24.076600 kubelet[2927]: W0813 00:34:24.076388 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:24.076600 kubelet[2927]: E0813 00:34:24.076398 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:24.076875 kubelet[2927]: E0813 00:34:24.076862 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:24.077011 kubelet[2927]: W0813 00:34:24.076876 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:24.077011 kubelet[2927]: E0813 00:34:24.076887 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:24.077207 kubelet[2927]: E0813 00:34:24.076997 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:24.077207 kubelet[2927]: W0813 00:34:24.077204 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:24.077348 kubelet[2927]: E0813 00:34:24.077334 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:24.077430 kubelet[2927]: E0813 00:34:24.077351 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:24.077500 kubelet[2927]: W0813 00:34:24.077490 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:24.077709 kubelet[2927]: E0813 00:34:24.077687 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:24.077900 kubelet[2927]: E0813 00:34:24.077873 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:24.077900 kubelet[2927]: W0813 00:34:24.077883 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:24.078075 kubelet[2927]: E0813 00:34:24.078057 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:24.957836 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1337208357.mount: Deactivated successfully. Aug 13 00:34:25.789132 kubelet[2927]: E0813 00:34:25.789093 2927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-24x4b" podUID="13ac1461-2756-4b70-ab78-2c6b6a45f530" Aug 13 00:34:25.799334 containerd[1618]: time="2025-08-13T00:34:25.799223342Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:25.808727 containerd[1618]: time="2025-08-13T00:34:25.808699114Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.2: active requests=0, bytes read=35233364" Aug 13 00:34:25.815817 containerd[1618]: time="2025-08-13T00:34:25.815761748Z" level=info msg="ImageCreate event name:\"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:25.820130 containerd[1618]: time="2025-08-13T00:34:25.820096054Z" level=info msg="ImageCreate 
event name:\"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:25.820678 containerd[1618]: time="2025-08-13T00:34:25.820503656Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.2\" with image id \"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\", size \"35233218\" in 2.184301947s" Aug 13 00:34:25.820678 containerd[1618]: time="2025-08-13T00:34:25.820526268Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\" returns image reference \"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\"" Aug 13 00:34:25.821192 containerd[1618]: time="2025-08-13T00:34:25.821175966Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\"" Aug 13 00:34:25.837228 containerd[1618]: time="2025-08-13T00:34:25.837210049Z" level=info msg="CreateContainer within sandbox \"51a2249139438d59d96630ca8815dee2fb58632d29a730c91b126ef44e7f58e7\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Aug 13 00:34:25.879677 containerd[1618]: time="2025-08-13T00:34:25.879604719Z" level=info msg="Container f738042ca1f90acf6487b487c3f671d00da9f08f26f30e94ed678ef9c934ab77: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:34:25.897015 containerd[1618]: time="2025-08-13T00:34:25.896983198Z" level=info msg="CreateContainer within sandbox \"51a2249139438d59d96630ca8815dee2fb58632d29a730c91b126ef44e7f58e7\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"f738042ca1f90acf6487b487c3f671d00da9f08f26f30e94ed678ef9c934ab77\"" Aug 13 00:34:25.897541 containerd[1618]: time="2025-08-13T00:34:25.897477948Z" level=info msg="StartContainer for 
\"f738042ca1f90acf6487b487c3f671d00da9f08f26f30e94ed678ef9c934ab77\"" Aug 13 00:34:25.898547 containerd[1618]: time="2025-08-13T00:34:25.898505258Z" level=info msg="connecting to shim f738042ca1f90acf6487b487c3f671d00da9f08f26f30e94ed678ef9c934ab77" address="unix:///run/containerd/s/4ae91ee396fcaca77cacd3e33bfcc1f8b2da06d9d1341f3c2c5b9cbc79ea9802" protocol=ttrpc version=3 Aug 13 00:34:25.918650 systemd[1]: Started cri-containerd-f738042ca1f90acf6487b487c3f671d00da9f08f26f30e94ed678ef9c934ab77.scope - libcontainer container f738042ca1f90acf6487b487c3f671d00da9f08f26f30e94ed678ef9c934ab77. Aug 13 00:34:25.972005 containerd[1618]: time="2025-08-13T00:34:25.971950292Z" level=info msg="StartContainer for \"f738042ca1f90acf6487b487c3f671d00da9f08f26f30e94ed678ef9c934ab77\" returns successfully" Aug 13 00:34:26.890159 kubelet[2927]: E0813 00:34:26.890139 2927 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:26.890159 kubelet[2927]: W0813 00:34:26.890155 2927 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:26.890434 kubelet[2927]: E0813 00:34:26.890177 2927 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:27.144148 containerd[1618]: time="2025-08-13T00:34:27.144076995Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:27.145466 containerd[1618]: time="2025-08-13T00:34:27.145105630Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2: active requests=0, bytes read=4446956" Aug 13 00:34:27.145552 containerd[1618]: time="2025-08-13T00:34:27.145518391Z" level=info msg="ImageCreate event name:\"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:27.146981 containerd[1618]: time="2025-08-13T00:34:27.146960604Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:27.147349 containerd[1618]: time="2025-08-13T00:34:27.147332506Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" with image id \"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\", size \"5939619\" in 1.32613851s" Aug 13 00:34:27.147382 containerd[1618]: time="2025-08-13T00:34:27.147350558Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" returns image reference \"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\"" Aug 13 00:34:27.149076 containerd[1618]: time="2025-08-13T00:34:27.149056923Z" level=info msg="CreateContainer within sandbox \"7e28c6c6609deabca70683b421e273ce0e65163031abe2e88ed954828fc7ba3b\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Aug 13 00:34:27.158733 containerd[1618]: time="2025-08-13T00:34:27.158496676Z" level=info msg="Container 6e6ebf09c3fb84e77807b15bd50e69df6f6a767a1c7044fa07275561d56a6f0b: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:34:27.183715 containerd[1618]: time="2025-08-13T00:34:27.183669191Z" level=info msg="CreateContainer within sandbox \"7e28c6c6609deabca70683b421e273ce0e65163031abe2e88ed954828fc7ba3b\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"6e6ebf09c3fb84e77807b15bd50e69df6f6a767a1c7044fa07275561d56a6f0b\"" Aug 13 00:34:27.184106 containerd[1618]: time="2025-08-13T00:34:27.184047902Z" level=info msg="StartContainer for \"6e6ebf09c3fb84e77807b15bd50e69df6f6a767a1c7044fa07275561d56a6f0b\"" Aug 13 00:34:27.185475 containerd[1618]: time="2025-08-13T00:34:27.185415760Z" level=info msg="connecting to shim 6e6ebf09c3fb84e77807b15bd50e69df6f6a767a1c7044fa07275561d56a6f0b" address="unix:///run/containerd/s/284d7c9448f8be7511d753ed008e900f34e342f3f010c834a8fa16cedf5622cd" protocol=ttrpc version=3 Aug 13 00:34:27.210708 systemd[1]: Started cri-containerd-6e6ebf09c3fb84e77807b15bd50e69df6f6a767a1c7044fa07275561d56a6f0b.scope - libcontainer container 6e6ebf09c3fb84e77807b15bd50e69df6f6a767a1c7044fa07275561d56a6f0b. Aug 13 00:34:27.245226 containerd[1618]: time="2025-08-13T00:34:27.245116401Z" level=info msg="StartContainer for \"6e6ebf09c3fb84e77807b15bd50e69df6f6a767a1c7044fa07275561d56a6f0b\" returns successfully" Aug 13 00:34:27.249863 systemd[1]: cri-containerd-6e6ebf09c3fb84e77807b15bd50e69df6f6a767a1c7044fa07275561d56a6f0b.scope: Deactivated successfully. 
Aug 13 00:34:27.275611 containerd[1618]: time="2025-08-13T00:34:27.275459416Z" level=info msg="received exit event container_id:\"6e6ebf09c3fb84e77807b15bd50e69df6f6a767a1c7044fa07275561d56a6f0b\" id:\"6e6ebf09c3fb84e77807b15bd50e69df6f6a767a1c7044fa07275561d56a6f0b\" pid:3593 exited_at:{seconds:1755045267 nanos:252768622}" Aug 13 00:34:27.289070 containerd[1618]: time="2025-08-13T00:34:27.286684955Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6e6ebf09c3fb84e77807b15bd50e69df6f6a767a1c7044fa07275561d56a6f0b\" id:\"6e6ebf09c3fb84e77807b15bd50e69df6f6a767a1c7044fa07275561d56a6f0b\" pid:3593 exited_at:{seconds:1755045267 nanos:252768622}" Aug 13 00:34:27.302029 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-6e6ebf09c3fb84e77807b15bd50e69df6f6a767a1c7044fa07275561d56a6f0b-rootfs.mount: Deactivated successfully. Aug 13 00:34:27.794022 kubelet[2927]: E0813 00:34:27.793473 2927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-24x4b" podUID="13ac1461-2756-4b70-ab78-2c6b6a45f530" Aug 13 00:34:27.843730 kubelet[2927]: I0813 00:34:27.843706 2927 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 13 00:34:27.844949 containerd[1618]: time="2025-08-13T00:34:27.844827829Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\"" Aug 13 00:34:27.855797 kubelet[2927]: I0813 00:34:27.855167 2927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6b75c7566b-92bpq" podStartSLOduration=2.669883771 podStartE2EDuration="4.855155335s" podCreationTimestamp="2025-08-13 00:34:23 +0000 UTC" firstStartedPulling="2025-08-13 00:34:23.635821417 +0000 UTC m=+16.053097445" lastFinishedPulling="2025-08-13 00:34:25.82109298 +0000 UTC m=+18.238369009" 
observedRunningTime="2025-08-13 00:34:26.869366837 +0000 UTC m=+19.286642875" watchObservedRunningTime="2025-08-13 00:34:27.855155335 +0000 UTC m=+20.272431368" Aug 13 00:34:29.788798 kubelet[2927]: E0813 00:34:29.788637 2927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-24x4b" podUID="13ac1461-2756-4b70-ab78-2c6b6a45f530" Aug 13 00:34:31.546175 containerd[1618]: time="2025-08-13T00:34:31.546131972Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:31.546752 containerd[1618]: time="2025-08-13T00:34:31.546695167Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.2: active requests=0, bytes read=70436221" Aug 13 00:34:31.547371 containerd[1618]: time="2025-08-13T00:34:31.547030879Z" level=info msg="ImageCreate event name:\"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:31.548213 containerd[1618]: time="2025-08-13T00:34:31.548190951Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:31.548665 containerd[1618]: time="2025-08-13T00:34:31.548580852Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.2\" with image id \"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\", size \"71928924\" in 3.703049728s" Aug 13 00:34:31.548665 containerd[1618]: 
time="2025-08-13T00:34:31.548604068Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\" returns image reference \"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\"" Aug 13 00:34:31.551376 containerd[1618]: time="2025-08-13T00:34:31.550892088Z" level=info msg="CreateContainer within sandbox \"7e28c6c6609deabca70683b421e273ce0e65163031abe2e88ed954828fc7ba3b\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Aug 13 00:34:31.557546 containerd[1618]: time="2025-08-13T00:34:31.557227747Z" level=info msg="Container 9e1dc1703b902d837209158cc67a1d5261f1808f1a08bc8dbd68da90c5f3b809: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:34:31.598408 containerd[1618]: time="2025-08-13T00:34:31.598383457Z" level=info msg="CreateContainer within sandbox \"7e28c6c6609deabca70683b421e273ce0e65163031abe2e88ed954828fc7ba3b\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"9e1dc1703b902d837209158cc67a1d5261f1808f1a08bc8dbd68da90c5f3b809\"" Aug 13 00:34:31.599153 containerd[1618]: time="2025-08-13T00:34:31.599109812Z" level=info msg="StartContainer for \"9e1dc1703b902d837209158cc67a1d5261f1808f1a08bc8dbd68da90c5f3b809\"" Aug 13 00:34:31.603825 containerd[1618]: time="2025-08-13T00:34:31.600215258Z" level=info msg="connecting to shim 9e1dc1703b902d837209158cc67a1d5261f1808f1a08bc8dbd68da90c5f3b809" address="unix:///run/containerd/s/284d7c9448f8be7511d753ed008e900f34e342f3f010c834a8fa16cedf5622cd" protocol=ttrpc version=3 Aug 13 00:34:31.620667 systemd[1]: Started cri-containerd-9e1dc1703b902d837209158cc67a1d5261f1808f1a08bc8dbd68da90c5f3b809.scope - libcontainer container 9e1dc1703b902d837209158cc67a1d5261f1808f1a08bc8dbd68da90c5f3b809. 
Aug 13 00:34:31.653667 containerd[1618]: time="2025-08-13T00:34:31.653642847Z" level=info msg="StartContainer for \"9e1dc1703b902d837209158cc67a1d5261f1808f1a08bc8dbd68da90c5f3b809\" returns successfully" Aug 13 00:34:31.788461 kubelet[2927]: E0813 00:34:31.788429 2927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-24x4b" podUID="13ac1461-2756-4b70-ab78-2c6b6a45f530" Aug 13 00:34:33.789313 kubelet[2927]: E0813 00:34:33.788759 2927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-24x4b" podUID="13ac1461-2756-4b70-ab78-2c6b6a45f530" Aug 13 00:34:34.050927 systemd[1]: cri-containerd-9e1dc1703b902d837209158cc67a1d5261f1808f1a08bc8dbd68da90c5f3b809.scope: Deactivated successfully. Aug 13 00:34:34.051121 systemd[1]: cri-containerd-9e1dc1703b902d837209158cc67a1d5261f1808f1a08bc8dbd68da90c5f3b809.scope: Consumed 307ms CPU time, 162.5M memory peak, 12K read from disk, 171.2M written to disk. 
Aug 13 00:34:34.087824 containerd[1618]: time="2025-08-13T00:34:34.087626319Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9e1dc1703b902d837209158cc67a1d5261f1808f1a08bc8dbd68da90c5f3b809\" id:\"9e1dc1703b902d837209158cc67a1d5261f1808f1a08bc8dbd68da90c5f3b809\" pid:3653 exited_at:{seconds:1755045274 nanos:77287235}" Aug 13 00:34:34.087824 containerd[1618]: time="2025-08-13T00:34:34.087735907Z" level=info msg="received exit event container_id:\"9e1dc1703b902d837209158cc67a1d5261f1808f1a08bc8dbd68da90c5f3b809\" id:\"9e1dc1703b902d837209158cc67a1d5261f1808f1a08bc8dbd68da90c5f3b809\" pid:3653 exited_at:{seconds:1755045274 nanos:77287235}" Aug 13 00:34:34.112082 kubelet[2927]: I0813 00:34:34.112002 2927 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Aug 13 00:34:34.146786 kubelet[2927]: I0813 00:34:34.146762 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kgj7\" (UniqueName: \"kubernetes.io/projected/d50d2d1d-4a7e-454b-acc1-521b442ea412-kube-api-access-7kgj7\") pod \"calico-kube-controllers-7cd45fb679-bvd9c\" (UID: \"d50d2d1d-4a7e-454b-acc1-521b442ea412\") " pod="calico-system/calico-kube-controllers-7cd45fb679-bvd9c" Aug 13 00:34:34.146786 kubelet[2927]: I0813 00:34:34.146786 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d50d2d1d-4a7e-454b-acc1-521b442ea412-tigera-ca-bundle\") pod \"calico-kube-controllers-7cd45fb679-bvd9c\" (UID: \"d50d2d1d-4a7e-454b-acc1-521b442ea412\") " pod="calico-system/calico-kube-controllers-7cd45fb679-bvd9c" Aug 13 00:34:34.160772 systemd[1]: Created slice kubepods-besteffort-podd50d2d1d_4a7e_454b_acc1_521b442ea412.slice - libcontainer container kubepods-besteffort-podd50d2d1d_4a7e_454b_acc1_521b442ea412.slice. 
Aug 13 00:34:34.170258 systemd[1]: Created slice kubepods-besteffort-pod03280d1f_7157_4e73_820c_3c79157bccae.slice - libcontainer container kubepods-besteffort-pod03280d1f_7157_4e73_820c_3c79157bccae.slice. Aug 13 00:34:34.180838 systemd[1]: Created slice kubepods-besteffort-podf9cd0b21_5cc1_4092_9e3a_778a5727c93b.slice - libcontainer container kubepods-besteffort-podf9cd0b21_5cc1_4092_9e3a_778a5727c93b.slice. Aug 13 00:34:34.188058 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9e1dc1703b902d837209158cc67a1d5261f1808f1a08bc8dbd68da90c5f3b809-rootfs.mount: Deactivated successfully. Aug 13 00:34:34.198178 systemd[1]: Created slice kubepods-besteffort-pod895d523e_e692_449e_960d_7285bbad6a88.slice - libcontainer container kubepods-besteffort-pod895d523e_e692_449e_960d_7285bbad6a88.slice. Aug 13 00:34:34.204558 systemd[1]: Created slice kubepods-burstable-podfe7d68d1_75a2_45b8_b95c_24db42f5e458.slice - libcontainer container kubepods-burstable-podfe7d68d1_75a2_45b8_b95c_24db42f5e458.slice. Aug 13 00:34:34.210760 systemd[1]: Created slice kubepods-besteffort-pod597f337a_5f28_41ac_87f7_1c6ef1c833a4.slice - libcontainer container kubepods-besteffort-pod597f337a_5f28_41ac_87f7_1c6ef1c833a4.slice. Aug 13 00:34:34.216118 systemd[1]: Created slice kubepods-burstable-pod1041bee9_c09a_4ab4_b0d0_aa6d4e1e0222.slice - libcontainer container kubepods-burstable-pod1041bee9_c09a_4ab4_b0d0_aa6d4e1e0222.slice. 
Aug 13 00:34:34.349453 kubelet[2927]: I0813 00:34:34.349071 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nwc9\" (UniqueName: \"kubernetes.io/projected/1041bee9-c09a-4ab4-b0d0-aa6d4e1e0222-kube-api-access-2nwc9\") pod \"coredns-7c65d6cfc9-hz27z\" (UID: \"1041bee9-c09a-4ab4-b0d0-aa6d4e1e0222\") " pod="kube-system/coredns-7c65d6cfc9-hz27z" Aug 13 00:34:34.349453 kubelet[2927]: I0813 00:34:34.349129 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/03280d1f-7157-4e73-820c-3c79157bccae-calico-apiserver-certs\") pod \"calico-apiserver-b9db99d59-fb8ql\" (UID: \"03280d1f-7157-4e73-820c-3c79157bccae\") " pod="calico-apiserver/calico-apiserver-b9db99d59-fb8ql" Aug 13 00:34:34.349453 kubelet[2927]: I0813 00:34:34.349151 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9cd0b21-5cc1-4092-9e3a-778a5727c93b-config\") pod \"goldmane-58fd7646b9-c4ngc\" (UID: \"f9cd0b21-5cc1-4092-9e3a-778a5727c93b\") " pod="calico-system/goldmane-58fd7646b9-c4ngc" Aug 13 00:34:34.349453 kubelet[2927]: I0813 00:34:34.349171 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9cd0b21-5cc1-4092-9e3a-778a5727c93b-goldmane-ca-bundle\") pod \"goldmane-58fd7646b9-c4ngc\" (UID: \"f9cd0b21-5cc1-4092-9e3a-778a5727c93b\") " pod="calico-system/goldmane-58fd7646b9-c4ngc" Aug 13 00:34:34.349453 kubelet[2927]: I0813 00:34:34.349194 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf9j2\" (UniqueName: \"kubernetes.io/projected/fe7d68d1-75a2-45b8-b95c-24db42f5e458-kube-api-access-tf9j2\") pod \"coredns-7c65d6cfc9-66m68\" (UID: 
\"fe7d68d1-75a2-45b8-b95c-24db42f5e458\") " pod="kube-system/coredns-7c65d6cfc9-66m68" Aug 13 00:34:34.349846 kubelet[2927]: I0813 00:34:34.349211 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnsfr\" (UniqueName: \"kubernetes.io/projected/597f337a-5f28-41ac-87f7-1c6ef1c833a4-kube-api-access-cnsfr\") pod \"whisker-586ff4c9c9-6wt9x\" (UID: \"597f337a-5f28-41ac-87f7-1c6ef1c833a4\") " pod="calico-system/whisker-586ff4c9c9-6wt9x" Aug 13 00:34:34.349846 kubelet[2927]: I0813 00:34:34.349231 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fe7d68d1-75a2-45b8-b95c-24db42f5e458-config-volume\") pod \"coredns-7c65d6cfc9-66m68\" (UID: \"fe7d68d1-75a2-45b8-b95c-24db42f5e458\") " pod="kube-system/coredns-7c65d6cfc9-66m68" Aug 13 00:34:34.349846 kubelet[2927]: I0813 00:34:34.349248 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/895d523e-e692-449e-960d-7285bbad6a88-calico-apiserver-certs\") pod \"calico-apiserver-b9db99d59-mnjp4\" (UID: \"895d523e-e692-449e-960d-7285bbad6a88\") " pod="calico-apiserver/calico-apiserver-b9db99d59-mnjp4" Aug 13 00:34:34.349846 kubelet[2927]: I0813 00:34:34.349269 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxmdc\" (UniqueName: \"kubernetes.io/projected/03280d1f-7157-4e73-820c-3c79157bccae-kube-api-access-lxmdc\") pod \"calico-apiserver-b9db99d59-fb8ql\" (UID: \"03280d1f-7157-4e73-820c-3c79157bccae\") " pod="calico-apiserver/calico-apiserver-b9db99d59-fb8ql" Aug 13 00:34:34.349846 kubelet[2927]: I0813 00:34:34.349286 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: 
\"kubernetes.io/secret/f9cd0b21-5cc1-4092-9e3a-778a5727c93b-goldmane-key-pair\") pod \"goldmane-58fd7646b9-c4ngc\" (UID: \"f9cd0b21-5cc1-4092-9e3a-778a5727c93b\") " pod="calico-system/goldmane-58fd7646b9-c4ngc" Aug 13 00:34:34.350015 kubelet[2927]: I0813 00:34:34.349304 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/597f337a-5f28-41ac-87f7-1c6ef1c833a4-whisker-backend-key-pair\") pod \"whisker-586ff4c9c9-6wt9x\" (UID: \"597f337a-5f28-41ac-87f7-1c6ef1c833a4\") " pod="calico-system/whisker-586ff4c9c9-6wt9x" Aug 13 00:34:34.350015 kubelet[2927]: I0813 00:34:34.349321 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1041bee9-c09a-4ab4-b0d0-aa6d4e1e0222-config-volume\") pod \"coredns-7c65d6cfc9-hz27z\" (UID: \"1041bee9-c09a-4ab4-b0d0-aa6d4e1e0222\") " pod="kube-system/coredns-7c65d6cfc9-hz27z" Aug 13 00:34:34.350015 kubelet[2927]: I0813 00:34:34.349342 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwbbn\" (UniqueName: \"kubernetes.io/projected/f9cd0b21-5cc1-4092-9e3a-778a5727c93b-kube-api-access-wwbbn\") pod \"goldmane-58fd7646b9-c4ngc\" (UID: \"f9cd0b21-5cc1-4092-9e3a-778a5727c93b\") " pod="calico-system/goldmane-58fd7646b9-c4ngc" Aug 13 00:34:34.350015 kubelet[2927]: I0813 00:34:34.349358 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/597f337a-5f28-41ac-87f7-1c6ef1c833a4-whisker-ca-bundle\") pod \"whisker-586ff4c9c9-6wt9x\" (UID: \"597f337a-5f28-41ac-87f7-1c6ef1c833a4\") " pod="calico-system/whisker-586ff4c9c9-6wt9x" Aug 13 00:34:34.350015 kubelet[2927]: I0813 00:34:34.349379 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-67fc7\" (UniqueName: \"kubernetes.io/projected/895d523e-e692-449e-960d-7285bbad6a88-kube-api-access-67fc7\") pod \"calico-apiserver-b9db99d59-mnjp4\" (UID: \"895d523e-e692-449e-960d-7285bbad6a88\") " pod="calico-apiserver/calico-apiserver-b9db99d59-mnjp4" Aug 13 00:34:34.498686 containerd[1618]: time="2025-08-13T00:34:34.498614043Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7cd45fb679-bvd9c,Uid:d50d2d1d-4a7e-454b-acc1-521b442ea412,Namespace:calico-system,Attempt:0,}" Aug 13 00:34:34.502714 containerd[1618]: time="2025-08-13T00:34:34.502420256Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b9db99d59-mnjp4,Uid:895d523e-e692-449e-960d-7285bbad6a88,Namespace:calico-apiserver,Attempt:0,}" Aug 13 00:34:34.516264 containerd[1618]: time="2025-08-13T00:34:34.516190469Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-586ff4c9c9-6wt9x,Uid:597f337a-5f28-41ac-87f7-1c6ef1c833a4,Namespace:calico-system,Attempt:0,}" Aug 13 00:34:34.517217 containerd[1618]: time="2025-08-13T00:34:34.517053474Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-66m68,Uid:fe7d68d1-75a2-45b8-b95c-24db42f5e458,Namespace:kube-system,Attempt:0,}" Aug 13 00:34:34.536235 containerd[1618]: time="2025-08-13T00:34:34.535947265Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-hz27z,Uid:1041bee9-c09a-4ab4-b0d0-aa6d4e1e0222,Namespace:kube-system,Attempt:0,}" Aug 13 00:34:34.783484 containerd[1618]: time="2025-08-13T00:34:34.783451287Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b9db99d59-fb8ql,Uid:03280d1f-7157-4e73-820c-3c79157bccae,Namespace:calico-apiserver,Attempt:0,}" Aug 13 00:34:34.794574 containerd[1618]: time="2025-08-13T00:34:34.794307082Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:goldmane-58fd7646b9-c4ngc,Uid:f9cd0b21-5cc1-4092-9e3a-778a5727c93b,Namespace:calico-system,Attempt:0,}" Aug 13 00:34:34.825243 containerd[1618]: time="2025-08-13T00:34:34.825203503Z" level=error msg="Failed to destroy network for sandbox \"99548e104cb4e34a281deb22342118dabd5ce86f13613011aed451f1be466a52\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:34.841501 containerd[1618]: time="2025-08-13T00:34:34.827007434Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-586ff4c9c9-6wt9x,Uid:597f337a-5f28-41ac-87f7-1c6ef1c833a4,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"99548e104cb4e34a281deb22342118dabd5ce86f13613011aed451f1be466a52\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:34.841775 containerd[1618]: time="2025-08-13T00:34:34.829925888Z" level=error msg="Failed to destroy network for sandbox \"9a2824e7ce08497727c2f873c7c41c46e24835215caf6cefe5a5fd397289e179\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:34.842007 containerd[1618]: time="2025-08-13T00:34:34.839506981Z" level=error msg="Failed to destroy network for sandbox \"3afb2bcca8e524ccff7a28085bd1401d452b8e0a09225347965996cda6647cc8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:34.843208 containerd[1618]: time="2025-08-13T00:34:34.839551719Z" level=error msg="Failed to destroy 
network for sandbox \"e4a8e3506e66a136539fd3cd5d4519dd96177ce30da772d0205662194d05d52b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:34.843331 containerd[1618]: time="2025-08-13T00:34:34.840045122Z" level=error msg="Failed to destroy network for sandbox \"f74e7d9f7913b008d21789d87355549082766dcf2052db69cddacf44f596c088\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:34.844833 containerd[1618]: time="2025-08-13T00:34:34.844713461Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-hz27z,Uid:1041bee9-c09a-4ab4-b0d0-aa6d4e1e0222,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9a2824e7ce08497727c2f873c7c41c46e24835215caf6cefe5a5fd397289e179\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:34.845017 containerd[1618]: time="2025-08-13T00:34:34.845003233Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b9db99d59-mnjp4,Uid:895d523e-e692-449e-960d-7285bbad6a88,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3afb2bcca8e524ccff7a28085bd1401d452b8e0a09225347965996cda6647cc8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:34.845284 containerd[1618]: time="2025-08-13T00:34:34.845270703Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-7cd45fb679-bvd9c,Uid:d50d2d1d-4a7e-454b-acc1-521b442ea412,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e4a8e3506e66a136539fd3cd5d4519dd96177ce30da772d0205662194d05d52b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:34.845599 containerd[1618]: time="2025-08-13T00:34:34.845580133Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-66m68,Uid:fe7d68d1-75a2-45b8-b95c-24db42f5e458,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f74e7d9f7913b008d21789d87355549082766dcf2052db69cddacf44f596c088\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:34.849901 kubelet[2927]: E0813 00:34:34.849738 2927 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9a2824e7ce08497727c2f873c7c41c46e24835215caf6cefe5a5fd397289e179\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:34.849901 kubelet[2927]: E0813 00:34:34.849800 2927 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9a2824e7ce08497727c2f873c7c41c46e24835215caf6cefe5a5fd397289e179\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-hz27z" Aug 13 
00:34:34.849901 kubelet[2927]: E0813 00:34:34.849815 2927 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9a2824e7ce08497727c2f873c7c41c46e24835215caf6cefe5a5fd397289e179\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-hz27z" Aug 13 00:34:34.850926 kubelet[2927]: E0813 00:34:34.850171 2927 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f74e7d9f7913b008d21789d87355549082766dcf2052db69cddacf44f596c088\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:34.850926 kubelet[2927]: E0813 00:34:34.850200 2927 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f74e7d9f7913b008d21789d87355549082766dcf2052db69cddacf44f596c088\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-66m68" Aug 13 00:34:34.852628 kubelet[2927]: E0813 00:34:34.850213 2927 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f74e7d9f7913b008d21789d87355549082766dcf2052db69cddacf44f596c088\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-66m68" Aug 13 00:34:34.853235 kubelet[2927]: E0813 00:34:34.853215 2927 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-66m68_kube-system(fe7d68d1-75a2-45b8-b95c-24db42f5e458)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-66m68_kube-system(fe7d68d1-75a2-45b8-b95c-24db42f5e458)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f74e7d9f7913b008d21789d87355549082766dcf2052db69cddacf44f596c088\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-66m68" podUID="fe7d68d1-75a2-45b8-b95c-24db42f5e458" Aug 13 00:34:34.853417 kubelet[2927]: E0813 00:34:34.853404 2927 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3afb2bcca8e524ccff7a28085bd1401d452b8e0a09225347965996cda6647cc8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:34.853475 kubelet[2927]: E0813 00:34:34.853465 2927 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3afb2bcca8e524ccff7a28085bd1401d452b8e0a09225347965996cda6647cc8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b9db99d59-mnjp4" Aug 13 00:34:34.853734 kubelet[2927]: E0813 00:34:34.853721 2927 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3afb2bcca8e524ccff7a28085bd1401d452b8e0a09225347965996cda6647cc8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b9db99d59-mnjp4" Aug 13 00:34:34.853815 kubelet[2927]: E0813 00:34:34.853804 2927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-b9db99d59-mnjp4_calico-apiserver(895d523e-e692-449e-960d-7285bbad6a88)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-b9db99d59-mnjp4_calico-apiserver(895d523e-e692-449e-960d-7285bbad6a88)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3afb2bcca8e524ccff7a28085bd1401d452b8e0a09225347965996cda6647cc8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-b9db99d59-mnjp4" podUID="895d523e-e692-449e-960d-7285bbad6a88" Aug 13 00:34:34.853881 kubelet[2927]: E0813 00:34:34.853871 2927 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e4a8e3506e66a136539fd3cd5d4519dd96177ce30da772d0205662194d05d52b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:34.853923 kubelet[2927]: E0813 00:34:34.853915 2927 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e4a8e3506e66a136539fd3cd5d4519dd96177ce30da772d0205662194d05d52b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7cd45fb679-bvd9c" Aug 13 00:34:34.853964 kubelet[2927]: E0813 
00:34:34.853956 2927 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e4a8e3506e66a136539fd3cd5d4519dd96177ce30da772d0205662194d05d52b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7cd45fb679-bvd9c" Aug 13 00:34:34.855127 kubelet[2927]: E0813 00:34:34.853999 2927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7cd45fb679-bvd9c_calico-system(d50d2d1d-4a7e-454b-acc1-521b442ea412)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7cd45fb679-bvd9c_calico-system(d50d2d1d-4a7e-454b-acc1-521b442ea412)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e4a8e3506e66a136539fd3cd5d4519dd96177ce30da772d0205662194d05d52b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7cd45fb679-bvd9c" podUID="d50d2d1d-4a7e-454b-acc1-521b442ea412" Aug 13 00:34:34.855127 kubelet[2927]: E0813 00:34:34.853529 2927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-hz27z_kube-system(1041bee9-c09a-4ab4-b0d0-aa6d4e1e0222)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-hz27z_kube-system(1041bee9-c09a-4ab4-b0d0-aa6d4e1e0222)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9a2824e7ce08497727c2f873c7c41c46e24835215caf6cefe5a5fd397289e179\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-hz27z" podUID="1041bee9-c09a-4ab4-b0d0-aa6d4e1e0222" Aug 13 00:34:34.855127 kubelet[2927]: E0813 00:34:34.853609 2927 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"99548e104cb4e34a281deb22342118dabd5ce86f13613011aed451f1be466a52\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:34.855254 kubelet[2927]: E0813 00:34:34.855076 2927 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"99548e104cb4e34a281deb22342118dabd5ce86f13613011aed451f1be466a52\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-586ff4c9c9-6wt9x" Aug 13 00:34:34.855254 kubelet[2927]: E0813 00:34:34.855087 2927 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"99548e104cb4e34a281deb22342118dabd5ce86f13613011aed451f1be466a52\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-586ff4c9c9-6wt9x" Aug 13 00:34:34.855254 kubelet[2927]: E0813 00:34:34.855104 2927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-586ff4c9c9-6wt9x_calico-system(597f337a-5f28-41ac-87f7-1c6ef1c833a4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-586ff4c9c9-6wt9x_calico-system(597f337a-5f28-41ac-87f7-1c6ef1c833a4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"99548e104cb4e34a281deb22342118dabd5ce86f13613011aed451f1be466a52\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-586ff4c9c9-6wt9x" podUID="597f337a-5f28-41ac-87f7-1c6ef1c833a4" Aug 13 00:34:34.868808 containerd[1618]: time="2025-08-13T00:34:34.868385988Z" level=error msg="Failed to destroy network for sandbox \"9dcffd8cfefadd0034cc3467d1fc0ac49d67d97a67d1e5a7be85fa51f66f047b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:34.873202 containerd[1618]: time="2025-08-13T00:34:34.873167285Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b9db99d59-fb8ql,Uid:03280d1f-7157-4e73-820c-3c79157bccae,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9dcffd8cfefadd0034cc3467d1fc0ac49d67d97a67d1e5a7be85fa51f66f047b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:34.873500 kubelet[2927]: E0813 00:34:34.873375 2927 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9dcffd8cfefadd0034cc3467d1fc0ac49d67d97a67d1e5a7be85fa51f66f047b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:34.873500 kubelet[2927]: E0813 00:34:34.873416 2927 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"9dcffd8cfefadd0034cc3467d1fc0ac49d67d97a67d1e5a7be85fa51f66f047b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b9db99d59-fb8ql" Aug 13 00:34:34.873500 kubelet[2927]: E0813 00:34:34.873434 2927 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9dcffd8cfefadd0034cc3467d1fc0ac49d67d97a67d1e5a7be85fa51f66f047b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b9db99d59-fb8ql" Aug 13 00:34:34.873595 kubelet[2927]: E0813 00:34:34.873458 2927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-b9db99d59-fb8ql_calico-apiserver(03280d1f-7157-4e73-820c-3c79157bccae)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-b9db99d59-fb8ql_calico-apiserver(03280d1f-7157-4e73-820c-3c79157bccae)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9dcffd8cfefadd0034cc3467d1fc0ac49d67d97a67d1e5a7be85fa51f66f047b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-b9db99d59-fb8ql" podUID="03280d1f-7157-4e73-820c-3c79157bccae" Aug 13 00:34:34.892046 containerd[1618]: time="2025-08-13T00:34:34.891972603Z" level=error msg="Failed to destroy network for sandbox \"d659276185ad864fd8efea88e3de443d8ccbba3f22c88a4eadaa7c7014b60a35\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" Aug 13 00:34:34.894396 containerd[1618]: time="2025-08-13T00:34:34.894369009Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-c4ngc,Uid:f9cd0b21-5cc1-4092-9e3a-778a5727c93b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d659276185ad864fd8efea88e3de443d8ccbba3f22c88a4eadaa7c7014b60a35\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:34.894668 kubelet[2927]: E0813 00:34:34.894633 2927 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d659276185ad864fd8efea88e3de443d8ccbba3f22c88a4eadaa7c7014b60a35\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:34.894954 kubelet[2927]: E0813 00:34:34.894684 2927 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d659276185ad864fd8efea88e3de443d8ccbba3f22c88a4eadaa7c7014b60a35\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-58fd7646b9-c4ngc" Aug 13 00:34:34.894954 kubelet[2927]: E0813 00:34:34.894731 2927 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d659276185ad864fd8efea88e3de443d8ccbba3f22c88a4eadaa7c7014b60a35\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="calico-system/goldmane-58fd7646b9-c4ngc" Aug 13 00:34:34.894954 kubelet[2927]: E0813 00:34:34.894845 2927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-58fd7646b9-c4ngc_calico-system(f9cd0b21-5cc1-4092-9e3a-778a5727c93b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-58fd7646b9-c4ngc_calico-system(f9cd0b21-5cc1-4092-9e3a-778a5727c93b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d659276185ad864fd8efea88e3de443d8ccbba3f22c88a4eadaa7c7014b60a35\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-58fd7646b9-c4ngc" podUID="f9cd0b21-5cc1-4092-9e3a-778a5727c93b" Aug 13 00:34:34.981111 containerd[1618]: time="2025-08-13T00:34:34.981075561Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\"" Aug 13 00:34:35.793492 systemd[1]: Created slice kubepods-besteffort-pod13ac1461_2756_4b70_ab78_2c6b6a45f530.slice - libcontainer container kubepods-besteffort-pod13ac1461_2756_4b70_ab78_2c6b6a45f530.slice. 
Aug 13 00:34:35.795427 containerd[1618]: time="2025-08-13T00:34:35.795407070Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-24x4b,Uid:13ac1461-2756-4b70-ab78-2c6b6a45f530,Namespace:calico-system,Attempt:0,}" Aug 13 00:34:35.833964 containerd[1618]: time="2025-08-13T00:34:35.833922729Z" level=error msg="Failed to destroy network for sandbox \"cdcf905d8b3df90fa79990961f7feae2251d7530104676b8552be8d9c3dd0164\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:35.834654 containerd[1618]: time="2025-08-13T00:34:35.834628607Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-24x4b,Uid:13ac1461-2756-4b70-ab78-2c6b6a45f530,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cdcf905d8b3df90fa79990961f7feae2251d7530104676b8552be8d9c3dd0164\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:35.835737 kubelet[2927]: E0813 00:34:35.835691 2927 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cdcf905d8b3df90fa79990961f7feae2251d7530104676b8552be8d9c3dd0164\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:35.835797 kubelet[2927]: E0813 00:34:35.835750 2927 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cdcf905d8b3df90fa79990961f7feae2251d7530104676b8552be8d9c3dd0164\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-24x4b" Aug 13 00:34:35.835797 kubelet[2927]: E0813 00:34:35.835766 2927 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cdcf905d8b3df90fa79990961f7feae2251d7530104676b8552be8d9c3dd0164\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-24x4b" Aug 13 00:34:35.835880 kubelet[2927]: E0813 00:34:35.835804 2927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-24x4b_calico-system(13ac1461-2756-4b70-ab78-2c6b6a45f530)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-24x4b_calico-system(13ac1461-2756-4b70-ab78-2c6b6a45f530)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cdcf905d8b3df90fa79990961f7feae2251d7530104676b8552be8d9c3dd0164\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-24x4b" podUID="13ac1461-2756-4b70-ab78-2c6b6a45f530" Aug 13 00:34:35.836368 systemd[1]: run-netns-cni\x2ddc4f59ab\x2d08bf\x2d44a9\x2d9a9c\x2d0f6fe667224a.mount: Deactivated successfully. Aug 13 00:34:40.078261 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3753159482.mount: Deactivated successfully. 
Aug 13 00:34:40.212376 containerd[1618]: time="2025-08-13T00:34:40.203931379Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:40.225149 containerd[1618]: time="2025-08-13T00:34:40.210355560Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.2: active requests=0, bytes read=158500163" Aug 13 00:34:40.238561 containerd[1618]: time="2025-08-13T00:34:40.238513645Z" level=info msg="ImageCreate event name:\"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:40.245117 containerd[1618]: time="2025-08-13T00:34:40.245070320Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:40.245871 containerd[1618]: time="2025-08-13T00:34:40.245846528Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.2\" with image id \"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\", size \"158500025\" in 5.264055298s" Aug 13 00:34:40.245871 containerd[1618]: time="2025-08-13T00:34:40.245869343Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\" returns image reference \"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\"" Aug 13 00:34:40.295769 containerd[1618]: time="2025-08-13T00:34:40.295665216Z" level=info msg="CreateContainer within sandbox \"7e28c6c6609deabca70683b421e273ce0e65163031abe2e88ed954828fc7ba3b\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Aug 13 00:34:40.568123 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount648221278.mount: 
Deactivated successfully. Aug 13 00:34:40.568520 containerd[1618]: time="2025-08-13T00:34:40.568393172Z" level=info msg="Container 3be9515d6f0992f8c89c0aa81eca3b0c2818b85aa4a11fd5a338f83a8f2000dd: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:34:40.617107 containerd[1618]: time="2025-08-13T00:34:40.617084538Z" level=info msg="CreateContainer within sandbox \"7e28c6c6609deabca70683b421e273ce0e65163031abe2e88ed954828fc7ba3b\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"3be9515d6f0992f8c89c0aa81eca3b0c2818b85aa4a11fd5a338f83a8f2000dd\"" Aug 13 00:34:40.617885 containerd[1618]: time="2025-08-13T00:34:40.617860196Z" level=info msg="StartContainer for \"3be9515d6f0992f8c89c0aa81eca3b0c2818b85aa4a11fd5a338f83a8f2000dd\"" Aug 13 00:34:40.631061 containerd[1618]: time="2025-08-13T00:34:40.631025172Z" level=info msg="connecting to shim 3be9515d6f0992f8c89c0aa81eca3b0c2818b85aa4a11fd5a338f83a8f2000dd" address="unix:///run/containerd/s/284d7c9448f8be7511d753ed008e900f34e342f3f010c834a8fa16cedf5622cd" protocol=ttrpc version=3 Aug 13 00:34:40.934690 systemd[1]: Started cri-containerd-3be9515d6f0992f8c89c0aa81eca3b0c2818b85aa4a11fd5a338f83a8f2000dd.scope - libcontainer container 3be9515d6f0992f8c89c0aa81eca3b0c2818b85aa4a11fd5a338f83a8f2000dd. Aug 13 00:34:41.052152 containerd[1618]: time="2025-08-13T00:34:41.052084522Z" level=info msg="StartContainer for \"3be9515d6f0992f8c89c0aa81eca3b0c2818b85aa4a11fd5a338f83a8f2000dd\" returns successfully" Aug 13 00:34:41.453080 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Aug 13 00:34:41.470659 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Aug 13 00:34:42.212572 kubelet[2927]: I0813 00:34:42.212514 2927 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 13 00:34:42.718627 kubelet[2927]: I0813 00:34:42.708431 2927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-c69tb" podStartSLOduration=3.49806797 podStartE2EDuration="19.708416505s" podCreationTimestamp="2025-08-13 00:34:23 +0000 UTC" firstStartedPulling="2025-08-13 00:34:24.050170592 +0000 UTC m=+16.467446623" lastFinishedPulling="2025-08-13 00:34:40.260519129 +0000 UTC m=+32.677795158" observedRunningTime="2025-08-13 00:34:41.153029989 +0000 UTC m=+33.570306027" watchObservedRunningTime="2025-08-13 00:34:42.708416505 +0000 UTC m=+35.125692537" Aug 13 00:34:42.735359 containerd[1618]: time="2025-08-13T00:34:42.735288070Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3be9515d6f0992f8c89c0aa81eca3b0c2818b85aa4a11fd5a338f83a8f2000dd\" id:\"e40ecd49151d01319a304d58f1ab0ff0a08b76f2e002a11700ba867fea7f0d11\" pid:3976 exit_status:1 exited_at:{seconds:1755045282 nanos:725708499}" Aug 13 00:34:42.896109 kubelet[2927]: I0813 00:34:42.896085 2927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/597f337a-5f28-41ac-87f7-1c6ef1c833a4-whisker-backend-key-pair\") pod \"597f337a-5f28-41ac-87f7-1c6ef1c833a4\" (UID: \"597f337a-5f28-41ac-87f7-1c6ef1c833a4\") " Aug 13 00:34:42.896310 kubelet[2927]: I0813 00:34:42.896116 2927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnsfr\" (UniqueName: \"kubernetes.io/projected/597f337a-5f28-41ac-87f7-1c6ef1c833a4-kube-api-access-cnsfr\") pod \"597f337a-5f28-41ac-87f7-1c6ef1c833a4\" (UID: \"597f337a-5f28-41ac-87f7-1c6ef1c833a4\") " Aug 13 00:34:42.896310 kubelet[2927]: I0813 00:34:42.896131 2927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/597f337a-5f28-41ac-87f7-1c6ef1c833a4-whisker-ca-bundle\") pod \"597f337a-5f28-41ac-87f7-1c6ef1c833a4\" (UID: \"597f337a-5f28-41ac-87f7-1c6ef1c833a4\") " Aug 13 00:34:42.917012 containerd[1618]: time="2025-08-13T00:34:42.916944977Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3be9515d6f0992f8c89c0aa81eca3b0c2818b85aa4a11fd5a338f83a8f2000dd\" id:\"4840a5c14e9f36edaf97074ab62d4c9761177c8a9f46c0e5e14d1895408c377e\" pid:4001 exit_status:1 exited_at:{seconds:1755045282 nanos:916303739}" Aug 13 00:34:42.940170 kubelet[2927]: I0813 00:34:42.940137 2927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/597f337a-5f28-41ac-87f7-1c6ef1c833a4-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "597f337a-5f28-41ac-87f7-1c6ef1c833a4" (UID: "597f337a-5f28-41ac-87f7-1c6ef1c833a4"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Aug 13 00:34:42.942094 systemd[1]: var-lib-kubelet-pods-597f337a\x2d5f28\x2d41ac\x2d87f7\x2d1c6ef1c833a4-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Aug 13 00:34:42.944471 systemd[1]: var-lib-kubelet-pods-597f337a\x2d5f28\x2d41ac\x2d87f7\x2d1c6ef1c833a4-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dcnsfr.mount: Deactivated successfully. Aug 13 00:34:42.959488 kubelet[2927]: I0813 00:34:42.943593 2927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/597f337a-5f28-41ac-87f7-1c6ef1c833a4-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "597f337a-5f28-41ac-87f7-1c6ef1c833a4" (UID: "597f337a-5f28-41ac-87f7-1c6ef1c833a4"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Aug 13 00:34:42.959488 kubelet[2927]: I0813 00:34:42.944666 2927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/597f337a-5f28-41ac-87f7-1c6ef1c833a4-kube-api-access-cnsfr" (OuterVolumeSpecName: "kube-api-access-cnsfr") pod "597f337a-5f28-41ac-87f7-1c6ef1c833a4" (UID: "597f337a-5f28-41ac-87f7-1c6ef1c833a4"). InnerVolumeSpecName "kube-api-access-cnsfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Aug 13 00:34:42.996741 kubelet[2927]: I0813 00:34:42.996671 2927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnsfr\" (UniqueName: \"kubernetes.io/projected/597f337a-5f28-41ac-87f7-1c6ef1c833a4-kube-api-access-cnsfr\") on node \"localhost\" DevicePath \"\"" Aug 13 00:34:43.001813 kubelet[2927]: I0813 00:34:42.996798 2927 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/597f337a-5f28-41ac-87f7-1c6ef1c833a4-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Aug 13 00:34:43.001813 kubelet[2927]: I0813 00:34:42.996806 2927 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/597f337a-5f28-41ac-87f7-1c6ef1c833a4-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Aug 13 00:34:43.143601 systemd[1]: Removed slice kubepods-besteffort-pod597f337a_5f28_41ac_87f7_1c6ef1c833a4.slice - libcontainer container kubepods-besteffort-pod597f337a_5f28_41ac_87f7_1c6ef1c833a4.slice. Aug 13 00:34:43.270891 systemd[1]: Created slice kubepods-besteffort-podfce250c7_bc05_4407_b188_ad4970930c22.slice - libcontainer container kubepods-besteffort-podfce250c7_bc05_4407_b188_ad4970930c22.slice. 
Aug 13 00:34:43.399688 kubelet[2927]: I0813 00:34:43.399633 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtgnn\" (UniqueName: \"kubernetes.io/projected/fce250c7-bc05-4407-b188-ad4970930c22-kube-api-access-wtgnn\") pod \"whisker-6f74d66c57-k5627\" (UID: \"fce250c7-bc05-4407-b188-ad4970930c22\") " pod="calico-system/whisker-6f74d66c57-k5627" Aug 13 00:34:43.399688 kubelet[2927]: I0813 00:34:43.399667 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/fce250c7-bc05-4407-b188-ad4970930c22-whisker-backend-key-pair\") pod \"whisker-6f74d66c57-k5627\" (UID: \"fce250c7-bc05-4407-b188-ad4970930c22\") " pod="calico-system/whisker-6f74d66c57-k5627" Aug 13 00:34:43.399976 kubelet[2927]: I0813 00:34:43.399708 2927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fce250c7-bc05-4407-b188-ad4970930c22-whisker-ca-bundle\") pod \"whisker-6f74d66c57-k5627\" (UID: \"fce250c7-bc05-4407-b188-ad4970930c22\") " pod="calico-system/whisker-6f74d66c57-k5627" Aug 13 00:34:43.574385 containerd[1618]: time="2025-08-13T00:34:43.574165572Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6f74d66c57-k5627,Uid:fce250c7-bc05-4407-b188-ad4970930c22,Namespace:calico-system,Attempt:0,}" Aug 13 00:34:43.792560 kubelet[2927]: I0813 00:34:43.792479 2927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="597f337a-5f28-41ac-87f7-1c6ef1c833a4" path="/var/lib/kubelet/pods/597f337a-5f28-41ac-87f7-1c6ef1c833a4/volumes" Aug 13 00:34:43.975943 systemd-networkd[1540]: calib95e26ebddc: Link UP Aug 13 00:34:43.976043 systemd-networkd[1540]: calib95e26ebddc: Gained carrier Aug 13 00:34:43.987337 containerd[1618]: 2025-08-13 00:34:43.608 [INFO][4034] cni-plugin/utils.go 100: File 
/var/lib/calico/mtu does not exist Aug 13 00:34:43.987337 containerd[1618]: 2025-08-13 00:34:43.648 [INFO][4034] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--6f74d66c57--k5627-eth0 whisker-6f74d66c57- calico-system fce250c7-bc05-4407-b188-ad4970930c22 861 0 2025-08-13 00:34:43 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:6f74d66c57 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-6f74d66c57-k5627 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calib95e26ebddc [] [] }} ContainerID="8d049244ed0681df72745a3416a48afb3ccac85bcc642be883390fdc53fec4dd" Namespace="calico-system" Pod="whisker-6f74d66c57-k5627" WorkloadEndpoint="localhost-k8s-whisker--6f74d66c57--k5627-" Aug 13 00:34:43.987337 containerd[1618]: 2025-08-13 00:34:43.649 [INFO][4034] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8d049244ed0681df72745a3416a48afb3ccac85bcc642be883390fdc53fec4dd" Namespace="calico-system" Pod="whisker-6f74d66c57-k5627" WorkloadEndpoint="localhost-k8s-whisker--6f74d66c57--k5627-eth0" Aug 13 00:34:43.987337 containerd[1618]: 2025-08-13 00:34:43.921 [INFO][4042] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8d049244ed0681df72745a3416a48afb3ccac85bcc642be883390fdc53fec4dd" HandleID="k8s-pod-network.8d049244ed0681df72745a3416a48afb3ccac85bcc642be883390fdc53fec4dd" Workload="localhost-k8s-whisker--6f74d66c57--k5627-eth0" Aug 13 00:34:43.988029 containerd[1618]: 2025-08-13 00:34:43.924 [INFO][4042] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8d049244ed0681df72745a3416a48afb3ccac85bcc642be883390fdc53fec4dd" HandleID="k8s-pod-network.8d049244ed0681df72745a3416a48afb3ccac85bcc642be883390fdc53fec4dd" Workload="localhost-k8s-whisker--6f74d66c57--k5627-eth0" 
assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000103870), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-6f74d66c57-k5627", "timestamp":"2025-08-13 00:34:43.921691243 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:34:43.988029 containerd[1618]: 2025-08-13 00:34:43.924 [INFO][4042] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:34:43.988029 containerd[1618]: 2025-08-13 00:34:43.924 [INFO][4042] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:34:43.988029 containerd[1618]: 2025-08-13 00:34:43.925 [INFO][4042] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Aug 13 00:34:43.988029 containerd[1618]: 2025-08-13 00:34:43.940 [INFO][4042] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8d049244ed0681df72745a3416a48afb3ccac85bcc642be883390fdc53fec4dd" host="localhost" Aug 13 00:34:43.988029 containerd[1618]: 2025-08-13 00:34:43.948 [INFO][4042] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Aug 13 00:34:43.988029 containerd[1618]: 2025-08-13 00:34:43.951 [INFO][4042] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Aug 13 00:34:43.988029 containerd[1618]: 2025-08-13 00:34:43.952 [INFO][4042] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Aug 13 00:34:43.988029 containerd[1618]: 2025-08-13 00:34:43.954 [INFO][4042] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Aug 13 00:34:43.988029 containerd[1618]: 2025-08-13 00:34:43.954 [INFO][4042] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 
handle="k8s-pod-network.8d049244ed0681df72745a3416a48afb3ccac85bcc642be883390fdc53fec4dd" host="localhost" Aug 13 00:34:43.988649 containerd[1618]: 2025-08-13 00:34:43.954 [INFO][4042] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8d049244ed0681df72745a3416a48afb3ccac85bcc642be883390fdc53fec4dd Aug 13 00:34:43.988649 containerd[1618]: 2025-08-13 00:34:43.957 [INFO][4042] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.8d049244ed0681df72745a3416a48afb3ccac85bcc642be883390fdc53fec4dd" host="localhost" Aug 13 00:34:43.988649 containerd[1618]: 2025-08-13 00:34:43.960 [INFO][4042] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.8d049244ed0681df72745a3416a48afb3ccac85bcc642be883390fdc53fec4dd" host="localhost" Aug 13 00:34:43.988649 containerd[1618]: 2025-08-13 00:34:43.960 [INFO][4042] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.8d049244ed0681df72745a3416a48afb3ccac85bcc642be883390fdc53fec4dd" host="localhost" Aug 13 00:34:43.988649 containerd[1618]: 2025-08-13 00:34:43.960 [INFO][4042] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 13 00:34:43.988649 containerd[1618]: 2025-08-13 00:34:43.960 [INFO][4042] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="8d049244ed0681df72745a3416a48afb3ccac85bcc642be883390fdc53fec4dd" HandleID="k8s-pod-network.8d049244ed0681df72745a3416a48afb3ccac85bcc642be883390fdc53fec4dd" Workload="localhost-k8s-whisker--6f74d66c57--k5627-eth0" Aug 13 00:34:43.989072 containerd[1618]: 2025-08-13 00:34:43.961 [INFO][4034] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8d049244ed0681df72745a3416a48afb3ccac85bcc642be883390fdc53fec4dd" Namespace="calico-system" Pod="whisker-6f74d66c57-k5627" WorkloadEndpoint="localhost-k8s-whisker--6f74d66c57--k5627-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--6f74d66c57--k5627-eth0", GenerateName:"whisker-6f74d66c57-", Namespace:"calico-system", SelfLink:"", UID:"fce250c7-bc05-4407-b188-ad4970930c22", ResourceVersion:"861", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 34, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6f74d66c57", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-6f74d66c57-k5627", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calib95e26ebddc", MAC:"", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:34:43.989072 containerd[1618]: 2025-08-13 00:34:43.962 [INFO][4034] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="8d049244ed0681df72745a3416a48afb3ccac85bcc642be883390fdc53fec4dd" Namespace="calico-system" Pod="whisker-6f74d66c57-k5627" WorkloadEndpoint="localhost-k8s-whisker--6f74d66c57--k5627-eth0" Aug 13 00:34:43.989148 containerd[1618]: 2025-08-13 00:34:43.962 [INFO][4034] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib95e26ebddc ContainerID="8d049244ed0681df72745a3416a48afb3ccac85bcc642be883390fdc53fec4dd" Namespace="calico-system" Pod="whisker-6f74d66c57-k5627" WorkloadEndpoint="localhost-k8s-whisker--6f74d66c57--k5627-eth0" Aug 13 00:34:43.989148 containerd[1618]: 2025-08-13 00:34:43.976 [INFO][4034] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8d049244ed0681df72745a3416a48afb3ccac85bcc642be883390fdc53fec4dd" Namespace="calico-system" Pod="whisker-6f74d66c57-k5627" WorkloadEndpoint="localhost-k8s-whisker--6f74d66c57--k5627-eth0" Aug 13 00:34:43.989193 containerd[1618]: 2025-08-13 00:34:43.977 [INFO][4034] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8d049244ed0681df72745a3416a48afb3ccac85bcc642be883390fdc53fec4dd" Namespace="calico-system" Pod="whisker-6f74d66c57-k5627" WorkloadEndpoint="localhost-k8s-whisker--6f74d66c57--k5627-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--6f74d66c57--k5627-eth0", GenerateName:"whisker-6f74d66c57-", Namespace:"calico-system", SelfLink:"", UID:"fce250c7-bc05-4407-b188-ad4970930c22", ResourceVersion:"861", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 34, 43, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6f74d66c57", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"8d049244ed0681df72745a3416a48afb3ccac85bcc642be883390fdc53fec4dd", Pod:"whisker-6f74d66c57-k5627", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calib95e26ebddc", MAC:"86:81:62:fe:a6:32", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:34:43.989249 containerd[1618]: 2025-08-13 00:34:43.984 [INFO][4034] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8d049244ed0681df72745a3416a48afb3ccac85bcc642be883390fdc53fec4dd" Namespace="calico-system" Pod="whisker-6f74d66c57-k5627" WorkloadEndpoint="localhost-k8s-whisker--6f74d66c57--k5627-eth0" Aug 13 00:34:44.097704 containerd[1618]: time="2025-08-13T00:34:44.097668056Z" level=info msg="connecting to shim 8d049244ed0681df72745a3416a48afb3ccac85bcc642be883390fdc53fec4dd" address="unix:///run/containerd/s/ff449f59f8b3bb44f93bdb567ef5257f095064b4dd9901736a1843331488ae17" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:34:44.122879 systemd[1]: Started cri-containerd-8d049244ed0681df72745a3416a48afb3ccac85bcc642be883390fdc53fec4dd.scope - libcontainer container 8d049244ed0681df72745a3416a48afb3ccac85bcc642be883390fdc53fec4dd. 
Aug 13 00:34:44.160935 systemd-resolved[1484]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Aug 13 00:34:44.233997 containerd[1618]: time="2025-08-13T00:34:44.233867605Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6f74d66c57-k5627,Uid:fce250c7-bc05-4407-b188-ad4970930c22,Namespace:calico-system,Attempt:0,} returns sandbox id \"8d049244ed0681df72745a3416a48afb3ccac85bcc642be883390fdc53fec4dd\"" Aug 13 00:34:44.244810 containerd[1618]: time="2025-08-13T00:34:44.244784595Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\"" Aug 13 00:34:45.133186 systemd-networkd[1540]: calib95e26ebddc: Gained IPv6LL Aug 13 00:34:45.650026 containerd[1618]: time="2025-08-13T00:34:45.649983935Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:45.650328 containerd[1618]: time="2025-08-13T00:34:45.650101211Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.2: active requests=0, bytes read=4661207" Aug 13 00:34:45.650912 containerd[1618]: time="2025-08-13T00:34:45.650367728Z" level=info msg="ImageCreate event name:\"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:45.651596 containerd[1618]: time="2025-08-13T00:34:45.651494847Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:45.652150 containerd[1618]: time="2025-08-13T00:34:45.651913051Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.2\" with image id \"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.2\", repo digest 
\"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\", size \"6153902\" in 1.407104742s" Aug 13 00:34:45.652150 containerd[1618]: time="2025-08-13T00:34:45.651930923Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\" returns image reference \"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\"" Aug 13 00:34:45.658074 containerd[1618]: time="2025-08-13T00:34:45.658049425Z" level=info msg="CreateContainer within sandbox \"8d049244ed0681df72745a3416a48afb3ccac85bcc642be883390fdc53fec4dd\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Aug 13 00:34:45.663550 containerd[1618]: time="2025-08-13T00:34:45.663511948Z" level=info msg="Container f3f18b38eb1fe3a19f31a99aeea4d2afa4d7161d7931f2b106c8005a7eca17af: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:34:45.666563 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3055398564.mount: Deactivated successfully. Aug 13 00:34:45.668523 containerd[1618]: time="2025-08-13T00:34:45.668503000Z" level=info msg="CreateContainer within sandbox \"8d049244ed0681df72745a3416a48afb3ccac85bcc642be883390fdc53fec4dd\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"f3f18b38eb1fe3a19f31a99aeea4d2afa4d7161d7931f2b106c8005a7eca17af\"" Aug 13 00:34:45.668896 containerd[1618]: time="2025-08-13T00:34:45.668796811Z" level=info msg="StartContainer for \"f3f18b38eb1fe3a19f31a99aeea4d2afa4d7161d7931f2b106c8005a7eca17af\"" Aug 13 00:34:45.669449 containerd[1618]: time="2025-08-13T00:34:45.669432614Z" level=info msg="connecting to shim f3f18b38eb1fe3a19f31a99aeea4d2afa4d7161d7931f2b106c8005a7eca17af" address="unix:///run/containerd/s/ff449f59f8b3bb44f93bdb567ef5257f095064b4dd9901736a1843331488ae17" protocol=ttrpc version=3 Aug 13 00:34:45.684618 systemd[1]: Started cri-containerd-f3f18b38eb1fe3a19f31a99aeea4d2afa4d7161d7931f2b106c8005a7eca17af.scope - libcontainer container 
f3f18b38eb1fe3a19f31a99aeea4d2afa4d7161d7931f2b106c8005a7eca17af. Aug 13 00:34:45.716770 containerd[1618]: time="2025-08-13T00:34:45.716749929Z" level=info msg="StartContainer for \"f3f18b38eb1fe3a19f31a99aeea4d2afa4d7161d7931f2b106c8005a7eca17af\" returns successfully" Aug 13 00:34:45.717818 containerd[1618]: time="2025-08-13T00:34:45.717750934Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\"" Aug 13 00:34:45.802168 containerd[1618]: time="2025-08-13T00:34:45.802141622Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-66m68,Uid:fe7d68d1-75a2-45b8-b95c-24db42f5e458,Namespace:kube-system,Attempt:0,}" Aug 13 00:34:45.857589 systemd-networkd[1540]: cali59016908804: Link UP Aug 13 00:34:45.858011 systemd-networkd[1540]: cali59016908804: Gained carrier Aug 13 00:34:45.869586 containerd[1618]: 2025-08-13 00:34:45.817 [INFO][4259] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Aug 13 00:34:45.869586 containerd[1618]: 2025-08-13 00:34:45.823 [INFO][4259] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7c65d6cfc9--66m68-eth0 coredns-7c65d6cfc9- kube-system fe7d68d1-75a2-45b8-b95c-24db42f5e458 785 0 2025-08-13 00:34:12 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7c65d6cfc9-66m68 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali59016908804 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="8aee0cde142f43d7d2a3cebcb8fcff740922124a37e66a0324ab299013645bfa" Namespace="kube-system" Pod="coredns-7c65d6cfc9-66m68" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--66m68-" Aug 13 00:34:45.869586 containerd[1618]: 2025-08-13 00:34:45.823 [INFO][4259] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="8aee0cde142f43d7d2a3cebcb8fcff740922124a37e66a0324ab299013645bfa" Namespace="kube-system" Pod="coredns-7c65d6cfc9-66m68" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--66m68-eth0" Aug 13 00:34:45.869586 containerd[1618]: 2025-08-13 00:34:45.838 [INFO][4272] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8aee0cde142f43d7d2a3cebcb8fcff740922124a37e66a0324ab299013645bfa" HandleID="k8s-pod-network.8aee0cde142f43d7d2a3cebcb8fcff740922124a37e66a0324ab299013645bfa" Workload="localhost-k8s-coredns--7c65d6cfc9--66m68-eth0" Aug 13 00:34:45.869912 containerd[1618]: 2025-08-13 00:34:45.838 [INFO][4272] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8aee0cde142f43d7d2a3cebcb8fcff740922124a37e66a0324ab299013645bfa" HandleID="k8s-pod-network.8aee0cde142f43d7d2a3cebcb8fcff740922124a37e66a0324ab299013645bfa" Workload="localhost-k8s-coredns--7c65d6cfc9--66m68-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f610), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7c65d6cfc9-66m68", "timestamp":"2025-08-13 00:34:45.838869966 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:34:45.869912 containerd[1618]: 2025-08-13 00:34:45.839 [INFO][4272] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:34:45.869912 containerd[1618]: 2025-08-13 00:34:45.839 [INFO][4272] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 13 00:34:45.869912 containerd[1618]: 2025-08-13 00:34:45.839 [INFO][4272] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Aug 13 00:34:45.869912 containerd[1618]: 2025-08-13 00:34:45.842 [INFO][4272] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8aee0cde142f43d7d2a3cebcb8fcff740922124a37e66a0324ab299013645bfa" host="localhost" Aug 13 00:34:45.869912 containerd[1618]: 2025-08-13 00:34:45.844 [INFO][4272] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Aug 13 00:34:45.869912 containerd[1618]: 2025-08-13 00:34:45.846 [INFO][4272] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Aug 13 00:34:45.869912 containerd[1618]: 2025-08-13 00:34:45.847 [INFO][4272] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Aug 13 00:34:45.869912 containerd[1618]: 2025-08-13 00:34:45.849 [INFO][4272] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Aug 13 00:34:45.869912 containerd[1618]: 2025-08-13 00:34:45.849 [INFO][4272] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.8aee0cde142f43d7d2a3cebcb8fcff740922124a37e66a0324ab299013645bfa" host="localhost" Aug 13 00:34:45.871004 containerd[1618]: 2025-08-13 00:34:45.849 [INFO][4272] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8aee0cde142f43d7d2a3cebcb8fcff740922124a37e66a0324ab299013645bfa Aug 13 00:34:45.871004 containerd[1618]: 2025-08-13 00:34:45.851 [INFO][4272] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.8aee0cde142f43d7d2a3cebcb8fcff740922124a37e66a0324ab299013645bfa" host="localhost" Aug 13 00:34:45.871004 containerd[1618]: 2025-08-13 00:34:45.854 [INFO][4272] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 
handle="k8s-pod-network.8aee0cde142f43d7d2a3cebcb8fcff740922124a37e66a0324ab299013645bfa" host="localhost" Aug 13 00:34:45.871004 containerd[1618]: 2025-08-13 00:34:45.854 [INFO][4272] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.8aee0cde142f43d7d2a3cebcb8fcff740922124a37e66a0324ab299013645bfa" host="localhost" Aug 13 00:34:45.871004 containerd[1618]: 2025-08-13 00:34:45.854 [INFO][4272] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:34:45.871004 containerd[1618]: 2025-08-13 00:34:45.854 [INFO][4272] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="8aee0cde142f43d7d2a3cebcb8fcff740922124a37e66a0324ab299013645bfa" HandleID="k8s-pod-network.8aee0cde142f43d7d2a3cebcb8fcff740922124a37e66a0324ab299013645bfa" Workload="localhost-k8s-coredns--7c65d6cfc9--66m68-eth0" Aug 13 00:34:45.871175 containerd[1618]: 2025-08-13 00:34:45.856 [INFO][4259] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8aee0cde142f43d7d2a3cebcb8fcff740922124a37e66a0324ab299013645bfa" Namespace="kube-system" Pod="coredns-7c65d6cfc9-66m68" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--66m68-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--66m68-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"fe7d68d1-75a2-45b8-b95c-24db42f5e458", ResourceVersion:"785", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 34, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7c65d6cfc9-66m68", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali59016908804", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:34:45.871244 containerd[1618]: 2025-08-13 00:34:45.856 [INFO][4259] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="8aee0cde142f43d7d2a3cebcb8fcff740922124a37e66a0324ab299013645bfa" Namespace="kube-system" Pod="coredns-7c65d6cfc9-66m68" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--66m68-eth0" Aug 13 00:34:45.871244 containerd[1618]: 2025-08-13 00:34:45.856 [INFO][4259] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali59016908804 ContainerID="8aee0cde142f43d7d2a3cebcb8fcff740922124a37e66a0324ab299013645bfa" Namespace="kube-system" Pod="coredns-7c65d6cfc9-66m68" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--66m68-eth0" Aug 13 00:34:45.871244 containerd[1618]: 2025-08-13 00:34:45.858 [INFO][4259] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8aee0cde142f43d7d2a3cebcb8fcff740922124a37e66a0324ab299013645bfa" Namespace="kube-system" Pod="coredns-7c65d6cfc9-66m68" 
WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--66m68-eth0" Aug 13 00:34:45.871320 containerd[1618]: 2025-08-13 00:34:45.858 [INFO][4259] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8aee0cde142f43d7d2a3cebcb8fcff740922124a37e66a0324ab299013645bfa" Namespace="kube-system" Pod="coredns-7c65d6cfc9-66m68" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--66m68-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--66m68-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"fe7d68d1-75a2-45b8-b95c-24db42f5e458", ResourceVersion:"785", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 34, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"8aee0cde142f43d7d2a3cebcb8fcff740922124a37e66a0324ab299013645bfa", Pod:"coredns-7c65d6cfc9-66m68", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali59016908804", MAC:"b6:f9:5c:44:51:5e", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:34:45.871320 containerd[1618]: 2025-08-13 00:34:45.865 [INFO][4259] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8aee0cde142f43d7d2a3cebcb8fcff740922124a37e66a0324ab299013645bfa" Namespace="kube-system" Pod="coredns-7c65d6cfc9-66m68" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--66m68-eth0" Aug 13 00:34:45.883793 containerd[1618]: time="2025-08-13T00:34:45.883771180Z" level=info msg="connecting to shim 8aee0cde142f43d7d2a3cebcb8fcff740922124a37e66a0324ab299013645bfa" address="unix:///run/containerd/s/de09723609ce8a6fa7ebbf535456a7b248f9d1279198c9128ba26c7192a71256" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:34:45.902651 systemd[1]: Started cri-containerd-8aee0cde142f43d7d2a3cebcb8fcff740922124a37e66a0324ab299013645bfa.scope - libcontainer container 8aee0cde142f43d7d2a3cebcb8fcff740922124a37e66a0324ab299013645bfa. 
Aug 13 00:34:45.910793 systemd-resolved[1484]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Aug 13 00:34:45.935035 containerd[1618]: time="2025-08-13T00:34:45.934998026Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-66m68,Uid:fe7d68d1-75a2-45b8-b95c-24db42f5e458,Namespace:kube-system,Attempt:0,} returns sandbox id \"8aee0cde142f43d7d2a3cebcb8fcff740922124a37e66a0324ab299013645bfa\"" Aug 13 00:34:45.936748 containerd[1618]: time="2025-08-13T00:34:45.936710118Z" level=info msg="CreateContainer within sandbox \"8aee0cde142f43d7d2a3cebcb8fcff740922124a37e66a0324ab299013645bfa\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Aug 13 00:34:45.981425 containerd[1618]: time="2025-08-13T00:34:45.981402302Z" level=info msg="Container b987a938a3991820eb0f5587053d2de1934977d8451b48c966eba504eb5a059d: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:34:45.983597 containerd[1618]: time="2025-08-13T00:34:45.983568888Z" level=info msg="CreateContainer within sandbox \"8aee0cde142f43d7d2a3cebcb8fcff740922124a37e66a0324ab299013645bfa\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"b987a938a3991820eb0f5587053d2de1934977d8451b48c966eba504eb5a059d\"" Aug 13 00:34:45.984041 containerd[1618]: time="2025-08-13T00:34:45.983992406Z" level=info msg="StartContainer for \"b987a938a3991820eb0f5587053d2de1934977d8451b48c966eba504eb5a059d\"" Aug 13 00:34:45.985047 containerd[1618]: time="2025-08-13T00:34:45.985021996Z" level=info msg="connecting to shim b987a938a3991820eb0f5587053d2de1934977d8451b48c966eba504eb5a059d" address="unix:///run/containerd/s/de09723609ce8a6fa7ebbf535456a7b248f9d1279198c9128ba26c7192a71256" protocol=ttrpc version=3 Aug 13 00:34:46.002639 systemd[1]: Started cri-containerd-b987a938a3991820eb0f5587053d2de1934977d8451b48c966eba504eb5a059d.scope - libcontainer container b987a938a3991820eb0f5587053d2de1934977d8451b48c966eba504eb5a059d. 
Aug 13 00:34:46.022911 containerd[1618]: time="2025-08-13T00:34:46.022848083Z" level=info msg="StartContainer for \"b987a938a3991820eb0f5587053d2de1934977d8451b48c966eba504eb5a059d\" returns successfully" Aug 13 00:34:46.175329 kubelet[2927]: I0813 00:34:46.175045 2927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-66m68" podStartSLOduration=34.175030722 podStartE2EDuration="34.175030722s" podCreationTimestamp="2025-08-13 00:34:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 00:34:46.174243887 +0000 UTC m=+38.591519916" watchObservedRunningTime="2025-08-13 00:34:46.175030722 +0000 UTC m=+38.592306770" Aug 13 00:34:46.789298 containerd[1618]: time="2025-08-13T00:34:46.789271177Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7cd45fb679-bvd9c,Uid:d50d2d1d-4a7e-454b-acc1-521b442ea412,Namespace:calico-system,Attempt:0,}" Aug 13 00:34:46.851043 systemd-networkd[1540]: calieeedd484129: Link UP Aug 13 00:34:46.851292 systemd-networkd[1540]: calieeedd484129: Gained carrier Aug 13 00:34:46.864951 containerd[1618]: 2025-08-13 00:34:46.805 [INFO][4384] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Aug 13 00:34:46.864951 containerd[1618]: 2025-08-13 00:34:46.812 [INFO][4384] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--7cd45fb679--bvd9c-eth0 calico-kube-controllers-7cd45fb679- calico-system d50d2d1d-4a7e-454b-acc1-521b442ea412 781 0 2025-08-13 00:34:24 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7cd45fb679 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost 
calico-kube-controllers-7cd45fb679-bvd9c eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calieeedd484129 [] [] }} ContainerID="e2b5ab22d0a740e1bf9298e3fec42edad88ffb346b221fd1889a456b4bfce1c4" Namespace="calico-system" Pod="calico-kube-controllers-7cd45fb679-bvd9c" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7cd45fb679--bvd9c-" Aug 13 00:34:46.864951 containerd[1618]: 2025-08-13 00:34:46.812 [INFO][4384] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e2b5ab22d0a740e1bf9298e3fec42edad88ffb346b221fd1889a456b4bfce1c4" Namespace="calico-system" Pod="calico-kube-controllers-7cd45fb679-bvd9c" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7cd45fb679--bvd9c-eth0" Aug 13 00:34:46.864951 containerd[1618]: 2025-08-13 00:34:46.826 [INFO][4396] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e2b5ab22d0a740e1bf9298e3fec42edad88ffb346b221fd1889a456b4bfce1c4" HandleID="k8s-pod-network.e2b5ab22d0a740e1bf9298e3fec42edad88ffb346b221fd1889a456b4bfce1c4" Workload="localhost-k8s-calico--kube--controllers--7cd45fb679--bvd9c-eth0" Aug 13 00:34:46.864951 containerd[1618]: 2025-08-13 00:34:46.826 [INFO][4396] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e2b5ab22d0a740e1bf9298e3fec42edad88ffb346b221fd1889a456b4bfce1c4" HandleID="k8s-pod-network.e2b5ab22d0a740e1bf9298e3fec42edad88ffb346b221fd1889a456b4bfce1c4" Workload="localhost-k8s-calico--kube--controllers--7cd45fb679--bvd9c-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f000), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-7cd45fb679-bvd9c", "timestamp":"2025-08-13 00:34:46.826688847 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), 
IntendedUse:"Workload"} Aug 13 00:34:46.864951 containerd[1618]: 2025-08-13 00:34:46.826 [INFO][4396] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:34:46.864951 containerd[1618]: 2025-08-13 00:34:46.826 [INFO][4396] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:34:46.864951 containerd[1618]: 2025-08-13 00:34:46.826 [INFO][4396] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Aug 13 00:34:46.864951 containerd[1618]: 2025-08-13 00:34:46.833 [INFO][4396] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e2b5ab22d0a740e1bf9298e3fec42edad88ffb346b221fd1889a456b4bfce1c4" host="localhost" Aug 13 00:34:46.864951 containerd[1618]: 2025-08-13 00:34:46.835 [INFO][4396] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Aug 13 00:34:46.864951 containerd[1618]: 2025-08-13 00:34:46.838 [INFO][4396] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Aug 13 00:34:46.864951 containerd[1618]: 2025-08-13 00:34:46.839 [INFO][4396] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Aug 13 00:34:46.864951 containerd[1618]: 2025-08-13 00:34:46.840 [INFO][4396] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Aug 13 00:34:46.864951 containerd[1618]: 2025-08-13 00:34:46.840 [INFO][4396] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.e2b5ab22d0a740e1bf9298e3fec42edad88ffb346b221fd1889a456b4bfce1c4" host="localhost" Aug 13 00:34:46.864951 containerd[1618]: 2025-08-13 00:34:46.841 [INFO][4396] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e2b5ab22d0a740e1bf9298e3fec42edad88ffb346b221fd1889a456b4bfce1c4 Aug 13 00:34:46.864951 containerd[1618]: 2025-08-13 00:34:46.843 [INFO][4396] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 
handle="k8s-pod-network.e2b5ab22d0a740e1bf9298e3fec42edad88ffb346b221fd1889a456b4bfce1c4" host="localhost" Aug 13 00:34:46.864951 containerd[1618]: 2025-08-13 00:34:46.846 [INFO][4396] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.e2b5ab22d0a740e1bf9298e3fec42edad88ffb346b221fd1889a456b4bfce1c4" host="localhost" Aug 13 00:34:46.864951 containerd[1618]: 2025-08-13 00:34:46.846 [INFO][4396] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.e2b5ab22d0a740e1bf9298e3fec42edad88ffb346b221fd1889a456b4bfce1c4" host="localhost" Aug 13 00:34:46.864951 containerd[1618]: 2025-08-13 00:34:46.846 [INFO][4396] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:34:46.864951 containerd[1618]: 2025-08-13 00:34:46.846 [INFO][4396] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="e2b5ab22d0a740e1bf9298e3fec42edad88ffb346b221fd1889a456b4bfce1c4" HandleID="k8s-pod-network.e2b5ab22d0a740e1bf9298e3fec42edad88ffb346b221fd1889a456b4bfce1c4" Workload="localhost-k8s-calico--kube--controllers--7cd45fb679--bvd9c-eth0" Aug 13 00:34:46.866772 containerd[1618]: 2025-08-13 00:34:46.847 [INFO][4384] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e2b5ab22d0a740e1bf9298e3fec42edad88ffb346b221fd1889a456b4bfce1c4" Namespace="calico-system" Pod="calico-kube-controllers-7cd45fb679-bvd9c" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7cd45fb679--bvd9c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--7cd45fb679--bvd9c-eth0", GenerateName:"calico-kube-controllers-7cd45fb679-", Namespace:"calico-system", SelfLink:"", UID:"d50d2d1d-4a7e-454b-acc1-521b442ea412", ResourceVersion:"781", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 34, 24, 0, 
time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7cd45fb679", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-7cd45fb679-bvd9c", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calieeedd484129", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:34:46.866772 containerd[1618]: 2025-08-13 00:34:46.848 [INFO][4384] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="e2b5ab22d0a740e1bf9298e3fec42edad88ffb346b221fd1889a456b4bfce1c4" Namespace="calico-system" Pod="calico-kube-controllers-7cd45fb679-bvd9c" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7cd45fb679--bvd9c-eth0" Aug 13 00:34:46.866772 containerd[1618]: 2025-08-13 00:34:46.848 [INFO][4384] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calieeedd484129 ContainerID="e2b5ab22d0a740e1bf9298e3fec42edad88ffb346b221fd1889a456b4bfce1c4" Namespace="calico-system" Pod="calico-kube-controllers-7cd45fb679-bvd9c" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7cd45fb679--bvd9c-eth0" Aug 13 00:34:46.866772 containerd[1618]: 2025-08-13 00:34:46.851 [INFO][4384] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="e2b5ab22d0a740e1bf9298e3fec42edad88ffb346b221fd1889a456b4bfce1c4" Namespace="calico-system" Pod="calico-kube-controllers-7cd45fb679-bvd9c" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7cd45fb679--bvd9c-eth0" Aug 13 00:34:46.866772 containerd[1618]: 2025-08-13 00:34:46.851 [INFO][4384] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e2b5ab22d0a740e1bf9298e3fec42edad88ffb346b221fd1889a456b4bfce1c4" Namespace="calico-system" Pod="calico-kube-controllers-7cd45fb679-bvd9c" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7cd45fb679--bvd9c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--7cd45fb679--bvd9c-eth0", GenerateName:"calico-kube-controllers-7cd45fb679-", Namespace:"calico-system", SelfLink:"", UID:"d50d2d1d-4a7e-454b-acc1-521b442ea412", ResourceVersion:"781", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 34, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7cd45fb679", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"e2b5ab22d0a740e1bf9298e3fec42edad88ffb346b221fd1889a456b4bfce1c4", Pod:"calico-kube-controllers-7cd45fb679-bvd9c", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calieeedd484129", MAC:"ae:74:fa:24:bd:ce", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:34:46.866772 containerd[1618]: 2025-08-13 00:34:46.862 [INFO][4384] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e2b5ab22d0a740e1bf9298e3fec42edad88ffb346b221fd1889a456b4bfce1c4" Namespace="calico-system" Pod="calico-kube-controllers-7cd45fb679-bvd9c" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7cd45fb679--bvd9c-eth0" Aug 13 00:34:46.884898 containerd[1618]: time="2025-08-13T00:34:46.884871388Z" level=info msg="connecting to shim e2b5ab22d0a740e1bf9298e3fec42edad88ffb346b221fd1889a456b4bfce1c4" address="unix:///run/containerd/s/76326d42fe275184a7e09482e917ecf5e8eeb3acd8c6555b56dcafbdfa5bc515" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:34:46.910651 systemd[1]: Started cri-containerd-e2b5ab22d0a740e1bf9298e3fec42edad88ffb346b221fd1889a456b4bfce1c4.scope - libcontainer container e2b5ab22d0a740e1bf9298e3fec42edad88ffb346b221fd1889a456b4bfce1c4. Aug 13 00:34:46.919334 systemd-resolved[1484]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Aug 13 00:34:46.923200 systemd-networkd[1540]: cali59016908804: Gained IPv6LL Aug 13 00:34:46.954664 containerd[1618]: time="2025-08-13T00:34:46.954603558Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7cd45fb679-bvd9c,Uid:d50d2d1d-4a7e-454b-acc1-521b442ea412,Namespace:calico-system,Attempt:0,} returns sandbox id \"e2b5ab22d0a740e1bf9298e3fec42edad88ffb346b221fd1889a456b4bfce1c4\"" Aug 13 00:34:47.664039 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3021482570.mount: Deactivated successfully. 
Aug 13 00:34:47.665700 containerd[1618]: time="2025-08-13T00:34:47.665676922Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:47.666094 containerd[1618]: time="2025-08-13T00:34:47.666078065Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.2: active requests=0, bytes read=33083477" Aug 13 00:34:47.666293 containerd[1618]: time="2025-08-13T00:34:47.666113435Z" level=info msg="ImageCreate event name:\"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:47.667456 containerd[1618]: time="2025-08-13T00:34:47.667444277Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:47.667816 containerd[1618]: time="2025-08-13T00:34:47.667800517Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" with image id \"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\", size \"33083307\" in 1.950003603s" Aug 13 00:34:47.668390 containerd[1618]: time="2025-08-13T00:34:47.667818544Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" returns image reference \"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\"" Aug 13 00:34:47.668499 containerd[1618]: time="2025-08-13T00:34:47.668483563Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\"" Aug 13 00:34:47.669670 containerd[1618]: time="2025-08-13T00:34:47.669328936Z" level=info msg="CreateContainer within sandbox 
\"8d049244ed0681df72745a3416a48afb3ccac85bcc642be883390fdc53fec4dd\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Aug 13 00:34:47.674668 containerd[1618]: time="2025-08-13T00:34:47.674644775Z" level=info msg="Container 0486aa389e780b8440dc05eae1e99529292a975f2895ea87a3567c63e2f2ca14: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:34:47.675864 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount920923039.mount: Deactivated successfully. Aug 13 00:34:47.691366 containerd[1618]: time="2025-08-13T00:34:47.691344891Z" level=info msg="CreateContainer within sandbox \"8d049244ed0681df72745a3416a48afb3ccac85bcc642be883390fdc53fec4dd\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"0486aa389e780b8440dc05eae1e99529292a975f2895ea87a3567c63e2f2ca14\"" Aug 13 00:34:47.692128 containerd[1618]: time="2025-08-13T00:34:47.692032184Z" level=info msg="StartContainer for \"0486aa389e780b8440dc05eae1e99529292a975f2895ea87a3567c63e2f2ca14\"" Aug 13 00:34:47.692765 containerd[1618]: time="2025-08-13T00:34:47.692752757Z" level=info msg="connecting to shim 0486aa389e780b8440dc05eae1e99529292a975f2895ea87a3567c63e2f2ca14" address="unix:///run/containerd/s/ff449f59f8b3bb44f93bdb567ef5257f095064b4dd9901736a1843331488ae17" protocol=ttrpc version=3 Aug 13 00:34:47.711047 systemd[1]: Started cri-containerd-0486aa389e780b8440dc05eae1e99529292a975f2895ea87a3567c63e2f2ca14.scope - libcontainer container 0486aa389e780b8440dc05eae1e99529292a975f2895ea87a3567c63e2f2ca14. 
Aug 13 00:34:47.760396 containerd[1618]: time="2025-08-13T00:34:47.760377263Z" level=info msg="StartContainer for \"0486aa389e780b8440dc05eae1e99529292a975f2895ea87a3567c63e2f2ca14\" returns successfully" Aug 13 00:34:47.794764 containerd[1618]: time="2025-08-13T00:34:47.794743235Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-c4ngc,Uid:f9cd0b21-5cc1-4092-9e3a-778a5727c93b,Namespace:calico-system,Attempt:0,}" Aug 13 00:34:47.862437 systemd-networkd[1540]: calia6001a95f89: Link UP Aug 13 00:34:47.863765 systemd-networkd[1540]: calia6001a95f89: Gained carrier Aug 13 00:34:47.875392 containerd[1618]: 2025-08-13 00:34:47.815 [INFO][4520] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Aug 13 00:34:47.875392 containerd[1618]: 2025-08-13 00:34:47.822 [INFO][4520] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--58fd7646b9--c4ngc-eth0 goldmane-58fd7646b9- calico-system f9cd0b21-5cc1-4092-9e3a-778a5727c93b 789 0 2025-08-13 00:34:22 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:58fd7646b9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-58fd7646b9-c4ngc eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calia6001a95f89 [] [] }} ContainerID="69055495c57b3e2732336e45aeb61291f3020686ec05b81611f3f86252a5fc0e" Namespace="calico-system" Pod="goldmane-58fd7646b9-c4ngc" WorkloadEndpoint="localhost-k8s-goldmane--58fd7646b9--c4ngc-" Aug 13 00:34:47.875392 containerd[1618]: 2025-08-13 00:34:47.822 [INFO][4520] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="69055495c57b3e2732336e45aeb61291f3020686ec05b81611f3f86252a5fc0e" Namespace="calico-system" Pod="goldmane-58fd7646b9-c4ngc" WorkloadEndpoint="localhost-k8s-goldmane--58fd7646b9--c4ngc-eth0" Aug 13 00:34:47.875392 
containerd[1618]: 2025-08-13 00:34:47.839 [INFO][4532] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="69055495c57b3e2732336e45aeb61291f3020686ec05b81611f3f86252a5fc0e" HandleID="k8s-pod-network.69055495c57b3e2732336e45aeb61291f3020686ec05b81611f3f86252a5fc0e" Workload="localhost-k8s-goldmane--58fd7646b9--c4ngc-eth0" Aug 13 00:34:47.875392 containerd[1618]: 2025-08-13 00:34:47.839 [INFO][4532] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="69055495c57b3e2732336e45aeb61291f3020686ec05b81611f3f86252a5fc0e" HandleID="k8s-pod-network.69055495c57b3e2732336e45aeb61291f3020686ec05b81611f3f86252a5fc0e" Workload="localhost-k8s-goldmane--58fd7646b9--c4ngc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f1b0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-58fd7646b9-c4ngc", "timestamp":"2025-08-13 00:34:47.839308874 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:34:47.875392 containerd[1618]: 2025-08-13 00:34:47.839 [INFO][4532] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:34:47.875392 containerd[1618]: 2025-08-13 00:34:47.839 [INFO][4532] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 13 00:34:47.875392 containerd[1618]: 2025-08-13 00:34:47.839 [INFO][4532] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Aug 13 00:34:47.875392 containerd[1618]: 2025-08-13 00:34:47.844 [INFO][4532] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.69055495c57b3e2732336e45aeb61291f3020686ec05b81611f3f86252a5fc0e" host="localhost" Aug 13 00:34:47.875392 containerd[1618]: 2025-08-13 00:34:47.847 [INFO][4532] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Aug 13 00:34:47.875392 containerd[1618]: 2025-08-13 00:34:47.850 [INFO][4532] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Aug 13 00:34:47.875392 containerd[1618]: 2025-08-13 00:34:47.851 [INFO][4532] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Aug 13 00:34:47.875392 containerd[1618]: 2025-08-13 00:34:47.852 [INFO][4532] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Aug 13 00:34:47.875392 containerd[1618]: 2025-08-13 00:34:47.852 [INFO][4532] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.69055495c57b3e2732336e45aeb61291f3020686ec05b81611f3f86252a5fc0e" host="localhost" Aug 13 00:34:47.875392 containerd[1618]: 2025-08-13 00:34:47.853 [INFO][4532] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.69055495c57b3e2732336e45aeb61291f3020686ec05b81611f3f86252a5fc0e Aug 13 00:34:47.875392 containerd[1618]: 2025-08-13 00:34:47.855 [INFO][4532] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.69055495c57b3e2732336e45aeb61291f3020686ec05b81611f3f86252a5fc0e" host="localhost" Aug 13 00:34:47.875392 containerd[1618]: 2025-08-13 00:34:47.859 [INFO][4532] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 
handle="k8s-pod-network.69055495c57b3e2732336e45aeb61291f3020686ec05b81611f3f86252a5fc0e" host="localhost" Aug 13 00:34:47.875392 containerd[1618]: 2025-08-13 00:34:47.859 [INFO][4532] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.69055495c57b3e2732336e45aeb61291f3020686ec05b81611f3f86252a5fc0e" host="localhost" Aug 13 00:34:47.875392 containerd[1618]: 2025-08-13 00:34:47.859 [INFO][4532] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:34:47.875392 containerd[1618]: 2025-08-13 00:34:47.859 [INFO][4532] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="69055495c57b3e2732336e45aeb61291f3020686ec05b81611f3f86252a5fc0e" HandleID="k8s-pod-network.69055495c57b3e2732336e45aeb61291f3020686ec05b81611f3f86252a5fc0e" Workload="localhost-k8s-goldmane--58fd7646b9--c4ngc-eth0" Aug 13 00:34:47.877180 containerd[1618]: 2025-08-13 00:34:47.860 [INFO][4520] cni-plugin/k8s.go 418: Populated endpoint ContainerID="69055495c57b3e2732336e45aeb61291f3020686ec05b81611f3f86252a5fc0e" Namespace="calico-system" Pod="goldmane-58fd7646b9-c4ngc" WorkloadEndpoint="localhost-k8s-goldmane--58fd7646b9--c4ngc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--58fd7646b9--c4ngc-eth0", GenerateName:"goldmane-58fd7646b9-", Namespace:"calico-system", SelfLink:"", UID:"f9cd0b21-5cc1-4092-9e3a-778a5727c93b", ResourceVersion:"789", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 34, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"58fd7646b9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-58fd7646b9-c4ngc", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia6001a95f89", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:34:47.877180 containerd[1618]: 2025-08-13 00:34:47.860 [INFO][4520] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="69055495c57b3e2732336e45aeb61291f3020686ec05b81611f3f86252a5fc0e" Namespace="calico-system" Pod="goldmane-58fd7646b9-c4ngc" WorkloadEndpoint="localhost-k8s-goldmane--58fd7646b9--c4ngc-eth0" Aug 13 00:34:47.877180 containerd[1618]: 2025-08-13 00:34:47.860 [INFO][4520] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia6001a95f89 ContainerID="69055495c57b3e2732336e45aeb61291f3020686ec05b81611f3f86252a5fc0e" Namespace="calico-system" Pod="goldmane-58fd7646b9-c4ngc" WorkloadEndpoint="localhost-k8s-goldmane--58fd7646b9--c4ngc-eth0" Aug 13 00:34:47.877180 containerd[1618]: 2025-08-13 00:34:47.864 [INFO][4520] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="69055495c57b3e2732336e45aeb61291f3020686ec05b81611f3f86252a5fc0e" Namespace="calico-system" Pod="goldmane-58fd7646b9-c4ngc" WorkloadEndpoint="localhost-k8s-goldmane--58fd7646b9--c4ngc-eth0" Aug 13 00:34:47.877180 containerd[1618]: 2025-08-13 00:34:47.864 [INFO][4520] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="69055495c57b3e2732336e45aeb61291f3020686ec05b81611f3f86252a5fc0e" Namespace="calico-system" Pod="goldmane-58fd7646b9-c4ngc" 
WorkloadEndpoint="localhost-k8s-goldmane--58fd7646b9--c4ngc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--58fd7646b9--c4ngc-eth0", GenerateName:"goldmane-58fd7646b9-", Namespace:"calico-system", SelfLink:"", UID:"f9cd0b21-5cc1-4092-9e3a-778a5727c93b", ResourceVersion:"789", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 34, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"58fd7646b9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"69055495c57b3e2732336e45aeb61291f3020686ec05b81611f3f86252a5fc0e", Pod:"goldmane-58fd7646b9-c4ngc", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia6001a95f89", MAC:"1a:9b:53:d4:0c:17", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:34:47.877180 containerd[1618]: 2025-08-13 00:34:47.870 [INFO][4520] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="69055495c57b3e2732336e45aeb61291f3020686ec05b81611f3f86252a5fc0e" Namespace="calico-system" Pod="goldmane-58fd7646b9-c4ngc" WorkloadEndpoint="localhost-k8s-goldmane--58fd7646b9--c4ngc-eth0" Aug 13 00:34:47.928974 containerd[1618]: time="2025-08-13T00:34:47.928047244Z" level=info msg="connecting to shim 
69055495c57b3e2732336e45aeb61291f3020686ec05b81611f3f86252a5fc0e" address="unix:///run/containerd/s/a7bd0a69082383405bd2eb95df16c0e3b82cdeaa448800dd0000847a0cc9d383" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:34:47.943623 systemd[1]: Started cri-containerd-69055495c57b3e2732336e45aeb61291f3020686ec05b81611f3f86252a5fc0e.scope - libcontainer container 69055495c57b3e2732336e45aeb61291f3020686ec05b81611f3f86252a5fc0e. Aug 13 00:34:47.952956 systemd-resolved[1484]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Aug 13 00:34:47.981524 containerd[1618]: time="2025-08-13T00:34:47.981459789Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-c4ngc,Uid:f9cd0b21-5cc1-4092-9e3a-778a5727c93b,Namespace:calico-system,Attempt:0,} returns sandbox id \"69055495c57b3e2732336e45aeb61291f3020686ec05b81611f3f86252a5fc0e\"" Aug 13 00:34:48.179436 kubelet[2927]: I0813 00:34:48.179267 2927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-6f74d66c57-k5627" podStartSLOduration=1.749933027 podStartE2EDuration="5.17925178s" podCreationTimestamp="2025-08-13 00:34:43 +0000 UTC" firstStartedPulling="2025-08-13 00:34:44.239021808 +0000 UTC m=+36.656297837" lastFinishedPulling="2025-08-13 00:34:47.668340561 +0000 UTC m=+40.085616590" observedRunningTime="2025-08-13 00:34:48.178587875 +0000 UTC m=+40.595863914" watchObservedRunningTime="2025-08-13 00:34:48.17925178 +0000 UTC m=+40.596527810" Aug 13 00:34:48.330629 systemd-networkd[1540]: calieeedd484129: Gained IPv6LL Aug 13 00:34:48.789386 containerd[1618]: time="2025-08-13T00:34:48.789235155Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b9db99d59-mnjp4,Uid:895d523e-e692-449e-960d-7285bbad6a88,Namespace:calico-apiserver,Attempt:0,}" Aug 13 00:34:48.789386 containerd[1618]: time="2025-08-13T00:34:48.789235178Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-7c65d6cfc9-hz27z,Uid:1041bee9-c09a-4ab4-b0d0-aa6d4e1e0222,Namespace:kube-system,Attempt:0,}" Aug 13 00:34:48.789620 containerd[1618]: time="2025-08-13T00:34:48.789601630Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b9db99d59-fb8ql,Uid:03280d1f-7157-4e73-820c-3c79157bccae,Namespace:calico-apiserver,Attempt:0,}" Aug 13 00:34:49.034692 systemd-networkd[1540]: calia6001a95f89: Gained IPv6LL Aug 13 00:34:49.146496 systemd-networkd[1540]: cali0c9bf200e53: Link UP Aug 13 00:34:49.147143 systemd-networkd[1540]: cali0c9bf200e53: Gained carrier Aug 13 00:34:49.160748 containerd[1618]: 2025-08-13 00:34:48.951 [INFO][4621] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Aug 13 00:34:49.160748 containerd[1618]: 2025-08-13 00:34:48.980 [INFO][4621] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7c65d6cfc9--hz27z-eth0 coredns-7c65d6cfc9- kube-system 1041bee9-c09a-4ab4-b0d0-aa6d4e1e0222 792 0 2025-08-13 00:34:12 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7c65d6cfc9-hz27z eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali0c9bf200e53 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="edb6a9ee6d518cc27b3171e932208d05911fdd7b97d1f90ed85735853c03bd49" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hz27z" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--hz27z-" Aug 13 00:34:49.160748 containerd[1618]: 2025-08-13 00:34:48.980 [INFO][4621] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="edb6a9ee6d518cc27b3171e932208d05911fdd7b97d1f90ed85735853c03bd49" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hz27z" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--hz27z-eth0" Aug 13 
00:34:49.160748 containerd[1618]: 2025-08-13 00:34:49.018 [INFO][4658] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="edb6a9ee6d518cc27b3171e932208d05911fdd7b97d1f90ed85735853c03bd49" HandleID="k8s-pod-network.edb6a9ee6d518cc27b3171e932208d05911fdd7b97d1f90ed85735853c03bd49" Workload="localhost-k8s-coredns--7c65d6cfc9--hz27z-eth0" Aug 13 00:34:49.160748 containerd[1618]: 2025-08-13 00:34:49.018 [INFO][4658] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="edb6a9ee6d518cc27b3171e932208d05911fdd7b97d1f90ed85735853c03bd49" HandleID="k8s-pod-network.edb6a9ee6d518cc27b3171e932208d05911fdd7b97d1f90ed85735853c03bd49" Workload="localhost-k8s-coredns--7c65d6cfc9--hz27z-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cd9d0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7c65d6cfc9-hz27z", "timestamp":"2025-08-13 00:34:49.017948833 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:34:49.160748 containerd[1618]: 2025-08-13 00:34:49.018 [INFO][4658] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:34:49.160748 containerd[1618]: 2025-08-13 00:34:49.018 [INFO][4658] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 13 00:34:49.160748 containerd[1618]: 2025-08-13 00:34:49.018 [INFO][4658] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Aug 13 00:34:49.160748 containerd[1618]: 2025-08-13 00:34:49.027 [INFO][4658] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.edb6a9ee6d518cc27b3171e932208d05911fdd7b97d1f90ed85735853c03bd49" host="localhost" Aug 13 00:34:49.160748 containerd[1618]: 2025-08-13 00:34:49.068 [INFO][4658] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Aug 13 00:34:49.160748 containerd[1618]: 2025-08-13 00:34:49.072 [INFO][4658] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Aug 13 00:34:49.160748 containerd[1618]: 2025-08-13 00:34:49.073 [INFO][4658] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Aug 13 00:34:49.160748 containerd[1618]: 2025-08-13 00:34:49.075 [INFO][4658] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Aug 13 00:34:49.160748 containerd[1618]: 2025-08-13 00:34:49.075 [INFO][4658] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.edb6a9ee6d518cc27b3171e932208d05911fdd7b97d1f90ed85735853c03bd49" host="localhost" Aug 13 00:34:49.160748 containerd[1618]: 2025-08-13 00:34:49.076 [INFO][4658] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.edb6a9ee6d518cc27b3171e932208d05911fdd7b97d1f90ed85735853c03bd49 Aug 13 00:34:49.160748 containerd[1618]: 2025-08-13 00:34:49.085 [INFO][4658] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.edb6a9ee6d518cc27b3171e932208d05911fdd7b97d1f90ed85735853c03bd49" host="localhost" Aug 13 00:34:49.160748 containerd[1618]: 2025-08-13 00:34:49.115 [INFO][4658] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 
handle="k8s-pod-network.edb6a9ee6d518cc27b3171e932208d05911fdd7b97d1f90ed85735853c03bd49" host="localhost" Aug 13 00:34:49.160748 containerd[1618]: 2025-08-13 00:34:49.115 [INFO][4658] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.edb6a9ee6d518cc27b3171e932208d05911fdd7b97d1f90ed85735853c03bd49" host="localhost" Aug 13 00:34:49.160748 containerd[1618]: 2025-08-13 00:34:49.115 [INFO][4658] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:34:49.160748 containerd[1618]: 2025-08-13 00:34:49.115 [INFO][4658] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="edb6a9ee6d518cc27b3171e932208d05911fdd7b97d1f90ed85735853c03bd49" HandleID="k8s-pod-network.edb6a9ee6d518cc27b3171e932208d05911fdd7b97d1f90ed85735853c03bd49" Workload="localhost-k8s-coredns--7c65d6cfc9--hz27z-eth0" Aug 13 00:34:49.172671 containerd[1618]: 2025-08-13 00:34:49.119 [INFO][4621] cni-plugin/k8s.go 418: Populated endpoint ContainerID="edb6a9ee6d518cc27b3171e932208d05911fdd7b97d1f90ed85735853c03bd49" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hz27z" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--hz27z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--hz27z-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"1041bee9-c09a-4ab4-b0d0-aa6d4e1e0222", ResourceVersion:"792", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 34, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7c65d6cfc9-hz27z", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0c9bf200e53", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:34:49.172671 containerd[1618]: 2025-08-13 00:34:49.119 [INFO][4621] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="edb6a9ee6d518cc27b3171e932208d05911fdd7b97d1f90ed85735853c03bd49" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hz27z" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--hz27z-eth0" Aug 13 00:34:49.172671 containerd[1618]: 2025-08-13 00:34:49.119 [INFO][4621] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0c9bf200e53 ContainerID="edb6a9ee6d518cc27b3171e932208d05911fdd7b97d1f90ed85735853c03bd49" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hz27z" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--hz27z-eth0" Aug 13 00:34:49.172671 containerd[1618]: 2025-08-13 00:34:49.147 [INFO][4621] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="edb6a9ee6d518cc27b3171e932208d05911fdd7b97d1f90ed85735853c03bd49" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hz27z" 
WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--hz27z-eth0" Aug 13 00:34:49.172671 containerd[1618]: 2025-08-13 00:34:49.147 [INFO][4621] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="edb6a9ee6d518cc27b3171e932208d05911fdd7b97d1f90ed85735853c03bd49" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hz27z" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--hz27z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--hz27z-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"1041bee9-c09a-4ab4-b0d0-aa6d4e1e0222", ResourceVersion:"792", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 34, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"edb6a9ee6d518cc27b3171e932208d05911fdd7b97d1f90ed85735853c03bd49", Pod:"coredns-7c65d6cfc9-hz27z", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0c9bf200e53", MAC:"ae:2f:4b:dc:01:19", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:34:49.172671 containerd[1618]: 2025-08-13 00:34:49.159 [INFO][4621] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="edb6a9ee6d518cc27b3171e932208d05911fdd7b97d1f90ed85735853c03bd49" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hz27z" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--hz27z-eth0" Aug 13 00:34:49.193835 systemd-networkd[1540]: caliaf312a9036b: Link UP Aug 13 00:34:49.194862 systemd-networkd[1540]: caliaf312a9036b: Gained carrier Aug 13 00:34:49.212081 containerd[1618]: 2025-08-13 00:34:48.978 [INFO][4636] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Aug 13 00:34:49.212081 containerd[1618]: 2025-08-13 00:34:49.003 [INFO][4636] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--b9db99d59--fb8ql-eth0 calico-apiserver-b9db99d59- calico-apiserver 03280d1f-7157-4e73-820c-3c79157bccae 791 0 2025-08-13 00:34:21 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:b9db99d59 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-b9db99d59-fb8ql eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] caliaf312a9036b [] [] }} ContainerID="891d150c2b58625c8dba1a933ad2a77d58b14154a204038d239572c05ac4bc26" Namespace="calico-apiserver" Pod="calico-apiserver-b9db99d59-fb8ql" WorkloadEndpoint="localhost-k8s-calico--apiserver--b9db99d59--fb8ql-" Aug 13 00:34:49.212081 containerd[1618]: 2025-08-13 00:34:49.003 [INFO][4636] cni-plugin/k8s.go 74: 
Extracted identifiers for CmdAddK8s ContainerID="891d150c2b58625c8dba1a933ad2a77d58b14154a204038d239572c05ac4bc26" Namespace="calico-apiserver" Pod="calico-apiserver-b9db99d59-fb8ql" WorkloadEndpoint="localhost-k8s-calico--apiserver--b9db99d59--fb8ql-eth0" Aug 13 00:34:49.212081 containerd[1618]: 2025-08-13 00:34:49.045 [INFO][4669] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="891d150c2b58625c8dba1a933ad2a77d58b14154a204038d239572c05ac4bc26" HandleID="k8s-pod-network.891d150c2b58625c8dba1a933ad2a77d58b14154a204038d239572c05ac4bc26" Workload="localhost-k8s-calico--apiserver--b9db99d59--fb8ql-eth0" Aug 13 00:34:49.212081 containerd[1618]: 2025-08-13 00:34:49.045 [INFO][4669] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="891d150c2b58625c8dba1a933ad2a77d58b14154a204038d239572c05ac4bc26" HandleID="k8s-pod-network.891d150c2b58625c8dba1a933ad2a77d58b14154a204038d239572c05ac4bc26" Workload="localhost-k8s-calico--apiserver--b9db99d59--fb8ql-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5600), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-b9db99d59-fb8ql", "timestamp":"2025-08-13 00:34:49.045003148 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:34:49.212081 containerd[1618]: 2025-08-13 00:34:49.045 [INFO][4669] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:34:49.212081 containerd[1618]: 2025-08-13 00:34:49.115 [INFO][4669] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 13 00:34:49.212081 containerd[1618]: 2025-08-13 00:34:49.115 [INFO][4669] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Aug 13 00:34:49.212081 containerd[1618]: 2025-08-13 00:34:49.128 [INFO][4669] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.891d150c2b58625c8dba1a933ad2a77d58b14154a204038d239572c05ac4bc26" host="localhost" Aug 13 00:34:49.212081 containerd[1618]: 2025-08-13 00:34:49.168 [INFO][4669] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Aug 13 00:34:49.212081 containerd[1618]: 2025-08-13 00:34:49.174 [INFO][4669] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Aug 13 00:34:49.212081 containerd[1618]: 2025-08-13 00:34:49.175 [INFO][4669] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Aug 13 00:34:49.212081 containerd[1618]: 2025-08-13 00:34:49.177 [INFO][4669] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Aug 13 00:34:49.212081 containerd[1618]: 2025-08-13 00:34:49.177 [INFO][4669] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.891d150c2b58625c8dba1a933ad2a77d58b14154a204038d239572c05ac4bc26" host="localhost" Aug 13 00:34:49.212081 containerd[1618]: 2025-08-13 00:34:49.178 [INFO][4669] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.891d150c2b58625c8dba1a933ad2a77d58b14154a204038d239572c05ac4bc26 Aug 13 00:34:49.212081 containerd[1618]: 2025-08-13 00:34:49.181 [INFO][4669] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.891d150c2b58625c8dba1a933ad2a77d58b14154a204038d239572c05ac4bc26" host="localhost" Aug 13 00:34:49.212081 containerd[1618]: 2025-08-13 00:34:49.186 [INFO][4669] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 
handle="k8s-pod-network.891d150c2b58625c8dba1a933ad2a77d58b14154a204038d239572c05ac4bc26" host="localhost" Aug 13 00:34:49.212081 containerd[1618]: 2025-08-13 00:34:49.186 [INFO][4669] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.891d150c2b58625c8dba1a933ad2a77d58b14154a204038d239572c05ac4bc26" host="localhost" Aug 13 00:34:49.212081 containerd[1618]: 2025-08-13 00:34:49.186 [INFO][4669] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:34:49.212081 containerd[1618]: 2025-08-13 00:34:49.186 [INFO][4669] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="891d150c2b58625c8dba1a933ad2a77d58b14154a204038d239572c05ac4bc26" HandleID="k8s-pod-network.891d150c2b58625c8dba1a933ad2a77d58b14154a204038d239572c05ac4bc26" Workload="localhost-k8s-calico--apiserver--b9db99d59--fb8ql-eth0" Aug 13 00:34:49.215063 containerd[1618]: 2025-08-13 00:34:49.189 [INFO][4636] cni-plugin/k8s.go 418: Populated endpoint ContainerID="891d150c2b58625c8dba1a933ad2a77d58b14154a204038d239572c05ac4bc26" Namespace="calico-apiserver" Pod="calico-apiserver-b9db99d59-fb8ql" WorkloadEndpoint="localhost-k8s-calico--apiserver--b9db99d59--fb8ql-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--b9db99d59--fb8ql-eth0", GenerateName:"calico-apiserver-b9db99d59-", Namespace:"calico-apiserver", SelfLink:"", UID:"03280d1f-7157-4e73-820c-3c79157bccae", ResourceVersion:"791", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 34, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"b9db99d59", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-b9db99d59-fb8ql", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliaf312a9036b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:34:49.215063 containerd[1618]: 2025-08-13 00:34:49.189 [INFO][4636] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="891d150c2b58625c8dba1a933ad2a77d58b14154a204038d239572c05ac4bc26" Namespace="calico-apiserver" Pod="calico-apiserver-b9db99d59-fb8ql" WorkloadEndpoint="localhost-k8s-calico--apiserver--b9db99d59--fb8ql-eth0" Aug 13 00:34:49.215063 containerd[1618]: 2025-08-13 00:34:49.189 [INFO][4636] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliaf312a9036b ContainerID="891d150c2b58625c8dba1a933ad2a77d58b14154a204038d239572c05ac4bc26" Namespace="calico-apiserver" Pod="calico-apiserver-b9db99d59-fb8ql" WorkloadEndpoint="localhost-k8s-calico--apiserver--b9db99d59--fb8ql-eth0" Aug 13 00:34:49.215063 containerd[1618]: 2025-08-13 00:34:49.195 [INFO][4636] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="891d150c2b58625c8dba1a933ad2a77d58b14154a204038d239572c05ac4bc26" Namespace="calico-apiserver" Pod="calico-apiserver-b9db99d59-fb8ql" WorkloadEndpoint="localhost-k8s-calico--apiserver--b9db99d59--fb8ql-eth0" Aug 13 00:34:49.215063 containerd[1618]: 2025-08-13 00:34:49.195 [INFO][4636] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to 
endpoint ContainerID="891d150c2b58625c8dba1a933ad2a77d58b14154a204038d239572c05ac4bc26" Namespace="calico-apiserver" Pod="calico-apiserver-b9db99d59-fb8ql" WorkloadEndpoint="localhost-k8s-calico--apiserver--b9db99d59--fb8ql-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--b9db99d59--fb8ql-eth0", GenerateName:"calico-apiserver-b9db99d59-", Namespace:"calico-apiserver", SelfLink:"", UID:"03280d1f-7157-4e73-820c-3c79157bccae", ResourceVersion:"791", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 34, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"b9db99d59", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"891d150c2b58625c8dba1a933ad2a77d58b14154a204038d239572c05ac4bc26", Pod:"calico-apiserver-b9db99d59-fb8ql", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliaf312a9036b", MAC:"ce:05:bc:31:c4:0b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:34:49.215063 containerd[1618]: 2025-08-13 00:34:49.210 [INFO][4636] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="891d150c2b58625c8dba1a933ad2a77d58b14154a204038d239572c05ac4bc26" Namespace="calico-apiserver" Pod="calico-apiserver-b9db99d59-fb8ql" WorkloadEndpoint="localhost-k8s-calico--apiserver--b9db99d59--fb8ql-eth0" Aug 13 00:34:49.302998 systemd-networkd[1540]: calia943fa2dc8c: Link UP Aug 13 00:34:49.303521 systemd-networkd[1540]: calia943fa2dc8c: Gained carrier Aug 13 00:34:49.311900 containerd[1618]: 2025-08-13 00:34:48.959 [INFO][4628] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Aug 13 00:34:49.311900 containerd[1618]: 2025-08-13 00:34:48.980 [INFO][4628] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--b9db99d59--mnjp4-eth0 calico-apiserver-b9db99d59- calico-apiserver 895d523e-e692-449e-960d-7285bbad6a88 790 0 2025-08-13 00:34:21 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:b9db99d59 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-b9db99d59-mnjp4 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calia943fa2dc8c [] [] }} ContainerID="71c6325c3921d81447a5da7ad08f8c8b002ef041dd449f3afd3516ba9f7db555" Namespace="calico-apiserver" Pod="calico-apiserver-b9db99d59-mnjp4" WorkloadEndpoint="localhost-k8s-calico--apiserver--b9db99d59--mnjp4-" Aug 13 00:34:49.311900 containerd[1618]: 2025-08-13 00:34:48.981 [INFO][4628] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="71c6325c3921d81447a5da7ad08f8c8b002ef041dd449f3afd3516ba9f7db555" Namespace="calico-apiserver" Pod="calico-apiserver-b9db99d59-mnjp4" WorkloadEndpoint="localhost-k8s-calico--apiserver--b9db99d59--mnjp4-eth0" Aug 13 00:34:49.311900 containerd[1618]: 2025-08-13 00:34:49.056 [INFO][4660] ipam/ipam_plugin.go 225: Calico CNI IPAM request count 
IPv4=1 IPv6=0 ContainerID="71c6325c3921d81447a5da7ad08f8c8b002ef041dd449f3afd3516ba9f7db555" HandleID="k8s-pod-network.71c6325c3921d81447a5da7ad08f8c8b002ef041dd449f3afd3516ba9f7db555" Workload="localhost-k8s-calico--apiserver--b9db99d59--mnjp4-eth0" Aug 13 00:34:49.311900 containerd[1618]: 2025-08-13 00:34:49.056 [INFO][4660] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="71c6325c3921d81447a5da7ad08f8c8b002ef041dd449f3afd3516ba9f7db555" HandleID="k8s-pod-network.71c6325c3921d81447a5da7ad08f8c8b002ef041dd449f3afd3516ba9f7db555" Workload="localhost-k8s-calico--apiserver--b9db99d59--mnjp4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00032d5e0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-b9db99d59-mnjp4", "timestamp":"2025-08-13 00:34:49.056583344 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:34:49.311900 containerd[1618]: 2025-08-13 00:34:49.056 [INFO][4660] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:34:49.311900 containerd[1618]: 2025-08-13 00:34:49.186 [INFO][4660] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 13 00:34:49.311900 containerd[1618]: 2025-08-13 00:34:49.186 [INFO][4660] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Aug 13 00:34:49.311900 containerd[1618]: 2025-08-13 00:34:49.228 [INFO][4660] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.71c6325c3921d81447a5da7ad08f8c8b002ef041dd449f3afd3516ba9f7db555" host="localhost" Aug 13 00:34:49.311900 containerd[1618]: 2025-08-13 00:34:49.269 [INFO][4660] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Aug 13 00:34:49.311900 containerd[1618]: 2025-08-13 00:34:49.272 [INFO][4660] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Aug 13 00:34:49.311900 containerd[1618]: 2025-08-13 00:34:49.273 [INFO][4660] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Aug 13 00:34:49.311900 containerd[1618]: 2025-08-13 00:34:49.275 [INFO][4660] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Aug 13 00:34:49.311900 containerd[1618]: 2025-08-13 00:34:49.275 [INFO][4660] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.71c6325c3921d81447a5da7ad08f8c8b002ef041dd449f3afd3516ba9f7db555" host="localhost" Aug 13 00:34:49.311900 containerd[1618]: 2025-08-13 00:34:49.276 [INFO][4660] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.71c6325c3921d81447a5da7ad08f8c8b002ef041dd449f3afd3516ba9f7db555 Aug 13 00:34:49.311900 containerd[1618]: 2025-08-13 00:34:49.281 [INFO][4660] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.71c6325c3921d81447a5da7ad08f8c8b002ef041dd449f3afd3516ba9f7db555" host="localhost" Aug 13 00:34:49.311900 containerd[1618]: 2025-08-13 00:34:49.292 [INFO][4660] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 
handle="k8s-pod-network.71c6325c3921d81447a5da7ad08f8c8b002ef041dd449f3afd3516ba9f7db555" host="localhost" Aug 13 00:34:49.311900 containerd[1618]: 2025-08-13 00:34:49.292 [INFO][4660] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.71c6325c3921d81447a5da7ad08f8c8b002ef041dd449f3afd3516ba9f7db555" host="localhost" Aug 13 00:34:49.311900 containerd[1618]: 2025-08-13 00:34:49.292 [INFO][4660] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:34:49.311900 containerd[1618]: 2025-08-13 00:34:49.292 [INFO][4660] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="71c6325c3921d81447a5da7ad08f8c8b002ef041dd449f3afd3516ba9f7db555" HandleID="k8s-pod-network.71c6325c3921d81447a5da7ad08f8c8b002ef041dd449f3afd3516ba9f7db555" Workload="localhost-k8s-calico--apiserver--b9db99d59--mnjp4-eth0" Aug 13 00:34:49.312363 containerd[1618]: 2025-08-13 00:34:49.301 [INFO][4628] cni-plugin/k8s.go 418: Populated endpoint ContainerID="71c6325c3921d81447a5da7ad08f8c8b002ef041dd449f3afd3516ba9f7db555" Namespace="calico-apiserver" Pod="calico-apiserver-b9db99d59-mnjp4" WorkloadEndpoint="localhost-k8s-calico--apiserver--b9db99d59--mnjp4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--b9db99d59--mnjp4-eth0", GenerateName:"calico-apiserver-b9db99d59-", Namespace:"calico-apiserver", SelfLink:"", UID:"895d523e-e692-449e-960d-7285bbad6a88", ResourceVersion:"790", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 34, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"b9db99d59", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-b9db99d59-mnjp4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia943fa2dc8c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:34:49.312363 containerd[1618]: 2025-08-13 00:34:49.301 [INFO][4628] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="71c6325c3921d81447a5da7ad08f8c8b002ef041dd449f3afd3516ba9f7db555" Namespace="calico-apiserver" Pod="calico-apiserver-b9db99d59-mnjp4" WorkloadEndpoint="localhost-k8s-calico--apiserver--b9db99d59--mnjp4-eth0" Aug 13 00:34:49.312363 containerd[1618]: 2025-08-13 00:34:49.301 [INFO][4628] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia943fa2dc8c ContainerID="71c6325c3921d81447a5da7ad08f8c8b002ef041dd449f3afd3516ba9f7db555" Namespace="calico-apiserver" Pod="calico-apiserver-b9db99d59-mnjp4" WorkloadEndpoint="localhost-k8s-calico--apiserver--b9db99d59--mnjp4-eth0" Aug 13 00:34:49.312363 containerd[1618]: 2025-08-13 00:34:49.303 [INFO][4628] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="71c6325c3921d81447a5da7ad08f8c8b002ef041dd449f3afd3516ba9f7db555" Namespace="calico-apiserver" Pod="calico-apiserver-b9db99d59-mnjp4" WorkloadEndpoint="localhost-k8s-calico--apiserver--b9db99d59--mnjp4-eth0" Aug 13 00:34:49.312363 containerd[1618]: 2025-08-13 00:34:49.303 [INFO][4628] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to 
endpoint ContainerID="71c6325c3921d81447a5da7ad08f8c8b002ef041dd449f3afd3516ba9f7db555" Namespace="calico-apiserver" Pod="calico-apiserver-b9db99d59-mnjp4" WorkloadEndpoint="localhost-k8s-calico--apiserver--b9db99d59--mnjp4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--b9db99d59--mnjp4-eth0", GenerateName:"calico-apiserver-b9db99d59-", Namespace:"calico-apiserver", SelfLink:"", UID:"895d523e-e692-449e-960d-7285bbad6a88", ResourceVersion:"790", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 34, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"b9db99d59", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"71c6325c3921d81447a5da7ad08f8c8b002ef041dd449f3afd3516ba9f7db555", Pod:"calico-apiserver-b9db99d59-mnjp4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia943fa2dc8c", MAC:"f2:78:5c:47:52:45", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:34:49.312363 containerd[1618]: 2025-08-13 00:34:49.308 [INFO][4628] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="71c6325c3921d81447a5da7ad08f8c8b002ef041dd449f3afd3516ba9f7db555" Namespace="calico-apiserver" Pod="calico-apiserver-b9db99d59-mnjp4" WorkloadEndpoint="localhost-k8s-calico--apiserver--b9db99d59--mnjp4-eth0" Aug 13 00:34:49.448819 containerd[1618]: time="2025-08-13T00:34:49.448785839Z" level=info msg="connecting to shim edb6a9ee6d518cc27b3171e932208d05911fdd7b97d1f90ed85735853c03bd49" address="unix:///run/containerd/s/86223c92ef64f017fc9f170179c92180a78d621b8786fc4ccf84257b095db2c8" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:34:49.454552 containerd[1618]: time="2025-08-13T00:34:49.454509591Z" level=info msg="connecting to shim 891d150c2b58625c8dba1a933ad2a77d58b14154a204038d239572c05ac4bc26" address="unix:///run/containerd/s/2c1a9f0ee767495be7711bafc4a7e28dbb0af0dc8d1e32416db44d1e5d3d999a" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:34:49.474848 containerd[1618]: time="2025-08-13T00:34:49.474824336Z" level=info msg="connecting to shim 71c6325c3921d81447a5da7ad08f8c8b002ef041dd449f3afd3516ba9f7db555" address="unix:///run/containerd/s/904227d4f0bd20686e2d945424e29827a6d2fe5d0b5629432b15c911435eb69d" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:34:49.475703 systemd[1]: Started cri-containerd-edb6a9ee6d518cc27b3171e932208d05911fdd7b97d1f90ed85735853c03bd49.scope - libcontainer container edb6a9ee6d518cc27b3171e932208d05911fdd7b97d1f90ed85735853c03bd49. Aug 13 00:34:49.488683 systemd[1]: Started cri-containerd-891d150c2b58625c8dba1a933ad2a77d58b14154a204038d239572c05ac4bc26.scope - libcontainer container 891d150c2b58625c8dba1a933ad2a77d58b14154a204038d239572c05ac4bc26. Aug 13 00:34:49.492678 systemd-resolved[1484]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Aug 13 00:34:49.508934 systemd[1]: Started cri-containerd-71c6325c3921d81447a5da7ad08f8c8b002ef041dd449f3afd3516ba9f7db555.scope - libcontainer container 71c6325c3921d81447a5da7ad08f8c8b002ef041dd449f3afd3516ba9f7db555. 
Aug 13 00:34:49.529874 systemd-resolved[1484]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Aug 13 00:34:49.534772 systemd-resolved[1484]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Aug 13 00:34:49.553202 containerd[1618]: time="2025-08-13T00:34:49.553108292Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-hz27z,Uid:1041bee9-c09a-4ab4-b0d0-aa6d4e1e0222,Namespace:kube-system,Attempt:0,} returns sandbox id \"edb6a9ee6d518cc27b3171e932208d05911fdd7b97d1f90ed85735853c03bd49\"" Aug 13 00:34:49.570523 containerd[1618]: time="2025-08-13T00:34:49.570499643Z" level=info msg="CreateContainer within sandbox \"edb6a9ee6d518cc27b3171e932208d05911fdd7b97d1f90ed85735853c03bd49\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Aug 13 00:34:49.584363 containerd[1618]: time="2025-08-13T00:34:49.584266840Z" level=info msg="Container fcc436d9715b62d0950ee734ad8f99fe798e09571728763871d14e1da25e279d: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:34:49.591407 containerd[1618]: time="2025-08-13T00:34:49.591388054Z" level=info msg="CreateContainer within sandbox \"edb6a9ee6d518cc27b3171e932208d05911fdd7b97d1f90ed85735853c03bd49\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"fcc436d9715b62d0950ee734ad8f99fe798e09571728763871d14e1da25e279d\"" Aug 13 00:34:49.592911 containerd[1618]: time="2025-08-13T00:34:49.592899047Z" level=info msg="StartContainer for \"fcc436d9715b62d0950ee734ad8f99fe798e09571728763871d14e1da25e279d\"" Aug 13 00:34:49.594448 containerd[1618]: time="2025-08-13T00:34:49.594425594Z" level=info msg="connecting to shim fcc436d9715b62d0950ee734ad8f99fe798e09571728763871d14e1da25e279d" address="unix:///run/containerd/s/86223c92ef64f017fc9f170179c92180a78d621b8786fc4ccf84257b095db2c8" protocol=ttrpc version=3 Aug 13 00:34:49.617280 containerd[1618]: time="2025-08-13T00:34:49.616793214Z" level=info 
msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b9db99d59-fb8ql,Uid:03280d1f-7157-4e73-820c-3c79157bccae,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"891d150c2b58625c8dba1a933ad2a77d58b14154a204038d239572c05ac4bc26\"" Aug 13 00:34:49.618838 containerd[1618]: time="2025-08-13T00:34:49.618756300Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b9db99d59-mnjp4,Uid:895d523e-e692-449e-960d-7285bbad6a88,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"71c6325c3921d81447a5da7ad08f8c8b002ef041dd449f3afd3516ba9f7db555\"" Aug 13 00:34:49.622675 systemd[1]: Started cri-containerd-fcc436d9715b62d0950ee734ad8f99fe798e09571728763871d14e1da25e279d.scope - libcontainer container fcc436d9715b62d0950ee734ad8f99fe798e09571728763871d14e1da25e279d. Aug 13 00:34:49.671371 containerd[1618]: time="2025-08-13T00:34:49.671308432Z" level=info msg="StartContainer for \"fcc436d9715b62d0950ee734ad8f99fe798e09571728763871d14e1da25e279d\" returns successfully" Aug 13 00:34:50.178477 containerd[1618]: time="2025-08-13T00:34:50.178446258Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:50.178951 containerd[1618]: time="2025-08-13T00:34:50.178794245Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.2: active requests=0, bytes read=51276688" Aug 13 00:34:50.179129 containerd[1618]: time="2025-08-13T00:34:50.179084390Z" level=info msg="ImageCreate event name:\"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:50.182118 containerd[1618]: time="2025-08-13T00:34:50.182086938Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 
13 00:34:50.182900 containerd[1618]: time="2025-08-13T00:34:50.182877707Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" with image id \"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\", size \"52769359\" in 2.514292341s" Aug 13 00:34:50.182900 containerd[1618]: time="2025-08-13T00:34:50.182901147Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" returns image reference \"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\"" Aug 13 00:34:50.184127 containerd[1618]: time="2025-08-13T00:34:50.184110377Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\"" Aug 13 00:34:50.187249 systemd-networkd[1540]: cali0c9bf200e53: Gained IPv6LL Aug 13 00:34:50.215261 containerd[1618]: time="2025-08-13T00:34:50.215214153Z" level=info msg="CreateContainer within sandbox \"e2b5ab22d0a740e1bf9298e3fec42edad88ffb346b221fd1889a456b4bfce1c4\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Aug 13 00:34:50.221317 kubelet[2927]: I0813 00:34:50.219309 2927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-hz27z" podStartSLOduration=38.21929347 podStartE2EDuration="38.21929347s" podCreationTimestamp="2025-08-13 00:34:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 00:34:50.188138325 +0000 UTC m=+42.605414356" watchObservedRunningTime="2025-08-13 00:34:50.21929347 +0000 UTC m=+42.636569509" Aug 13 00:34:50.257007 containerd[1618]: time="2025-08-13T00:34:50.255778533Z" level=info msg="Container 7b45ac71c541ceb9d876d5c6634c191af64f19092b65f586e899add26c7b1593: CDI devices from CRI Config.CDIDevices: 
[]" Aug 13 00:34:50.261080 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3872260920.mount: Deactivated successfully. Aug 13 00:34:50.268363 containerd[1618]: time="2025-08-13T00:34:50.268323972Z" level=info msg="CreateContainer within sandbox \"e2b5ab22d0a740e1bf9298e3fec42edad88ffb346b221fd1889a456b4bfce1c4\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"7b45ac71c541ceb9d876d5c6634c191af64f19092b65f586e899add26c7b1593\"" Aug 13 00:34:50.269263 containerd[1618]: time="2025-08-13T00:34:50.268701581Z" level=info msg="StartContainer for \"7b45ac71c541ceb9d876d5c6634c191af64f19092b65f586e899add26c7b1593\"" Aug 13 00:34:50.269874 containerd[1618]: time="2025-08-13T00:34:50.269838368Z" level=info msg="connecting to shim 7b45ac71c541ceb9d876d5c6634c191af64f19092b65f586e899add26c7b1593" address="unix:///run/containerd/s/76326d42fe275184a7e09482e917ecf5e8eeb3acd8c6555b56dcafbdfa5bc515" protocol=ttrpc version=3 Aug 13 00:34:50.290647 systemd[1]: Started cri-containerd-7b45ac71c541ceb9d876d5c6634c191af64f19092b65f586e899add26c7b1593.scope - libcontainer container 7b45ac71c541ceb9d876d5c6634c191af64f19092b65f586e899add26c7b1593. 
Aug 13 00:34:50.352405 containerd[1618]: time="2025-08-13T00:34:50.352364681Z" level=info msg="StartContainer for \"7b45ac71c541ceb9d876d5c6634c191af64f19092b65f586e899add26c7b1593\" returns successfully" Aug 13 00:34:50.790548 containerd[1618]: time="2025-08-13T00:34:50.789880601Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-24x4b,Uid:13ac1461-2756-4b70-ab78-2c6b6a45f530,Namespace:calico-system,Attempt:0,}" Aug 13 00:34:50.925692 systemd-networkd[1540]: calidd04705ea97: Link UP Aug 13 00:34:50.925925 systemd-networkd[1540]: calidd04705ea97: Gained carrier Aug 13 00:34:50.936811 containerd[1618]: 2025-08-13 00:34:50.831 [INFO][4957] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Aug 13 00:34:50.936811 containerd[1618]: 2025-08-13 00:34:50.838 [INFO][4957] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--24x4b-eth0 csi-node-driver- calico-system 13ac1461-2756-4b70-ab78-2c6b6a45f530 680 0 2025-08-13 00:34:23 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:57bd658777 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-24x4b eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calidd04705ea97 [] [] }} ContainerID="6f59a2a3832d96a5d812d685a680f57a27a9b73896d469c899654c1edb2c768d" Namespace="calico-system" Pod="csi-node-driver-24x4b" WorkloadEndpoint="localhost-k8s-csi--node--driver--24x4b-" Aug 13 00:34:50.936811 containerd[1618]: 2025-08-13 00:34:50.838 [INFO][4957] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6f59a2a3832d96a5d812d685a680f57a27a9b73896d469c899654c1edb2c768d" Namespace="calico-system" Pod="csi-node-driver-24x4b" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--24x4b-eth0" Aug 13 00:34:50.936811 containerd[1618]: 2025-08-13 00:34:50.869 [INFO][4973] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6f59a2a3832d96a5d812d685a680f57a27a9b73896d469c899654c1edb2c768d" HandleID="k8s-pod-network.6f59a2a3832d96a5d812d685a680f57a27a9b73896d469c899654c1edb2c768d" Workload="localhost-k8s-csi--node--driver--24x4b-eth0" Aug 13 00:34:50.936811 containerd[1618]: 2025-08-13 00:34:50.869 [INFO][4973] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6f59a2a3832d96a5d812d685a680f57a27a9b73896d469c899654c1edb2c768d" HandleID="k8s-pod-network.6f59a2a3832d96a5d812d685a680f57a27a9b73896d469c899654c1edb2c768d" Workload="localhost-k8s-csi--node--driver--24x4b-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f7f0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-24x4b", "timestamp":"2025-08-13 00:34:50.869295257 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:34:50.936811 containerd[1618]: 2025-08-13 00:34:50.869 [INFO][4973] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:34:50.936811 containerd[1618]: 2025-08-13 00:34:50.869 [INFO][4973] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 13 00:34:50.936811 containerd[1618]: 2025-08-13 00:34:50.869 [INFO][4973] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Aug 13 00:34:50.936811 containerd[1618]: 2025-08-13 00:34:50.874 [INFO][4973] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6f59a2a3832d96a5d812d685a680f57a27a9b73896d469c899654c1edb2c768d" host="localhost" Aug 13 00:34:50.936811 containerd[1618]: 2025-08-13 00:34:50.885 [INFO][4973] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Aug 13 00:34:50.936811 containerd[1618]: 2025-08-13 00:34:50.888 [INFO][4973] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Aug 13 00:34:50.936811 containerd[1618]: 2025-08-13 00:34:50.889 [INFO][4973] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Aug 13 00:34:50.936811 containerd[1618]: 2025-08-13 00:34:50.890 [INFO][4973] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Aug 13 00:34:50.936811 containerd[1618]: 2025-08-13 00:34:50.890 [INFO][4973] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.6f59a2a3832d96a5d812d685a680f57a27a9b73896d469c899654c1edb2c768d" host="localhost" Aug 13 00:34:50.936811 containerd[1618]: 2025-08-13 00:34:50.891 [INFO][4973] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.6f59a2a3832d96a5d812d685a680f57a27a9b73896d469c899654c1edb2c768d Aug 13 00:34:50.936811 containerd[1618]: 2025-08-13 00:34:50.905 [INFO][4973] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.6f59a2a3832d96a5d812d685a680f57a27a9b73896d469c899654c1edb2c768d" host="localhost" Aug 13 00:34:50.936811 containerd[1618]: 2025-08-13 00:34:50.921 [INFO][4973] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 
handle="k8s-pod-network.6f59a2a3832d96a5d812d685a680f57a27a9b73896d469c899654c1edb2c768d" host="localhost" Aug 13 00:34:50.936811 containerd[1618]: 2025-08-13 00:34:50.921 [INFO][4973] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.6f59a2a3832d96a5d812d685a680f57a27a9b73896d469c899654c1edb2c768d" host="localhost" Aug 13 00:34:50.936811 containerd[1618]: 2025-08-13 00:34:50.921 [INFO][4973] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:34:50.936811 containerd[1618]: 2025-08-13 00:34:50.921 [INFO][4973] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="6f59a2a3832d96a5d812d685a680f57a27a9b73896d469c899654c1edb2c768d" HandleID="k8s-pod-network.6f59a2a3832d96a5d812d685a680f57a27a9b73896d469c899654c1edb2c768d" Workload="localhost-k8s-csi--node--driver--24x4b-eth0" Aug 13 00:34:50.947087 containerd[1618]: 2025-08-13 00:34:50.923 [INFO][4957] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6f59a2a3832d96a5d812d685a680f57a27a9b73896d469c899654c1edb2c768d" Namespace="calico-system" Pod="csi-node-driver-24x4b" WorkloadEndpoint="localhost-k8s-csi--node--driver--24x4b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--24x4b-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"13ac1461-2756-4b70-ab78-2c6b6a45f530", ResourceVersion:"680", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 34, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"57bd658777", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-24x4b", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calidd04705ea97", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:34:50.947087 containerd[1618]: 2025-08-13 00:34:50.923 [INFO][4957] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="6f59a2a3832d96a5d812d685a680f57a27a9b73896d469c899654c1edb2c768d" Namespace="calico-system" Pod="csi-node-driver-24x4b" WorkloadEndpoint="localhost-k8s-csi--node--driver--24x4b-eth0" Aug 13 00:34:50.947087 containerd[1618]: 2025-08-13 00:34:50.923 [INFO][4957] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidd04705ea97 ContainerID="6f59a2a3832d96a5d812d685a680f57a27a9b73896d469c899654c1edb2c768d" Namespace="calico-system" Pod="csi-node-driver-24x4b" WorkloadEndpoint="localhost-k8s-csi--node--driver--24x4b-eth0" Aug 13 00:34:50.947087 containerd[1618]: 2025-08-13 00:34:50.925 [INFO][4957] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6f59a2a3832d96a5d812d685a680f57a27a9b73896d469c899654c1edb2c768d" Namespace="calico-system" Pod="csi-node-driver-24x4b" WorkloadEndpoint="localhost-k8s-csi--node--driver--24x4b-eth0" Aug 13 00:34:50.947087 containerd[1618]: 2025-08-13 00:34:50.925 [INFO][4957] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6f59a2a3832d96a5d812d685a680f57a27a9b73896d469c899654c1edb2c768d" 
Namespace="calico-system" Pod="csi-node-driver-24x4b" WorkloadEndpoint="localhost-k8s-csi--node--driver--24x4b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--24x4b-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"13ac1461-2756-4b70-ab78-2c6b6a45f530", ResourceVersion:"680", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 34, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"57bd658777", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"6f59a2a3832d96a5d812d685a680f57a27a9b73896d469c899654c1edb2c768d", Pod:"csi-node-driver-24x4b", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calidd04705ea97", MAC:"76:8d:5c:85:69:50", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:34:50.947087 containerd[1618]: 2025-08-13 00:34:50.932 [INFO][4957] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6f59a2a3832d96a5d812d685a680f57a27a9b73896d469c899654c1edb2c768d" Namespace="calico-system" Pod="csi-node-driver-24x4b" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--24x4b-eth0" Aug 13 00:34:50.971506 containerd[1618]: time="2025-08-13T00:34:50.971475126Z" level=info msg="connecting to shim 6f59a2a3832d96a5d812d685a680f57a27a9b73896d469c899654c1edb2c768d" address="unix:///run/containerd/s/8ac56ecd458d47bb372c83830c3100acc12de4433907c7883e960fc30e6e42da" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:34:50.994677 systemd[1]: Started cri-containerd-6f59a2a3832d96a5d812d685a680f57a27a9b73896d469c899654c1edb2c768d.scope - libcontainer container 6f59a2a3832d96a5d812d685a680f57a27a9b73896d469c899654c1edb2c768d. Aug 13 00:34:51.003378 systemd-resolved[1484]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Aug 13 00:34:51.013219 containerd[1618]: time="2025-08-13T00:34:51.013198747Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-24x4b,Uid:13ac1461-2756-4b70-ab78-2c6b6a45f530,Namespace:calico-system,Attempt:0,} returns sandbox id \"6f59a2a3832d96a5d812d685a680f57a27a9b73896d469c899654c1edb2c768d\"" Aug 13 00:34:51.146712 systemd-networkd[1540]: caliaf312a9036b: Gained IPv6LL Aug 13 00:34:51.210649 systemd-networkd[1540]: calia943fa2dc8c: Gained IPv6LL Aug 13 00:34:52.204497 kubelet[2927]: I0813 00:34:52.204460 2927 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 13 00:34:52.234774 systemd-networkd[1540]: calidd04705ea97: Gained IPv6LL Aug 13 00:34:52.564052 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3816236919.mount: Deactivated successfully. 
Aug 13 00:34:52.782455 kubelet[2927]: I0813 00:34:52.782430 2927 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 13 00:34:52.994259 kubelet[2927]: I0813 00:34:52.994212 2927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-7cd45fb679-bvd9c" podStartSLOduration=25.761127254 podStartE2EDuration="28.989316984s" podCreationTimestamp="2025-08-13 00:34:24 +0000 UTC" firstStartedPulling="2025-08-13 00:34:46.955233474 +0000 UTC m=+39.372509503" lastFinishedPulling="2025-08-13 00:34:50.183423203 +0000 UTC m=+42.600699233" observedRunningTime="2025-08-13 00:34:51.219428382 +0000 UTC m=+43.636704419" watchObservedRunningTime="2025-08-13 00:34:52.989316984 +0000 UTC m=+45.406593017" Aug 13 00:34:53.446903 kubelet[2927]: I0813 00:34:53.446863 2927 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 13 00:34:53.632027 containerd[1618]: time="2025-08-13T00:34:53.631963143Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7b45ac71c541ceb9d876d5c6634c191af64f19092b65f586e899add26c7b1593\" id:\"c7c4bf2d3c1810105e009b26235cd333392c0eff067b886fc576ad9c857dc96e\" pid:5101 exited_at:{seconds:1755045293 nanos:599304459}" Aug 13 00:34:53.671253 containerd[1618]: time="2025-08-13T00:34:53.671205029Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7b45ac71c541ceb9d876d5c6634c191af64f19092b65f586e899add26c7b1593\" id:\"eb8fe32280edd056c1b109c145060cb4d430082633ccf9644bb58700fa8daab9\" pid:5124 exited_at:{seconds:1755045293 nanos:670639820}" Aug 13 00:34:53.974786 containerd[1618]: time="2025-08-13T00:34:53.974616983Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:53.982665 containerd[1618]: time="2025-08-13T00:34:53.982490051Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.2: active requests=0, bytes 
read=66352308" Aug 13 00:34:54.006109 containerd[1618]: time="2025-08-13T00:34:54.006003239Z" level=info msg="ImageCreate event name:\"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:54.007663 containerd[1618]: time="2025-08-13T00:34:54.007643779Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:54.010057 containerd[1618]: time="2025-08-13T00:34:54.009414859Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" with image id \"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\", size \"66352154\" in 3.824152905s" Aug 13 00:34:54.010057 containerd[1618]: time="2025-08-13T00:34:54.009434994Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" returns image reference \"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\"" Aug 13 00:34:54.017609 containerd[1618]: time="2025-08-13T00:34:54.017496915Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Aug 13 00:34:54.024635 containerd[1618]: time="2025-08-13T00:34:54.024607423Z" level=info msg="CreateContainer within sandbox \"69055495c57b3e2732336e45aeb61291f3020686ec05b81611f3f86252a5fc0e\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Aug 13 00:34:54.070676 containerd[1618]: time="2025-08-13T00:34:54.070652021Z" level=info msg="Container e1255f5f8dbd399bff1c57cb4cd5754e1e5610efd93218b30c9d657ccc201ea7: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:34:54.072561 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3090787708.mount: Deactivated 
successfully. Aug 13 00:34:54.093767 containerd[1618]: time="2025-08-13T00:34:54.093719526Z" level=info msg="CreateContainer within sandbox \"69055495c57b3e2732336e45aeb61291f3020686ec05b81611f3f86252a5fc0e\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"e1255f5f8dbd399bff1c57cb4cd5754e1e5610efd93218b30c9d657ccc201ea7\"" Aug 13 00:34:54.095394 containerd[1618]: time="2025-08-13T00:34:54.094227496Z" level=info msg="StartContainer for \"e1255f5f8dbd399bff1c57cb4cd5754e1e5610efd93218b30c9d657ccc201ea7\"" Aug 13 00:34:54.097231 containerd[1618]: time="2025-08-13T00:34:54.097174619Z" level=info msg="connecting to shim e1255f5f8dbd399bff1c57cb4cd5754e1e5610efd93218b30c9d657ccc201ea7" address="unix:///run/containerd/s/a7bd0a69082383405bd2eb95df16c0e3b82cdeaa448800dd0000847a0cc9d383" protocol=ttrpc version=3 Aug 13 00:34:54.154681 systemd[1]: Started cri-containerd-e1255f5f8dbd399bff1c57cb4cd5754e1e5610efd93218b30c9d657ccc201ea7.scope - libcontainer container e1255f5f8dbd399bff1c57cb4cd5754e1e5610efd93218b30c9d657ccc201ea7. 
Aug 13 00:34:54.217960 containerd[1618]: time="2025-08-13T00:34:54.217930403Z" level=info msg="StartContainer for \"e1255f5f8dbd399bff1c57cb4cd5754e1e5610efd93218b30c9d657ccc201ea7\" returns successfully" Aug 13 00:34:54.499771 kubelet[2927]: I0813 00:34:54.499633 2927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-58fd7646b9-c4ngc" podStartSLOduration=26.470060137 podStartE2EDuration="32.499621292s" podCreationTimestamp="2025-08-13 00:34:22 +0000 UTC" firstStartedPulling="2025-08-13 00:34:47.982333077 +0000 UTC m=+40.399609105" lastFinishedPulling="2025-08-13 00:34:54.011894231 +0000 UTC m=+46.429170260" observedRunningTime="2025-08-13 00:34:54.499590705 +0000 UTC m=+46.916866745" watchObservedRunningTime="2025-08-13 00:34:54.499621292 +0000 UTC m=+46.916897330" Aug 13 00:34:54.564937 containerd[1618]: time="2025-08-13T00:34:54.564911952Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e1255f5f8dbd399bff1c57cb4cd5754e1e5610efd93218b30c9d657ccc201ea7\" id:\"c6e78b7610b978d80866315eaf39aa8146ca7acd84912e8d064a16eaf7d39095\" pid:5233 exit_status:1 exited_at:{seconds:1755045294 nanos:564675898}" Aug 13 00:34:54.677462 systemd-networkd[1540]: vxlan.calico: Link UP Aug 13 00:34:54.677468 systemd-networkd[1540]: vxlan.calico: Gained carrier Aug 13 00:34:55.565348 containerd[1618]: time="2025-08-13T00:34:55.565323826Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e1255f5f8dbd399bff1c57cb4cd5754e1e5610efd93218b30c9d657ccc201ea7\" id:\"869104b6d2d1b49dd16591fe006fb12ac77a3454d49558cb5b29c774e32ba07d\" pid:5336 exit_status:1 exited_at:{seconds:1755045295 nanos:564978883}" Aug 13 00:34:55.818654 systemd-networkd[1540]: vxlan.calico: Gained IPv6LL Aug 13 00:34:56.725441 containerd[1618]: time="2025-08-13T00:34:56.725314036Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e1255f5f8dbd399bff1c57cb4cd5754e1e5610efd93218b30c9d657ccc201ea7\" 
id:\"2f4c1703bbbdf997a843c3b5b6f0f822c119cf272cf45d37188efb7fd51457d5\" pid:5366 exit_status:1 exited_at:{seconds:1755045296 nanos:725037024}" Aug 13 00:34:56.867114 containerd[1618]: time="2025-08-13T00:34:56.866621431Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:56.867264 containerd[1618]: time="2025-08-13T00:34:56.867251410Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=47317977" Aug 13 00:34:56.868443 containerd[1618]: time="2025-08-13T00:34:56.868426962Z" level=info msg="ImageCreate event name:\"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:56.869639 containerd[1618]: time="2025-08-13T00:34:56.869624332Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:56.870462 containerd[1618]: time="2025-08-13T00:34:56.870449441Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 2.852930578s" Aug 13 00:34:56.870518 containerd[1618]: time="2025-08-13T00:34:56.870506889Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Aug 13 00:34:56.908512 containerd[1618]: time="2025-08-13T00:34:56.908484263Z" level=info msg="CreateContainer within sandbox 
\"891d150c2b58625c8dba1a933ad2a77d58b14154a204038d239572c05ac4bc26\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Aug 13 00:34:56.910797 containerd[1618]: time="2025-08-13T00:34:56.910780619Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Aug 13 00:34:56.922127 containerd[1618]: time="2025-08-13T00:34:56.921629710Z" level=info msg="Container e0f3eaff67fdb4ed958de77d69599df65add409f2f49198d6a236fb5733464d0: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:34:56.928049 containerd[1618]: time="2025-08-13T00:34:56.928018585Z" level=info msg="CreateContainer within sandbox \"891d150c2b58625c8dba1a933ad2a77d58b14154a204038d239572c05ac4bc26\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"e0f3eaff67fdb4ed958de77d69599df65add409f2f49198d6a236fb5733464d0\"" Aug 13 00:34:56.929526 containerd[1618]: time="2025-08-13T00:34:56.929499994Z" level=info msg="StartContainer for \"e0f3eaff67fdb4ed958de77d69599df65add409f2f49198d6a236fb5733464d0\"" Aug 13 00:34:56.930673 containerd[1618]: time="2025-08-13T00:34:56.930642449Z" level=info msg="connecting to shim e0f3eaff67fdb4ed958de77d69599df65add409f2f49198d6a236fb5733464d0" address="unix:///run/containerd/s/2c1a9f0ee767495be7711bafc4a7e28dbb0af0dc8d1e32416db44d1e5d3d999a" protocol=ttrpc version=3 Aug 13 00:34:56.954658 systemd[1]: Started cri-containerd-e0f3eaff67fdb4ed958de77d69599df65add409f2f49198d6a236fb5733464d0.scope - libcontainer container e0f3eaff67fdb4ed958de77d69599df65add409f2f49198d6a236fb5733464d0. 
Aug 13 00:34:57.013461 containerd[1618]: time="2025-08-13T00:34:57.013394884Z" level=info msg="StartContainer for \"e0f3eaff67fdb4ed958de77d69599df65add409f2f49198d6a236fb5733464d0\" returns successfully" Aug 13 00:34:57.391211 containerd[1618]: time="2025-08-13T00:34:57.389867435Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:57.392949 containerd[1618]: time="2025-08-13T00:34:57.392739565Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=77" Aug 13 00:34:57.394108 containerd[1618]: time="2025-08-13T00:34:57.394088042Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 483.288622ms" Aug 13 00:34:57.399638 containerd[1618]: time="2025-08-13T00:34:57.394187873Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Aug 13 00:34:57.405712 containerd[1618]: time="2025-08-13T00:34:57.405557274Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\"" Aug 13 00:34:57.406336 containerd[1618]: time="2025-08-13T00:34:57.406181511Z" level=info msg="CreateContainer within sandbox \"71c6325c3921d81447a5da7ad08f8c8b002ef041dd449f3afd3516ba9f7db555\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Aug 13 00:34:57.475752 containerd[1618]: time="2025-08-13T00:34:57.472855453Z" level=info msg="Container 063371c97a5e512614a2a48c3918230b65dc22eb899c4bf71b0a18d52bbf7a41: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:34:57.475272 systemd[1]: 
var-lib-containerd-tmpmounts-containerd\x2dmount3564103250.mount: Deactivated successfully. Aug 13 00:34:57.482583 containerd[1618]: time="2025-08-13T00:34:57.482560416Z" level=info msg="CreateContainer within sandbox \"71c6325c3921d81447a5da7ad08f8c8b002ef041dd449f3afd3516ba9f7db555\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"063371c97a5e512614a2a48c3918230b65dc22eb899c4bf71b0a18d52bbf7a41\"" Aug 13 00:34:57.483093 containerd[1618]: time="2025-08-13T00:34:57.483064898Z" level=info msg="StartContainer for \"063371c97a5e512614a2a48c3918230b65dc22eb899c4bf71b0a18d52bbf7a41\"" Aug 13 00:34:57.487902 containerd[1618]: time="2025-08-13T00:34:57.487616788Z" level=info msg="connecting to shim 063371c97a5e512614a2a48c3918230b65dc22eb899c4bf71b0a18d52bbf7a41" address="unix:///run/containerd/s/904227d4f0bd20686e2d945424e29827a6d2fe5d0b5629432b15c911435eb69d" protocol=ttrpc version=3 Aug 13 00:34:57.518758 systemd[1]: Started cri-containerd-063371c97a5e512614a2a48c3918230b65dc22eb899c4bf71b0a18d52bbf7a41.scope - libcontainer container 063371c97a5e512614a2a48c3918230b65dc22eb899c4bf71b0a18d52bbf7a41. 
Aug 13 00:34:57.581951 containerd[1618]: time="2025-08-13T00:34:57.581906348Z" level=info msg="StartContainer for \"063371c97a5e512614a2a48c3918230b65dc22eb899c4bf71b0a18d52bbf7a41\" returns successfully" Aug 13 00:34:57.591317 kubelet[2927]: I0813 00:34:57.591155 2927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-b9db99d59-fb8ql" podStartSLOduration=29.294555944 podStartE2EDuration="36.579726256s" podCreationTimestamp="2025-08-13 00:34:21 +0000 UTC" firstStartedPulling="2025-08-13 00:34:49.618900701 +0000 UTC m=+42.036176730" lastFinishedPulling="2025-08-13 00:34:56.904071012 +0000 UTC m=+49.321347042" observedRunningTime="2025-08-13 00:34:57.579189499 +0000 UTC m=+49.996465536" watchObservedRunningTime="2025-08-13 00:34:57.579726256 +0000 UTC m=+49.997002295" Aug 13 00:34:58.567454 kubelet[2927]: I0813 00:34:58.567414 2927 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 13 00:34:58.584099 kubelet[2927]: I0813 00:34:58.581094 2927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-b9db99d59-mnjp4" podStartSLOduration=29.801637222 podStartE2EDuration="37.574090578s" podCreationTimestamp="2025-08-13 00:34:21 +0000 UTC" firstStartedPulling="2025-08-13 00:34:49.62224494 +0000 UTC m=+42.039520969" lastFinishedPulling="2025-08-13 00:34:57.394698295 +0000 UTC m=+49.811974325" observedRunningTime="2025-08-13 00:34:58.572545706 +0000 UTC m=+50.989821739" watchObservedRunningTime="2025-08-13 00:34:58.574090578 +0000 UTC m=+50.991366611" Aug 13 00:35:00.322258 containerd[1618]: time="2025-08-13T00:35:00.322226162Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:35:00.322726 containerd[1618]: time="2025-08-13T00:35:00.322645471Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.2: active requests=0, bytes 
read=8759190" Aug 13 00:35:00.323006 containerd[1618]: time="2025-08-13T00:35:00.322988640Z" level=info msg="ImageCreate event name:\"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:35:00.327332 containerd[1618]: time="2025-08-13T00:35:00.327312530Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:35:00.327794 containerd[1618]: time="2025-08-13T00:35:00.327778481Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.2\" with image id \"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\", size \"10251893\" in 2.922195108s" Aug 13 00:35:00.327830 containerd[1618]: time="2025-08-13T00:35:00.327796303Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\" returns image reference \"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\"" Aug 13 00:35:00.338166 containerd[1618]: time="2025-08-13T00:35:00.337686544Z" level=info msg="CreateContainer within sandbox \"6f59a2a3832d96a5d812d685a680f57a27a9b73896d469c899654c1edb2c768d\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Aug 13 00:35:00.348082 containerd[1618]: time="2025-08-13T00:35:00.348051691Z" level=info msg="Container 0ee38ddfa2c525d28c0d35589cb20b5c28ea52153c5c1a93ebe5e22d30e2cb50: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:35:00.354970 containerd[1618]: time="2025-08-13T00:35:00.354940047Z" level=info msg="CreateContainer within sandbox \"6f59a2a3832d96a5d812d685a680f57a27a9b73896d469c899654c1edb2c768d\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id 
\"0ee38ddfa2c525d28c0d35589cb20b5c28ea52153c5c1a93ebe5e22d30e2cb50\""
Aug 13 00:35:00.355454 containerd[1618]: time="2025-08-13T00:35:00.355391126Z" level=info msg="StartContainer for \"0ee38ddfa2c525d28c0d35589cb20b5c28ea52153c5c1a93ebe5e22d30e2cb50\""
Aug 13 00:35:00.356194 containerd[1618]: time="2025-08-13T00:35:00.356177654Z" level=info msg="connecting to shim 0ee38ddfa2c525d28c0d35589cb20b5c28ea52153c5c1a93ebe5e22d30e2cb50" address="unix:///run/containerd/s/8ac56ecd458d47bb372c83830c3100acc12de4433907c7883e960fc30e6e42da" protocol=ttrpc version=3
Aug 13 00:35:00.404659 systemd[1]: Started cri-containerd-0ee38ddfa2c525d28c0d35589cb20b5c28ea52153c5c1a93ebe5e22d30e2cb50.scope - libcontainer container 0ee38ddfa2c525d28c0d35589cb20b5c28ea52153c5c1a93ebe5e22d30e2cb50.
Aug 13 00:35:00.431139 containerd[1618]: time="2025-08-13T00:35:00.431113303Z" level=info msg="StartContainer for \"0ee38ddfa2c525d28c0d35589cb20b5c28ea52153c5c1a93ebe5e22d30e2cb50\" returns successfully"
Aug 13 00:35:00.432573 containerd[1618]: time="2025-08-13T00:35:00.432333662Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\""
Aug 13 00:35:02.644107 containerd[1618]: time="2025-08-13T00:35:02.644081048Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:35:02.644805 containerd[1618]: time="2025-08-13T00:35:02.644487361Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2: active requests=0, bytes read=14703784"
Aug 13 00:35:02.644805 containerd[1618]: time="2025-08-13T00:35:02.644783691Z" level=info msg="ImageCreate event name:\"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:35:02.645872 containerd[1618]: time="2025-08-13T00:35:02.645860323Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:35:02.646238 containerd[1618]: time="2025-08-13T00:35:02.646221374Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" with image id \"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\", size \"16196439\" in 2.213869416s"
Aug 13 00:35:02.646434 containerd[1618]: time="2025-08-13T00:35:02.646238919Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" returns image reference \"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\""
Aug 13 00:35:02.688035 containerd[1618]: time="2025-08-13T00:35:02.688009940Z" level=info msg="CreateContainer within sandbox \"6f59a2a3832d96a5d812d685a680f57a27a9b73896d469c899654c1edb2c768d\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Aug 13 00:35:02.732130 containerd[1618]: time="2025-08-13T00:35:02.732072626Z" level=info msg="Container 62da533db922850e6f8891213aa9cf8a70646747ce7952dd3a5075b68e3771a0: CDI devices from CRI Config.CDIDevices: []"
Aug 13 00:35:02.754339 containerd[1618]: time="2025-08-13T00:35:02.754315072Z" level=info msg="CreateContainer within sandbox \"6f59a2a3832d96a5d812d685a680f57a27a9b73896d469c899654c1edb2c768d\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"62da533db922850e6f8891213aa9cf8a70646747ce7952dd3a5075b68e3771a0\""
Aug 13 00:35:02.798481 containerd[1618]: time="2025-08-13T00:35:02.798454489Z" level=info msg="StartContainer for \"62da533db922850e6f8891213aa9cf8a70646747ce7952dd3a5075b68e3771a0\""
Aug 13 00:35:02.799790 containerd[1618]: time="2025-08-13T00:35:02.799351763Z" level=info msg="connecting to shim 62da533db922850e6f8891213aa9cf8a70646747ce7952dd3a5075b68e3771a0" address="unix:///run/containerd/s/8ac56ecd458d47bb372c83830c3100acc12de4433907c7883e960fc30e6e42da" protocol=ttrpc version=3
Aug 13 00:35:02.822748 systemd[1]: Started cri-containerd-62da533db922850e6f8891213aa9cf8a70646747ce7952dd3a5075b68e3771a0.scope - libcontainer container 62da533db922850e6f8891213aa9cf8a70646747ce7952dd3a5075b68e3771a0.
Aug 13 00:35:02.869721 containerd[1618]: time="2025-08-13T00:35:02.869694771Z" level=info msg="StartContainer for \"62da533db922850e6f8891213aa9cf8a70646747ce7952dd3a5075b68e3771a0\" returns successfully"
Aug 13 00:35:03.996044 kubelet[2927]: I0813 00:35:03.989299 2927 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Aug 13 00:35:03.997713 kubelet[2927]: I0813 00:35:03.997700 2927 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Aug 13 00:35:04.500726 containerd[1618]: time="2025-08-13T00:35:04.500500271Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e1255f5f8dbd399bff1c57cb4cd5754e1e5610efd93218b30c9d657ccc201ea7\" id:\"fe0c56c1dc119c072323d324eec15ee634e8fdd7d069b714dcaac5dc205caf11\" pid:5547 exited_at:{seconds:1755045304 nanos:455710761}"
Aug 13 00:35:11.221419 kubelet[2927]: I0813 00:35:11.221217 2927 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Aug 13 00:35:11.248057 kubelet[2927]: I0813 00:35:11.239668 2927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-24x4b" podStartSLOduration=36.565681619 podStartE2EDuration="48.236343967s" podCreationTimestamp="2025-08-13 00:34:23 +0000 UTC" firstStartedPulling="2025-08-13 00:34:51.014210244 +0000 UTC m=+43.431486273" lastFinishedPulling="2025-08-13 00:35:02.684872592 +0000 UTC m=+55.102148621" observedRunningTime="2025-08-13 00:35:03.764505965 +0000 UTC m=+56.181782013" watchObservedRunningTime="2025-08-13 00:35:11.236343967 +0000 UTC m=+63.653619999"
Aug 13 00:35:12.337604 containerd[1618]: time="2025-08-13T00:35:12.337558287Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3be9515d6f0992f8c89c0aa81eca3b0c2818b85aa4a11fd5a338f83a8f2000dd\" id:\"184e521fd8801bdef4bb24642b0ebb88a480499fe8b1d57c8cb4339eb4018cdb\" pid:5593 exited_at:{seconds:1755045312 nanos:337282844}"
Aug 13 00:35:16.692783 containerd[1618]: time="2025-08-13T00:35:16.692728790Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e1255f5f8dbd399bff1c57cb4cd5754e1e5610efd93218b30c9d657ccc201ea7\" id:\"6a02ca3402b09769f6f6e31fb6df2d5f84edce4bd08941fb03c5ab8f36a6c9aa\" pid:5629 exited_at:{seconds:1755045316 nanos:692408921}"
Aug 13 00:35:19.876511 containerd[1618]: time="2025-08-13T00:35:19.876402537Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7b45ac71c541ceb9d876d5c6634c191af64f19092b65f586e899add26c7b1593\" id:\"7eed89080ae52358eabfc2f501d7e658ede74baea17bc2fe675b6d7530f296e2\" pid:5656 exited_at:{seconds:1755045319 nanos:876123586}"
Aug 13 00:35:23.307923 containerd[1618]: time="2025-08-13T00:35:23.307890057Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7b45ac71c541ceb9d876d5c6634c191af64f19092b65f586e899add26c7b1593\" id:\"3428cc0d3182abb9df4fc88ea7aa5d45a78beccb821892e03c9f764b9059ee38\" pid:5678 exited_at:{seconds:1755045323 nanos:307623550}"
Aug 13 00:35:30.762121 systemd[1]: Started sshd@7-139.178.70.101:22-147.75.109.163:41700.service - OpenSSH per-connection server daemon (147.75.109.163:41700).
Aug 13 00:35:30.908464 sshd[5689]: Accepted publickey for core from 147.75.109.163 port 41700 ssh2: RSA SHA256:jzGgzAS12JlpfyHxhL4P6PxpFK6cLXmqYnYR6euCwx0
Aug 13 00:35:30.913685 sshd-session[5689]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:35:30.919505 systemd-logind[1593]: New session 10 of user core.
Aug 13 00:35:30.923667 systemd[1]: Started session-10.scope - Session 10 of User core.
Aug 13 00:35:31.452943 sshd[5693]: Connection closed by 147.75.109.163 port 41700
Aug 13 00:35:31.453480 sshd-session[5689]: pam_unix(sshd:session): session closed for user core
Aug 13 00:35:31.459752 systemd[1]: sshd@7-139.178.70.101:22-147.75.109.163:41700.service: Deactivated successfully.
Aug 13 00:35:31.462323 systemd[1]: session-10.scope: Deactivated successfully.
Aug 13 00:35:31.464337 systemd-logind[1593]: Session 10 logged out. Waiting for processes to exit.
Aug 13 00:35:31.465364 systemd-logind[1593]: Removed session 10.
Aug 13 00:35:36.468626 systemd[1]: Started sshd@8-139.178.70.101:22-147.75.109.163:41706.service - OpenSSH per-connection server daemon (147.75.109.163:41706).
Aug 13 00:35:36.928554 sshd[5714]: Accepted publickey for core from 147.75.109.163 port 41706 ssh2: RSA SHA256:jzGgzAS12JlpfyHxhL4P6PxpFK6cLXmqYnYR6euCwx0
Aug 13 00:35:36.928623 sshd-session[5714]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:35:36.934125 systemd-logind[1593]: New session 11 of user core.
Aug 13 00:35:36.941764 systemd[1]: Started session-11.scope - Session 11 of User core.
Aug 13 00:35:37.190148 sshd[5716]: Connection closed by 147.75.109.163 port 41706
Aug 13 00:35:37.190691 sshd-session[5714]: pam_unix(sshd:session): session closed for user core
Aug 13 00:35:37.194241 systemd[1]: sshd@8-139.178.70.101:22-147.75.109.163:41706.service: Deactivated successfully.
Aug 13 00:35:37.195750 systemd[1]: session-11.scope: Deactivated successfully.
Aug 13 00:35:37.196451 systemd-logind[1593]: Session 11 logged out. Waiting for processes to exit.
Aug 13 00:35:37.197405 systemd-logind[1593]: Removed session 11.
Aug 13 00:35:42.219421 systemd[1]: Started sshd@9-139.178.70.101:22-147.75.109.163:49152.service - OpenSSH per-connection server daemon (147.75.109.163:49152).
Aug 13 00:35:42.676330 sshd[5739]: Accepted publickey for core from 147.75.109.163 port 49152 ssh2: RSA SHA256:jzGgzAS12JlpfyHxhL4P6PxpFK6cLXmqYnYR6euCwx0
Aug 13 00:35:42.683628 sshd-session[5739]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:35:42.692565 systemd-logind[1593]: New session 12 of user core.
Aug 13 00:35:42.696658 systemd[1]: Started session-12.scope - Session 12 of User core.
Aug 13 00:35:43.411523 sshd[5761]: Connection closed by 147.75.109.163 port 49152
Aug 13 00:35:43.412113 sshd-session[5739]: pam_unix(sshd:session): session closed for user core
Aug 13 00:35:43.420846 systemd[1]: sshd@9-139.178.70.101:22-147.75.109.163:49152.service: Deactivated successfully.
Aug 13 00:35:43.422218 systemd[1]: session-12.scope: Deactivated successfully.
Aug 13 00:35:43.424370 systemd-logind[1593]: Session 12 logged out. Waiting for processes to exit.
Aug 13 00:35:43.426482 systemd[1]: Started sshd@10-139.178.70.101:22-147.75.109.163:49154.service - OpenSSH per-connection server daemon (147.75.109.163:49154).
Aug 13 00:35:43.429130 systemd-logind[1593]: Removed session 12.
Aug 13 00:35:43.473170 sshd[5779]: Accepted publickey for core from 147.75.109.163 port 49154 ssh2: RSA SHA256:jzGgzAS12JlpfyHxhL4P6PxpFK6cLXmqYnYR6euCwx0
Aug 13 00:35:43.474021 sshd-session[5779]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:35:43.476989 systemd-logind[1593]: New session 13 of user core.
Aug 13 00:35:43.484704 systemd[1]: Started session-13.scope - Session 13 of User core.
Aug 13 00:35:43.739180 sshd[5781]: Connection closed by 147.75.109.163 port 49154
Aug 13 00:35:43.741618 sshd-session[5779]: pam_unix(sshd:session): session closed for user core
Aug 13 00:35:43.747404 systemd[1]: sshd@10-139.178.70.101:22-147.75.109.163:49154.service: Deactivated successfully.
Aug 13 00:35:43.749397 systemd[1]: session-13.scope: Deactivated successfully.
Aug 13 00:35:43.751805 systemd-logind[1593]: Session 13 logged out. Waiting for processes to exit.
Aug 13 00:35:43.754335 systemd[1]: Started sshd@11-139.178.70.101:22-147.75.109.163:49162.service - OpenSSH per-connection server daemon (147.75.109.163:49162).
Aug 13 00:35:43.758341 systemd-logind[1593]: Removed session 13.
Aug 13 00:35:43.806674 sshd[5793]: Accepted publickey for core from 147.75.109.163 port 49162 ssh2: RSA SHA256:jzGgzAS12JlpfyHxhL4P6PxpFK6cLXmqYnYR6euCwx0
Aug 13 00:35:43.807801 sshd-session[5793]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:35:43.811579 systemd-logind[1593]: New session 14 of user core.
Aug 13 00:35:43.816673 systemd[1]: Started session-14.scope - Session 14 of User core.
Aug 13 00:35:43.964791 sshd[5795]: Connection closed by 147.75.109.163 port 49162
Aug 13 00:35:43.966479 sshd-session[5793]: pam_unix(sshd:session): session closed for user core
Aug 13 00:35:43.968824 systemd-logind[1593]: Session 14 logged out. Waiting for processes to exit.
Aug 13 00:35:43.969562 systemd[1]: sshd@11-139.178.70.101:22-147.75.109.163:49162.service: Deactivated successfully.
Aug 13 00:35:43.971886 systemd[1]: session-14.scope: Deactivated successfully.
Aug 13 00:35:43.976580 systemd-logind[1593]: Removed session 14.
Aug 13 00:35:44.089283 containerd[1618]: time="2025-08-13T00:35:44.089201065Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3be9515d6f0992f8c89c0aa81eca3b0c2818b85aa4a11fd5a338f83a8f2000dd\" id:\"eec72431f812466137ef6bcfc86d95e608c9d40a0813742eb1a29dd99b16f089\" pid:5754 exited_at:{seconds:1755045343 nanos:979811955}"
Aug 13 00:35:48.719129 containerd[1618]: time="2025-08-13T00:35:48.705622267Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e1255f5f8dbd399bff1c57cb4cd5754e1e5610efd93218b30c9d657ccc201ea7\" id:\"75a6080992d7922e2a7b72f23bdc98334040ac83285ff0e6ab114315bff0c024\" pid:5818 exited_at:{seconds:1755045348 nanos:705202477}"
Aug 13 00:35:48.981898 systemd[1]: Started sshd@12-139.178.70.101:22-147.75.109.163:56012.service - OpenSSH per-connection server daemon (147.75.109.163:56012).
Aug 13 00:35:49.111463 sshd[5834]: Accepted publickey for core from 147.75.109.163 port 56012 ssh2: RSA SHA256:jzGgzAS12JlpfyHxhL4P6PxpFK6cLXmqYnYR6euCwx0
Aug 13 00:35:49.126653 sshd-session[5834]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:35:49.138654 systemd-logind[1593]: New session 15 of user core.
Aug 13 00:35:49.143191 systemd[1]: Started session-15.scope - Session 15 of User core.
Aug 13 00:35:50.042021 sshd[5836]: Connection closed by 147.75.109.163 port 56012
Aug 13 00:35:50.052846 sshd-session[5834]: pam_unix(sshd:session): session closed for user core
Aug 13 00:35:50.072598 systemd[1]: sshd@12-139.178.70.101:22-147.75.109.163:56012.service: Deactivated successfully.
Aug 13 00:35:50.073702 systemd-logind[1593]: Session 15 logged out. Waiting for processes to exit.
Aug 13 00:35:50.074076 systemd[1]: session-15.scope: Deactivated successfully.
Aug 13 00:35:50.076400 systemd-logind[1593]: Removed session 15.
Aug 13 00:35:53.664985 containerd[1618]: time="2025-08-13T00:35:53.664934796Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7b45ac71c541ceb9d876d5c6634c191af64f19092b65f586e899add26c7b1593\" id:\"88ea6c91023ce6b0351f29fd2023a6572f1f5022ba5cbb94db5b73b784009195\" pid:5881 exited_at:{seconds:1755045353 nanos:664744215}"
Aug 13 00:35:55.127576 systemd[1]: Started sshd@13-139.178.70.101:22-147.75.109.163:56020.service - OpenSSH per-connection server daemon (147.75.109.163:56020).
Aug 13 00:35:55.438755 sshd[5899]: Accepted publickey for core from 147.75.109.163 port 56020 ssh2: RSA SHA256:jzGgzAS12JlpfyHxhL4P6PxpFK6cLXmqYnYR6euCwx0
Aug 13 00:35:55.450814 sshd-session[5899]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:35:55.461343 systemd-logind[1593]: New session 16 of user core.
Aug 13 00:35:55.465664 systemd[1]: Started session-16.scope - Session 16 of User core.
Aug 13 00:35:57.255410 sshd[5901]: Connection closed by 147.75.109.163 port 56020
Aug 13 00:35:57.256796 sshd-session[5899]: pam_unix(sshd:session): session closed for user core
Aug 13 00:35:57.269748 systemd[1]: sshd@13-139.178.70.101:22-147.75.109.163:56020.service: Deactivated successfully.
Aug 13 00:35:57.272824 systemd[1]: session-16.scope: Deactivated successfully.
Aug 13 00:35:57.274327 systemd-logind[1593]: Session 16 logged out. Waiting for processes to exit.
Aug 13 00:35:57.277096 systemd-logind[1593]: Removed session 16.
Aug 13 00:36:02.307948 systemd[1]: Started sshd@14-139.178.70.101:22-147.75.109.163:57048.service - OpenSSH per-connection server daemon (147.75.109.163:57048).
Aug 13 00:36:02.503928 sshd[5914]: Accepted publickey for core from 147.75.109.163 port 57048 ssh2: RSA SHA256:jzGgzAS12JlpfyHxhL4P6PxpFK6cLXmqYnYR6euCwx0
Aug 13 00:36:02.512008 sshd-session[5914]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:36:02.519524 systemd-logind[1593]: New session 17 of user core.
Aug 13 00:36:02.527721 systemd[1]: Started session-17.scope - Session 17 of User core.
Aug 13 00:36:03.635281 sshd[5916]: Connection closed by 147.75.109.163 port 57048
Aug 13 00:36:03.635716 sshd-session[5914]: pam_unix(sshd:session): session closed for user core
Aug 13 00:36:03.643231 systemd[1]: sshd@14-139.178.70.101:22-147.75.109.163:57048.service: Deactivated successfully.
Aug 13 00:36:03.645029 systemd[1]: session-17.scope: Deactivated successfully.
Aug 13 00:36:03.645844 systemd-logind[1593]: Session 17 logged out. Waiting for processes to exit.
Aug 13 00:36:03.647768 systemd-logind[1593]: Removed session 17.
Aug 13 00:36:03.649377 systemd[1]: Started sshd@15-139.178.70.101:22-147.75.109.163:57050.service - OpenSSH per-connection server daemon (147.75.109.163:57050).
Aug 13 00:36:03.783487 sshd[5928]: Accepted publickey for core from 147.75.109.163 port 57050 ssh2: RSA SHA256:jzGgzAS12JlpfyHxhL4P6PxpFK6cLXmqYnYR6euCwx0
Aug 13 00:36:03.785178 sshd-session[5928]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:36:03.793976 systemd-logind[1593]: New session 18 of user core.
Aug 13 00:36:03.799704 systemd[1]: Started session-18.scope - Session 18 of User core.
Aug 13 00:36:05.172249 sshd[5930]: Connection closed by 147.75.109.163 port 57050
Aug 13 00:36:05.179744 sshd-session[5928]: pam_unix(sshd:session): session closed for user core
Aug 13 00:36:05.187383 systemd[1]: Started sshd@16-139.178.70.101:22-147.75.109.163:57052.service - OpenSSH per-connection server daemon (147.75.109.163:57052).
Aug 13 00:36:05.192612 systemd[1]: sshd@15-139.178.70.101:22-147.75.109.163:57050.service: Deactivated successfully.
Aug 13 00:36:05.194181 systemd[1]: session-18.scope: Deactivated successfully.
Aug 13 00:36:05.195513 systemd-logind[1593]: Session 18 logged out. Waiting for processes to exit.
Aug 13 00:36:05.196808 systemd-logind[1593]: Removed session 18.
Aug 13 00:36:05.327808 sshd[5961]: Accepted publickey for core from 147.75.109.163 port 57052 ssh2: RSA SHA256:jzGgzAS12JlpfyHxhL4P6PxpFK6cLXmqYnYR6euCwx0
Aug 13 00:36:05.328436 sshd-session[5961]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:36:05.350594 systemd-logind[1593]: New session 19 of user core.
Aug 13 00:36:05.356574 systemd[1]: Started session-19.scope - Session 19 of User core.
Aug 13 00:36:05.419940 containerd[1618]: time="2025-08-13T00:36:05.419886246Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e1255f5f8dbd399bff1c57cb4cd5754e1e5610efd93218b30c9d657ccc201ea7\" id:\"136d9757cb4bb8fab755c2b0dcc9160abe68bc23c7277ac897cb6d2335de8b7a\" pid:5948 exited_at:{seconds:1755045365 nanos:280826161}"
Aug 13 00:36:09.147787 sshd[5966]: Connection closed by 147.75.109.163 port 57052
Aug 13 00:36:09.145597 sshd-session[5961]: pam_unix(sshd:session): session closed for user core
Aug 13 00:36:09.200124 kubelet[2927]: E0813 00:36:09.163810 2927 kubelet.go:2512] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.12s"
Aug 13 00:36:09.198055 systemd[1]: sshd@16-139.178.70.101:22-147.75.109.163:57052.service: Deactivated successfully.
Aug 13 00:36:09.201016 systemd[1]: session-19.scope: Deactivated successfully.
Aug 13 00:36:09.202075 systemd[1]: session-19.scope: Consumed 410ms CPU time, 78M memory peak.
Aug 13 00:36:09.202716 systemd-logind[1593]: Session 19 logged out. Waiting for processes to exit.
Aug 13 00:36:09.219320 systemd[1]: Started sshd@17-139.178.70.101:22-147.75.109.163:48024.service - OpenSSH per-connection server daemon (147.75.109.163:48024).
Aug 13 00:36:09.221689 systemd-logind[1593]: Removed session 19.
Aug 13 00:36:09.399214 sshd[5983]: Accepted publickey for core from 147.75.109.163 port 48024 ssh2: RSA SHA256:jzGgzAS12JlpfyHxhL4P6PxpFK6cLXmqYnYR6euCwx0
Aug 13 00:36:09.404587 sshd-session[5983]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:36:09.411143 systemd-logind[1593]: New session 20 of user core.
Aug 13 00:36:09.415783 systemd[1]: Started session-20.scope - Session 20 of User core.
Aug 13 00:36:11.462327 sshd[5986]: Connection closed by 147.75.109.163 port 48024
Aug 13 00:36:11.485398 sshd-session[5983]: pam_unix(sshd:session): session closed for user core
Aug 13 00:36:11.533787 systemd[1]: sshd@17-139.178.70.101:22-147.75.109.163:48024.service: Deactivated successfully.
Aug 13 00:36:11.537287 systemd[1]: session-20.scope: Deactivated successfully.
Aug 13 00:36:11.538617 systemd[1]: session-20.scope: Consumed 534ms CPU time, 65.9M memory peak.
Aug 13 00:36:11.540402 systemd-logind[1593]: Session 20 logged out. Waiting for processes to exit.
Aug 13 00:36:11.547027 systemd[1]: Started sshd@18-139.178.70.101:22-147.75.109.163:48040.service - OpenSSH per-connection server daemon (147.75.109.163:48040).
Aug 13 00:36:11.552567 systemd-logind[1593]: Removed session 20.
Aug 13 00:36:11.672137 sshd[6010]: Accepted publickey for core from 147.75.109.163 port 48040 ssh2: RSA SHA256:jzGgzAS12JlpfyHxhL4P6PxpFK6cLXmqYnYR6euCwx0
Aug 13 00:36:11.674820 sshd-session[6010]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:36:11.683532 systemd-logind[1593]: New session 21 of user core.
Aug 13 00:36:11.688356 systemd[1]: Started session-21.scope - Session 21 of User core.
Aug 13 00:36:12.923865 sshd[6012]: Connection closed by 147.75.109.163 port 48040
Aug 13 00:36:12.926705 systemd-logind[1593]: Session 21 logged out. Waiting for processes to exit.
Aug 13 00:36:12.924054 sshd-session[6010]: pam_unix(sshd:session): session closed for user core
Aug 13 00:36:12.927075 systemd[1]: sshd@18-139.178.70.101:22-147.75.109.163:48040.service: Deactivated successfully.
Aug 13 00:36:12.928156 systemd[1]: session-21.scope: Deactivated successfully.
Aug 13 00:36:12.928931 systemd-logind[1593]: Removed session 21.
Aug 13 00:36:14.141407 containerd[1618]: time="2025-08-13T00:36:14.121648702Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3be9515d6f0992f8c89c0aa81eca3b0c2818b85aa4a11fd5a338f83a8f2000dd\" id:\"ca66ae4cba4ea61aeddb80b1c1b60e5c5326f2dad9eacbb4f0176cdd97013d07\" pid:6037 exited_at:{seconds:1755045373 nanos:993241653}"
Aug 13 00:36:17.086972 containerd[1618]: time="2025-08-13T00:36:17.086934932Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e1255f5f8dbd399bff1c57cb4cd5754e1e5610efd93218b30c9d657ccc201ea7\" id:\"c2ba030a35cad8487b6f6fbb022dd7bad8a9745a6a83e14a296975f0781425fa\" pid:6075 exited_at:{seconds:1755045377 nanos:86461502}"
Aug 13 00:36:18.021221 systemd[1]: Started sshd@19-139.178.70.101:22-147.75.109.163:48046.service - OpenSSH per-connection server daemon (147.75.109.163:48046).
Aug 13 00:36:18.144483 sshd[6086]: Accepted publickey for core from 147.75.109.163 port 48046 ssh2: RSA SHA256:jzGgzAS12JlpfyHxhL4P6PxpFK6cLXmqYnYR6euCwx0
Aug 13 00:36:18.146449 sshd-session[6086]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:36:18.152437 systemd-logind[1593]: New session 22 of user core.
Aug 13 00:36:18.157667 systemd[1]: Started session-22.scope - Session 22 of User core.
Aug 13 00:36:19.039283 sshd[6088]: Connection closed by 147.75.109.163 port 48046
Aug 13 00:36:19.039592 sshd-session[6086]: pam_unix(sshd:session): session closed for user core
Aug 13 00:36:19.044117 systemd-logind[1593]: Session 22 logged out. Waiting for processes to exit.
Aug 13 00:36:19.044869 systemd[1]: sshd@19-139.178.70.101:22-147.75.109.163:48046.service: Deactivated successfully.
Aug 13 00:36:19.047432 systemd[1]: session-22.scope: Deactivated successfully.
Aug 13 00:36:19.049176 systemd-logind[1593]: Removed session 22.
Aug 13 00:36:19.894744 containerd[1618]: time="2025-08-13T00:36:19.894587757Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7b45ac71c541ceb9d876d5c6634c191af64f19092b65f586e899add26c7b1593\" id:\"6d2a906479906dea9e614775d910af4981834a110fb9b91afd6bb693ab97324f\" pid:6111 exited_at:{seconds:1755045379 nanos:884763058}"
Aug 13 00:36:23.390317 containerd[1618]: time="2025-08-13T00:36:23.390162753Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7b45ac71c541ceb9d876d5c6634c191af64f19092b65f586e899add26c7b1593\" id:\"00baa09c53f51cfc182cf44378384acc5f5969f2734796dc1de6254c688d25b4\" pid:6133 exited_at:{seconds:1755045383 nanos:389790164}"
Aug 13 00:36:24.059475 systemd[1]: Started sshd@20-139.178.70.101:22-147.75.109.163:55160.service - OpenSSH per-connection server daemon (147.75.109.163:55160).
Aug 13 00:36:24.656598 sshd[6145]: Accepted publickey for core from 147.75.109.163 port 55160 ssh2: RSA SHA256:jzGgzAS12JlpfyHxhL4P6PxpFK6cLXmqYnYR6euCwx0
Aug 13 00:36:24.660650 sshd-session[6145]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:36:24.673487 systemd-logind[1593]: New session 23 of user core.
Aug 13 00:36:24.683183 systemd[1]: Started session-23.scope - Session 23 of User core.
Aug 13 00:36:25.691725 sshd[6147]: Connection closed by 147.75.109.163 port 55160
Aug 13 00:36:25.692086 sshd-session[6145]: pam_unix(sshd:session): session closed for user core
Aug 13 00:36:25.694700 systemd[1]: sshd@20-139.178.70.101:22-147.75.109.163:55160.service: Deactivated successfully.
Aug 13 00:36:25.696207 systemd[1]: session-23.scope: Deactivated successfully.
Aug 13 00:36:25.701473 systemd-logind[1593]: Session 23 logged out. Waiting for processes to exit.
Aug 13 00:36:25.702679 systemd-logind[1593]: Removed session 23.